Mar 19 11:51:54.193527 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 19 11:51:54.973373 master-0 kubenswrapper[4036]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:51:54.973373 master-0 kubenswrapper[4036]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 19 11:51:54.973373 master-0 kubenswrapper[4036]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:51:54.973373 master-0 kubenswrapper[4036]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:51:54.973373 master-0 kubenswrapper[4036]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 19 11:51:54.973373 master-0 kubenswrapper[4036]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:51:54.978668 master-0 kubenswrapper[4036]: I0319 11:51:54.978428 4036 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995221 4036 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995247 4036 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995252 4036 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995257 4036 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995262 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995267 4036 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995273 4036 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995277 4036 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995282 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995286 4036 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995290 4036 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995294 4036 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995298 4036 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995302 4036 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995305 4036 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995309 4036 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995314 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995318 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995327 4036 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:51:54.995330 master-0 kubenswrapper[4036]: W0319 11:51:54.995330 4036 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995334 4036 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995338 4036 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995342 4036 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995345 4036 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995349 4036 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995353 4036 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995357 4036 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995362 4036 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995366 4036 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995370 4036 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995376 4036 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995381 4036 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995386 4036 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995391 4036 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995407 4036 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995412 4036 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995417 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995421 4036 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995426 4036 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:51:54.996385 master-0 kubenswrapper[4036]: W0319 11:51:54.995430 4036 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995434 4036 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995439 4036 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995444 4036 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995449 4036 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995454 4036 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995459 4036 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995464 4036 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995469 4036 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995474 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995479 4036 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995486 4036 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995494 4036 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995499 4036 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995506 4036 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995510 4036 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995514 4036 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995518 4036 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995522 4036 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995526 4036 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:51:54.997119 master-0 kubenswrapper[4036]: W0319 11:51:54.995530 4036 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995534 4036 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995538 4036 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995542 4036 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995546 4036 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995550 4036 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995554 4036 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995557 4036 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995561 4036 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995573 4036 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995578 4036 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995583 4036 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: W0319 11:51:54.995587 4036 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: I0319 11:51:54.996800 4036 flags.go:64] FLAG: --address="0.0.0.0" Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: I0319 11:51:54.996817 4036 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: I0319 11:51:54.996825 4036 flags.go:64] FLAG: --anonymous-auth="true" Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: I0319 11:51:54.996831 4036 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: I0319 11:51:54.996838 4036 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: I0319 11:51:54.996842 4036 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: I0319 11:51:54.996850 4036 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 19 11:51:54.997884 master-0 kubenswrapper[4036]: I0319 11:51:54.996856 4036 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996862 4036 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996867 4036 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996872 4036 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996878 4036 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 19 11:51:54.998496 master-0 
kubenswrapper[4036]: I0319 11:51:54.996882 4036 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996890 4036 flags.go:64] FLAG: --cgroup-root="" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996926 4036 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996932 4036 flags.go:64] FLAG: --client-ca-file="" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996937 4036 flags.go:64] FLAG: --cloud-config="" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996941 4036 flags.go:64] FLAG: --cloud-provider="" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996946 4036 flags.go:64] FLAG: --cluster-dns="[]" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996953 4036 flags.go:64] FLAG: --cluster-domain="" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996957 4036 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996961 4036 flags.go:64] FLAG: --config-dir="" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996967 4036 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996973 4036 flags.go:64] FLAG: --container-log-max-files="5" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996985 4036 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996990 4036 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996995 4036 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.996999 4036 flags.go:64] FLAG: 
--containerd-namespace="k8s.io" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.997004 4036 flags.go:64] FLAG: --contention-profiling="false" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.997009 4036 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.997014 4036 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.997018 4036 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 19 11:51:54.998496 master-0 kubenswrapper[4036]: I0319 11:51:54.997023 4036 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997029 4036 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997034 4036 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997038 4036 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997042 4036 flags.go:64] FLAG: --enable-load-reader="false" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997047 4036 flags.go:64] FLAG: --enable-server="true" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997051 4036 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997057 4036 flags.go:64] FLAG: --event-burst="100" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997062 4036 flags.go:64] FLAG: --event-qps="50" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997066 4036 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997072 4036 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 19 
11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997077 4036 flags.go:64] FLAG: --eviction-hard="" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997083 4036 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997088 4036 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997093 4036 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997098 4036 flags.go:64] FLAG: --eviction-soft="" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997102 4036 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997107 4036 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997112 4036 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997116 4036 flags.go:64] FLAG: --experimental-mounter-path="" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997121 4036 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997126 4036 flags.go:64] FLAG: --fail-swap-on="true" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997131 4036 flags.go:64] FLAG: --feature-gates="" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997138 4036 flags.go:64] FLAG: --file-check-frequency="20s" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997143 4036 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 19 11:51:54.999277 master-0 kubenswrapper[4036]: I0319 11:51:54.997148 4036 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: 
I0319 11:51:54.997153 4036 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997159 4036 flags.go:64] FLAG: --healthz-port="10248" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997164 4036 flags.go:64] FLAG: --help="false" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997169 4036 flags.go:64] FLAG: --hostname-override="" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997174 4036 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997178 4036 flags.go:64] FLAG: --http-check-frequency="20s" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997183 4036 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997187 4036 flags.go:64] FLAG: --image-credential-provider-config="" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997191 4036 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997196 4036 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997200 4036 flags.go:64] FLAG: --image-service-endpoint="" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997205 4036 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997209 4036 flags.go:64] FLAG: --kube-api-burst="100" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997213 4036 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997218 4036 flags.go:64] FLAG: --kube-api-qps="50" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997223 4036 flags.go:64] FLAG: 
--kube-reserved="" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997227 4036 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997232 4036 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997236 4036 flags.go:64] FLAG: --kubelet-cgroups="" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997241 4036 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997247 4036 flags.go:64] FLAG: --lock-file="" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997251 4036 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997256 4036 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997260 4036 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997267 4036 flags.go:64] FLAG: --log-json-split-stream="false" Mar 19 11:51:55.000178 master-0 kubenswrapper[4036]: I0319 11:51:54.997272 4036 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997276 4036 flags.go:64] FLAG: --log-text-split-stream="false" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997281 4036 flags.go:64] FLAG: --logging-format="text" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997285 4036 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997290 4036 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997295 4036 flags.go:64] FLAG: --manifest-url="" Mar 19 11:51:55.001124 
master-0 kubenswrapper[4036]: I0319 11:51:54.997299 4036 flags.go:64] FLAG: --manifest-url-header="" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997305 4036 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997310 4036 flags.go:64] FLAG: --max-open-files="1000000" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997315 4036 flags.go:64] FLAG: --max-pods="110" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997320 4036 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997325 4036 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997329 4036 flags.go:64] FLAG: --memory-manager-policy="None" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997334 4036 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997338 4036 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997343 4036 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997347 4036 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997359 4036 flags.go:64] FLAG: --node-status-max-images="50" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997363 4036 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997368 4036 flags.go:64] FLAG: --oom-score-adj="-999" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997372 4036 flags.go:64] FLAG: --pod-cidr="" Mar 
19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997376 4036 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997385 4036 flags.go:64] FLAG: --pod-manifest-path="" Mar 19 11:51:55.001124 master-0 kubenswrapper[4036]: I0319 11:51:54.997389 4036 flags.go:64] FLAG: --pod-max-pids="-1" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997393 4036 flags.go:64] FLAG: --pods-per-core="0" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997398 4036 flags.go:64] FLAG: --port="10250" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997403 4036 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997408 4036 flags.go:64] FLAG: --provider-id="" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997413 4036 flags.go:64] FLAG: --qos-reserved="" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997417 4036 flags.go:64] FLAG: --read-only-port="10255" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997421 4036 flags.go:64] FLAG: --register-node="true" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997429 4036 flags.go:64] FLAG: --register-schedulable="true" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997434 4036 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997442 4036 flags.go:64] FLAG: --registry-burst="10" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997446 4036 flags.go:64] FLAG: --registry-qps="5" Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997451 4036 flags.go:64] FLAG: --reserved-cpus="" Mar 19 
11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997455 4036 flags.go:64] FLAG: --reserved-memory=""
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997461 4036 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997466 4036 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997471 4036 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997475 4036 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997480 4036 flags.go:64] FLAG: --runonce="false"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997484 4036 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997489 4036 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997494 4036 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997498 4036 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997503 4036 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997507 4036 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997511 4036 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 11:51:55.001965 master-0 kubenswrapper[4036]: I0319 11:51:54.997516 4036 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997521 4036 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997527 4036 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997531 4036 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997536 4036 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997541 4036 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997546 4036 flags.go:64] FLAG: --system-cgroups=""
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997550 4036 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997558 4036 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997562 4036 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997567 4036 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997574 4036 flags.go:64] FLAG: --tls-min-version=""
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997578 4036 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997583 4036 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997587 4036 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997592 4036 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997597 4036 flags.go:64] FLAG: --v="2"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997604 4036 flags.go:64] FLAG: --version="false"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997610 4036 flags.go:64] FLAG: --vmodule=""
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997616 4036 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: I0319 11:51:54.997622 4036 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: W0319 11:51:54.997753 4036 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: W0319 11:51:54.997760 4036 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: W0319 11:51:54.997764 4036 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:51:55.002788 master-0 kubenswrapper[4036]: W0319 11:51:54.997768 4036 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997772 4036 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997778 4036 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
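The `flags.go:64] FLAG:` lines above are the kubelet echoing every command-line flag and its effective value at startup. A minimal sketch for pulling that flag dump out of a saved journal capture; `kubelet.log` is a hypothetical filename (for example the output of `journalctl -u kubelet > kubelet.log`), not a path from this cluster:

```shell
# Extract the unique FLAG name/value pairs from a saved kubelet journal.
# kubelet.log is an assumed capture file; substitute your own, e.g.:
#   journalctl -u kubelet > kubelet.log
grep -o 'FLAG: --[a-z-]*="[^"]*"' kubelet.log | sort -u
```

This works because each flag is logged on the fixed pattern `FLAG: --name="value"`, so a single regular expression recovers the whole set regardless of the surrounding journal prefixes.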
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997783 4036 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997787 4036 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997792 4036 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997795 4036 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997799 4036 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997804 4036 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997809 4036 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997814 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997818 4036 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997823 4036 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997827 4036 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997832 4036 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997836 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997841 4036 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997845 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 11:51:55.003722 master-0 kubenswrapper[4036]: W0319 11:51:54.997849 4036 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997853 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997857 4036 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997861 4036 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997865 4036 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997870 4036 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997874 4036 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997878 4036 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997882 4036 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997887 4036 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997893 4036 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997897 4036 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997904 4036 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997909 4036 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997914 4036 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997919 4036 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997925 4036 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997929 4036 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997940 4036 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997943 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:51:55.004343 master-0 kubenswrapper[4036]: W0319 11:51:54.997947 4036 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997951 4036 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997957 4036 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997961 4036 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997964 4036 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997968 4036 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997972 4036 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997976 4036 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997979 4036 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997983 4036 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997987 4036 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997990 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.997995 4036 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.998000 4036 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.998004 4036 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.998007 4036 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.998011 4036 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.998015 4036 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.998018 4036 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.998022 4036 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:51:55.005140 master-0 kubenswrapper[4036]: W0319 11:51:54.998026 4036 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: W0319 11:51:54.998029 4036 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: W0319 11:51:54.998033 4036 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: W0319 11:51:54.998037 4036 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: W0319 11:51:54.998041 4036 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: W0319 11:51:54.998045 4036 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: W0319 11:51:54.998048 4036 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: W0319 11:51:54.998052 4036 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: W0319 11:51:54.998058 4036 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: W0319 11:51:54.998061 4036 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: W0319 11:51:54.998068 4036 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:51:55.005797 master-0 kubenswrapper[4036]: I0319 11:51:54.998082 4036 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 11:51:55.006620 master-0 kubenswrapper[4036]: I0319 11:51:55.006557 4036 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 19 11:51:55.006620 master-0 kubenswrapper[4036]: I0319 11:51:55.006610 4036 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 11:51:55.006755 master-0 kubenswrapper[4036]: W0319 11:51:55.006724 4036 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 11:51:55.006755 master-0 kubenswrapper[4036]: W0319 11:51:55.006739 4036 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 11:51:55.006755 master-0 kubenswrapper[4036]: W0319 11:51:55.006744 4036 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 11:51:55.006755 master-0 kubenswrapper[4036]: W0319 11:51:55.006750 4036 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:51:55.006755 master-0 kubenswrapper[4036]: W0319 11:51:55.006755 4036 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:51:55.006755 master-0 kubenswrapper[4036]: W0319 11:51:55.006759 4036 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006766 4036 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006773 4036 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006778 4036 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006783 4036 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006787 4036 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006792 4036 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006796 4036 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006800 4036 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006805 4036 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006811 4036 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006817 4036 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006823 4036 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006828 4036 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006834 4036 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006838 4036 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006843 4036 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006847 4036 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006852 4036 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006857 4036 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:51:55.006961 master-0 kubenswrapper[4036]: W0319 11:51:55.006863 4036 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006868 4036 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006872 4036 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006877 4036 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006882 4036 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006887 4036 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006892 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006897 4036 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006902 4036 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006909 4036 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006914 4036 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006919 4036 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006924 4036 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006930 4036 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006937 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006943 4036 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006948 4036 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006954 4036 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006959 4036 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006965 4036 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:51:55.007557 master-0 kubenswrapper[4036]: W0319 11:51:55.006971 4036 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.006978 4036 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.006984 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.006989 4036 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.006995 4036 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007000 4036 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007005 4036 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007010 4036 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007015 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007020 4036 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007025 4036 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007030 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007035 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007040 4036 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007047 4036 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007057 4036 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007064 4036 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007069 4036 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007074 4036 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 11:51:55.008288 master-0 kubenswrapper[4036]: W0319 11:51:55.007080 4036 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007085 4036 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007090 4036 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007095 4036 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007100 4036 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007105 4036 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007112 4036 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007118 4036 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: I0319 11:51:55.007126 4036 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007285 4036 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007297 4036 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007303 4036 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007308 4036 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007313 4036 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007320 4036 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:51:55.008962 master-0 kubenswrapper[4036]: W0319 11:51:55.007325 4036 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007330 4036 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007335 4036 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007339 4036 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007344 4036 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007349 4036 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007354 4036 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007359 4036 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007364 4036 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007368 4036 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007373 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007379 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007384 4036 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007388 4036 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007393 4036 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007399 4036 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007405 4036 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007412 4036 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007417 4036 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 11:51:55.009468 master-0 kubenswrapper[4036]: W0319 11:51:55.007421 4036 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007426 4036 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007430 4036 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007434 4036 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007439 4036 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007443 4036 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007449 4036 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007454 4036 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007460 4036 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007465 4036 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007470 4036 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007474 4036 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007481 4036 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007486 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007491 4036 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007496 4036 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007501 4036 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007506 4036 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007511 4036 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007515 4036 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:51:55.010355 master-0 kubenswrapper[4036]: W0319 11:51:55.007522 4036 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007527 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007532 4036 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007537 4036 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007542 4036 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007547 4036 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007551 4036 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007556 4036 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007561 4036 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007566 4036 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007573 4036 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007579 4036 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007584 4036 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007589 4036 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007594 4036 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007601 4036 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007607 4036 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007613 4036 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007618 4036 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 11:51:55.011105 master-0 kubenswrapper[4036]: W0319 11:51:55.007624 4036 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 11:51:55.011677 master-0 kubenswrapper[4036]: W0319 11:51:55.007629 4036 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 11:51:55.011677 master-0 kubenswrapper[4036]: W0319 11:51:55.007650 4036 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 11:51:55.011677 master-0 kubenswrapper[4036]: W0319 11:51:55.007655 4036 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 11:51:55.011677 master-0 kubenswrapper[4036]: W0319 11:51:55.007660 4036 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 
11:51:55.011677 master-0 kubenswrapper[4036]: W0319 11:51:55.007667 4036 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 11:51:55.011677 master-0 kubenswrapper[4036]: W0319 11:51:55.007672 4036 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 11:51:55.011677 master-0 kubenswrapper[4036]: W0319 11:51:55.007677 4036 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 11:51:55.011677 master-0 kubenswrapper[4036]: I0319 11:51:55.007687 4036 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 11:51:55.011677 master-0 kubenswrapper[4036]: I0319 11:51:55.007954 4036 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 11:51:55.012319 master-0 kubenswrapper[4036]: I0319 11:51:55.012273 4036 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 19 11:51:55.013910 master-0 kubenswrapper[4036]: I0319 11:51:55.013875 4036 server.go:997] "Starting client certificate rotation" Mar 19 11:51:55.013910 master-0 kubenswrapper[4036]: I0319 11:51:55.013901 4036 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 19 11:51:55.014150 master-0 kubenswrapper[4036]: I0319 11:51:55.014085 4036 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 11:51:55.044533 master-0 kubenswrapper[4036]: I0319 11:51:55.044449 4036 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 11:51:55.048107 master-0 kubenswrapper[4036]: I0319 11:51:55.048017 4036 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 11:51:55.050819 master-0 kubenswrapper[4036]: E0319 11:51:55.050751 4036 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:55.079837 master-0 kubenswrapper[4036]: I0319 11:51:55.079735 4036 log.go:25] "Validated CRI v1 runtime API" Mar 19 11:51:55.085611 master-0 kubenswrapper[4036]: I0319 11:51:55.085561 4036 log.go:25] "Validated CRI v1 image API" Mar 19 11:51:55.088557 master-0 kubenswrapper[4036]: I0319 11:51:55.088503 4036 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 19 11:51:55.093535 master-0 kubenswrapper[4036]: I0319 11:51:55.093468 4036 fs.go:135] Filesystem UUIDs: map[2be6375c-780c-4b64-a018-3b2c170f9318:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Mar 19 11:51:55.093535 master-0 kubenswrapper[4036]: I0319 11:51:55.093520 4036 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}] Mar 19 11:51:55.115223 master-0 kubenswrapper[4036]: I0319 11:51:55.114795 4036 manager.go:217] 
Machine: {Timestamp:2026-03-19 11:51:55.111453092 +0000 UTC m=+0.663873717 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:d9abaffc4eed451b8d7e70f7431f4a5e SystemUUID:d9abaffc-4eed-451b-8d7e-70f7431f4a5e BootID:35523968-247e-452c-b4d2-5f84e7547d23 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:a8:4b:cf Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:ea:ac:db Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:4a:5a:35:ca:89:5d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 
Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data 
Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 11:51:55.115223 master-0 kubenswrapper[4036]: I0319 11:51:55.115140 4036 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 19 11:51:55.115425 master-0 kubenswrapper[4036]: I0319 11:51:55.115333 4036 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 19 11:51:55.116866 master-0 kubenswrapper[4036]: I0319 11:51:55.116764 4036 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 19 11:51:55.117193 master-0 kubenswrapper[4036]: I0319 11:51:55.117107 4036 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 19 11:51:55.117511 master-0 kubenswrapper[4036]: I0319 11:51:55.117166 4036 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 11:51:55.117511 master-0 kubenswrapper[4036]: I0319 11:51:55.117512 4036 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 11:51:55.117750 master-0 kubenswrapper[4036]: I0319 11:51:55.117531 4036 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 11:51:55.117750 master-0 kubenswrapper[4036]: I0319 11:51:55.117555 4036 manager.go:142] 
"Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 11:51:55.118833 master-0 kubenswrapper[4036]: I0319 11:51:55.118774 4036 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 11:51:55.118984 master-0 kubenswrapper[4036]: I0319 11:51:55.118956 4036 state_mem.go:36] "Initialized new in-memory state store" Mar 19 11:51:55.119094 master-0 kubenswrapper[4036]: I0319 11:51:55.119070 4036 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 11:51:55.125020 master-0 kubenswrapper[4036]: I0319 11:51:55.124967 4036 kubelet.go:418] "Attempting to sync node with API server" Mar 19 11:51:55.125020 master-0 kubenswrapper[4036]: I0319 11:51:55.125004 4036 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 11:51:55.125184 master-0 kubenswrapper[4036]: I0319 11:51:55.125067 4036 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 11:51:55.125184 master-0 kubenswrapper[4036]: I0319 11:51:55.125096 4036 kubelet.go:324] "Adding apiserver pod source" Mar 19 11:51:55.125184 master-0 kubenswrapper[4036]: I0319 11:51:55.125115 4036 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 19 11:51:55.130958 master-0 kubenswrapper[4036]: W0319 11:51:55.130866 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:55.131055 master-0 kubenswrapper[4036]: E0319 11:51:55.130988 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 
192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:55.131055 master-0 kubenswrapper[4036]: W0319 11:51:55.130971 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:55.131197 master-0 kubenswrapper[4036]: E0319 11:51:55.131065 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:55.133572 master-0 kubenswrapper[4036]: I0319 11:51:55.133513 4036 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 19 11:51:55.136452 master-0 kubenswrapper[4036]: I0319 11:51:55.136399 4036 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 11:51:55.136735 master-0 kubenswrapper[4036]: I0319 11:51:55.136694 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 11:51:55.136735 master-0 kubenswrapper[4036]: I0319 11:51:55.136733 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 11:51:55.136917 master-0 kubenswrapper[4036]: I0319 11:51:55.136741 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 11:51:55.136917 master-0 kubenswrapper[4036]: I0319 11:51:55.136760 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 11:51:55.136917 master-0 kubenswrapper[4036]: I0319 11:51:55.136768 4036 plugins.go:603] "Loaded 
volume plugin" pluginName="kubernetes.io/nfs" Mar 19 11:51:55.136917 master-0 kubenswrapper[4036]: I0319 11:51:55.136780 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 11:51:55.136917 master-0 kubenswrapper[4036]: I0319 11:51:55.136791 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 11:51:55.136917 master-0 kubenswrapper[4036]: I0319 11:51:55.136797 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 11:51:55.136917 master-0 kubenswrapper[4036]: I0319 11:51:55.136845 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 11:51:55.136917 master-0 kubenswrapper[4036]: I0319 11:51:55.136862 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 11:51:55.136917 master-0 kubenswrapper[4036]: I0319 11:51:55.136878 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 11:51:55.136917 master-0 kubenswrapper[4036]: I0319 11:51:55.136921 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 19 11:51:55.138606 master-0 kubenswrapper[4036]: I0319 11:51:55.138561 4036 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 19 11:51:55.139232 master-0 kubenswrapper[4036]: I0319 11:51:55.139191 4036 server.go:1280] "Started kubelet" Mar 19 11:51:55.141798 master-0 kubenswrapper[4036]: I0319 11:51:55.141561 4036 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 11:51:55.141957 master-0 kubenswrapper[4036]: I0319 11:51:55.141660 4036 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 11:51:55.141957 master-0 kubenswrapper[4036]: I0319 11:51:55.141902 4036 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 19 11:51:55.144919 master-0 kubenswrapper[4036]: I0319 11:51:55.142665 4036 server.go:236] "Starting to serve the podresources API" 
endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 11:51:55.144919 master-0 kubenswrapper[4036]: I0319 11:51:55.143670 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:55.143496 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 19 11:51:55.151164 master-0 kubenswrapper[4036]: I0319 11:51:55.151108 4036 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 11:51:55.151349 master-0 kubenswrapper[4036]: I0319 11:51:55.151271 4036 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 11:51:55.154981 master-0 kubenswrapper[4036]: I0319 11:51:55.154911 4036 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 11:51:55.154981 master-0 kubenswrapper[4036]: I0319 11:51:55.154953 4036 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 11:51:55.154981 master-0 kubenswrapper[4036]: E0319 11:51:55.154954 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 11:51:55.155311 master-0 kubenswrapper[4036]: I0319 11:51:55.155181 4036 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 19 11:51:55.155554 master-0 kubenswrapper[4036]: I0319 11:51:55.155512 4036 reconstruct.go:97] "Volume reconstruction finished" Mar 19 11:51:55.155554 master-0 kubenswrapper[4036]: I0319 11:51:55.155539 4036 reconciler.go:26] "Reconciler: start to sync state" Mar 19 11:51:55.156543 master-0 kubenswrapper[4036]: E0319 11:51:55.156258 4036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection 
refused" interval="200ms" Mar 19 11:51:55.157142 master-0 kubenswrapper[4036]: I0319 11:51:55.157090 4036 server.go:449] "Adding debug handlers to kubelet server" Mar 19 11:51:55.157262 master-0 kubenswrapper[4036]: I0319 11:51:55.157227 4036 factory.go:55] Registering systemd factory Mar 19 11:51:55.157262 master-0 kubenswrapper[4036]: I0319 11:51:55.157254 4036 factory.go:221] Registration of the systemd container factory successfully Mar 19 11:51:55.157679 master-0 kubenswrapper[4036]: W0319 11:51:55.157570 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:55.157776 master-0 kubenswrapper[4036]: E0319 11:51:55.157685 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:55.157879 master-0 kubenswrapper[4036]: E0319 11:51:55.156183 4036 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e3bdb076f1682 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.13915149 +0000 UTC m=+0.691572105,LastTimestamp:2026-03-19 11:51:55.13915149 +0000 UTC m=+0.691572105,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:55.157975 master-0 kubenswrapper[4036]: I0319 11:51:55.157916 4036 factory.go:153] Registering CRI-O factory Mar 19 11:51:55.157975 master-0 kubenswrapper[4036]: I0319 11:51:55.157966 4036 factory.go:221] Registration of the crio container factory successfully Mar 19 11:51:55.158174 master-0 kubenswrapper[4036]: I0319 11:51:55.158124 4036 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 19 11:51:55.158241 master-0 kubenswrapper[4036]: I0319 11:51:55.158173 4036 factory.go:103] Registering Raw factory Mar 19 11:51:55.158241 master-0 kubenswrapper[4036]: I0319 11:51:55.158197 4036 manager.go:1196] Started watching for new ooms in manager Mar 19 11:51:55.158973 master-0 kubenswrapper[4036]: I0319 11:51:55.158932 4036 manager.go:319] Starting recovery of all containers Mar 19 11:51:55.162780 master-0 kubenswrapper[4036]: E0319 11:51:55.162394 4036 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 19 11:51:55.180522 master-0 kubenswrapper[4036]: I0319 11:51:55.180184 4036 manager.go:324] Recovery completed Mar 19 11:51:55.191106 master-0 kubenswrapper[4036]: I0319 11:51:55.191014 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.194122 master-0 kubenswrapper[4036]: I0319 11:51:55.194070 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.194231 master-0 kubenswrapper[4036]: I0319 11:51:55.194132 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.194231 master-0 kubenswrapper[4036]: I0319 11:51:55.194152 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.195164 master-0 kubenswrapper[4036]: I0319 11:51:55.195127 4036 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 11:51:55.195164 master-0 kubenswrapper[4036]: I0319 11:51:55.195162 4036 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 11:51:55.195329 master-0 kubenswrapper[4036]: I0319 11:51:55.195203 4036 state_mem.go:36] "Initialized new in-memory state store" Mar 19 11:51:55.199560 master-0 kubenswrapper[4036]: I0319 11:51:55.199510 4036 policy_none.go:49] "None policy: Start" Mar 19 11:51:55.200922 master-0 kubenswrapper[4036]: I0319 11:51:55.200842 4036 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 11:51:55.200922 master-0 kubenswrapper[4036]: I0319 11:51:55.200905 4036 state_mem.go:35] "Initializing new in-memory state store" Mar 19 11:51:55.256299 master-0 kubenswrapper[4036]: E0319 11:51:55.256053 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 19 11:51:55.292201 
master-0 kubenswrapper[4036]: I0319 11:51:55.292134 4036 manager.go:334] "Starting Device Plugin manager" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.292223 4036 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.292247 4036 server.go:79] "Starting device plugin registration server" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.293101 4036 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.293136 4036 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.293445 4036 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.293754 4036 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.293776 4036 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: E0319 11:51:55.298057 4036 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.307787 4036 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.311151 4036 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.311229 4036 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: I0319 11:51:55.311270 4036 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: E0319 11:51:55.311358 4036 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: W0319 11:51:55.312512 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:55.314200 master-0 kubenswrapper[4036]: E0319 11:51:55.312613 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:55.357479 master-0 kubenswrapper[4036]: E0319 11:51:55.357370 4036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 19 11:51:55.393732 master-0 kubenswrapper[4036]: I0319 11:51:55.393556 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.395042 master-0 kubenswrapper[4036]: I0319 11:51:55.394969 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.395108 master-0 
kubenswrapper[4036]: I0319 11:51:55.395077 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.395108 master-0 kubenswrapper[4036]: I0319 11:51:55.395097 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.395186 master-0 kubenswrapper[4036]: I0319 11:51:55.395160 4036 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:51:55.396385 master-0 kubenswrapper[4036]: E0319 11:51:55.396339 4036 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 11:51:55.411737 master-0 kubenswrapper[4036]: I0319 11:51:55.411670 4036 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 11:51:55.411925 master-0 kubenswrapper[4036]: I0319 11:51:55.411789 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.413507 master-0 kubenswrapper[4036]: I0319 11:51:55.413478 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.413618 master-0 kubenswrapper[4036]: I0319 11:51:55.413607 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.413716 master-0 kubenswrapper[4036]: I0319 11:51:55.413705 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.413973 master-0 kubenswrapper[4036]: I0319 
11:51:55.413957 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.414275 master-0 kubenswrapper[4036]: I0319 11:51:55.414236 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:51:55.414320 master-0 kubenswrapper[4036]: I0319 11:51:55.414306 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.415885 master-0 kubenswrapper[4036]: I0319 11:51:55.415862 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.415885 master-0 kubenswrapper[4036]: I0319 11:51:55.415888 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.415971 master-0 kubenswrapper[4036]: I0319 11:51:55.415898 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.415971 master-0 kubenswrapper[4036]: I0319 11:51:55.415934 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.415971 master-0 kubenswrapper[4036]: I0319 11:51:55.415954 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.416049 master-0 kubenswrapper[4036]: I0319 11:51:55.415989 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.416049 master-0 kubenswrapper[4036]: I0319 11:51:55.416021 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.416514 master-0 kubenswrapper[4036]: I0319 11:51:55.416473 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:51:55.416514 master-0 kubenswrapper[4036]: I0319 11:51:55.416501 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.417081 master-0 kubenswrapper[4036]: I0319 11:51:55.417046 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.417081 master-0 kubenswrapper[4036]: I0319 11:51:55.417069 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.417339 master-0 kubenswrapper[4036]: I0319 11:51:55.417094 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.417339 master-0 kubenswrapper[4036]: I0319 11:51:55.417108 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.417339 master-0 kubenswrapper[4036]: I0319 11:51:55.417075 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.417339 master-0 kubenswrapper[4036]: I0319 11:51:55.417169 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.417339 master-0 kubenswrapper[4036]: I0319 11:51:55.417199 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.417471 master-0 kubenswrapper[4036]: I0319 11:51:55.417341 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.417471 master-0 kubenswrapper[4036]: I0319 11:51:55.417389 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.417791 master-0 kubenswrapper[4036]: I0319 11:51:55.417759 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.417791 master-0 kubenswrapper[4036]: I0319 11:51:55.417784 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.417791 master-0 kubenswrapper[4036]: I0319 11:51:55.417794 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.417912 master-0 kubenswrapper[4036]: I0319 11:51:55.417884 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.418112 master-0 kubenswrapper[4036]: I0319 11:51:55.418076 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.418164 master-0 kubenswrapper[4036]: I0319 11:51:55.418126 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.418476 master-0 kubenswrapper[4036]: I0319 11:51:55.418447 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.418531 master-0 kubenswrapper[4036]: I0319 11:51:55.418480 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.418531 master-0 kubenswrapper[4036]: I0319 11:51:55.418493 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.418660 master-0 kubenswrapper[4036]: I0319 11:51:55.418618 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.418711 master-0 kubenswrapper[4036]: I0319 11:51:55.418664 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.418711 master-0 kubenswrapper[4036]: I0319 11:51:55.418676 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.418834 master-0 kubenswrapper[4036]: I0319 11:51:55.418807 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.418884 master-0 kubenswrapper[4036]: I0319 11:51:55.418839 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.418884 master-0 kubenswrapper[4036]: I0319 11:51:55.418853 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.418941 master-0 
kubenswrapper[4036]: I0319 11:51:55.418821 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:51:55.418941 master-0 kubenswrapper[4036]: I0319 11:51:55.418937 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.419542 master-0 kubenswrapper[4036]: I0319 11:51:55.419518 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.419593 master-0 kubenswrapper[4036]: I0319 11:51:55.419557 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.419593 master-0 kubenswrapper[4036]: I0319 11:51:55.419572 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.456703 master-0 kubenswrapper[4036]: I0319 11:51:55.456567 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:51:55.456703 master-0 kubenswrapper[4036]: I0319 11:51:55.456662 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:51:55.456857 master-0 kubenswrapper[4036]: I0319 11:51:55.456707 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: 
\"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.456857 master-0 kubenswrapper[4036]: I0319 11:51:55.456733 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.456857 master-0 kubenswrapper[4036]: I0319 11:51:55.456756 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.456857 master-0 kubenswrapper[4036]: I0319 11:51:55.456791 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:51:55.456857 master-0 kubenswrapper[4036]: I0319 11:51:55.456830 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:51:55.456857 master-0 kubenswrapper[4036]: I0319 11:51:55.456853 4036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.457102 master-0 kubenswrapper[4036]: I0319 11:51:55.456876 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.457102 master-0 kubenswrapper[4036]: I0319 11:51:55.456903 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.457102 master-0 kubenswrapper[4036]: I0319 11:51:55.456970 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.457102 master-0 kubenswrapper[4036]: I0319 11:51:55.457008 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.457102 master-0 kubenswrapper[4036]: I0319 11:51:55.457040 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.457102 master-0 kubenswrapper[4036]: I0319 11:51:55.457058 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.457102 master-0 kubenswrapper[4036]: I0319 11:51:55.457083 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:51:55.457102 master-0 kubenswrapper[4036]: I0319 11:51:55.457101 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:51:55.457367 master-0 kubenswrapper[4036]: I0319 11:51:55.457118 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.558540 master-0 kubenswrapper[4036]: I0319 11:51:55.558435 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.558540 master-0 kubenswrapper[4036]: I0319 11:51:55.558528 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.559665 master-0 kubenswrapper[4036]: I0319 11:51:55.558583 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.559665 master-0 kubenswrapper[4036]: I0319 11:51:55.559365 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.559665 master-0 kubenswrapper[4036]: I0319 11:51:55.559420 4036 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.559665 master-0 kubenswrapper[4036]: I0319 11:51:55.559445 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.559665 master-0 kubenswrapper[4036]: I0319 11:51:55.559513 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.559665 master-0 kubenswrapper[4036]: I0319 11:51:55.559562 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.559665 master-0 kubenswrapper[4036]: I0319 11:51:55.559582 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:51:55.559969 master-0 kubenswrapper[4036]: I0319 11:51:55.559698 
4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:51:55.559969 master-0 kubenswrapper[4036]: I0319 11:51:55.559782 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:51:55.559969 master-0 kubenswrapper[4036]: I0319 11:51:55.559772 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.559969 master-0 kubenswrapper[4036]: I0319 11:51:55.559852 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.559969 master-0 kubenswrapper[4036]: I0319 11:51:55.559831 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:51:55.559969 master-0 kubenswrapper[4036]: I0319 11:51:55.559930 4036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.559969 master-0 kubenswrapper[4036]: I0319 11:51:55.559966 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.560223 master-0 kubenswrapper[4036]: I0319 11:51:55.559998 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:51:55.560223 master-0 kubenswrapper[4036]: I0319 11:51:55.560025 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.560223 master-0 kubenswrapper[4036]: I0319 11:51:55.560063 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.560223 master-0 kubenswrapper[4036]: I0319 11:51:55.560127 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:51:55.560223 master-0 kubenswrapper[4036]: I0319 11:51:55.560135 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.560223 master-0 kubenswrapper[4036]: I0319 11:51:55.560161 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.560463 master-0 kubenswrapper[4036]: I0319 11:51:55.560259 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:51:55.560463 master-0 kubenswrapper[4036]: I0319 11:51:55.560291 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: 
\"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.560463 master-0 kubenswrapper[4036]: I0319 11:51:55.560319 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:51:55.560463 master-0 kubenswrapper[4036]: I0319 11:51:55.560346 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:51:55.560463 master-0 kubenswrapper[4036]: I0319 11:51:55.560380 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.560719 master-0 kubenswrapper[4036]: I0319 11:51:55.560375 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:51:55.560719 master-0 kubenswrapper[4036]: I0319 11:51:55.560539 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.560719 master-0 kubenswrapper[4036]: I0319 11:51:55.560613 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.560719 master-0 kubenswrapper[4036]: I0319 11:51:55.560586 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.560719 master-0 kubenswrapper[4036]: I0319 11:51:55.560685 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.560898 master-0 kubenswrapper[4036]: I0319 11:51:55.560774 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:51:55.560898 master-0 kubenswrapper[4036]: I0319 11:51:55.560774 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " 
pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:51:55.596775 master-0 kubenswrapper[4036]: I0319 11:51:55.596699 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:55.598342 master-0 kubenswrapper[4036]: I0319 11:51:55.598315 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:55.598554 master-0 kubenswrapper[4036]: I0319 11:51:55.598516 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:55.598554 master-0 kubenswrapper[4036]: I0319 11:51:55.598546 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:55.598709 master-0 kubenswrapper[4036]: I0319 11:51:55.598614 4036 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:51:55.599599 master-0 kubenswrapper[4036]: E0319 11:51:55.599556 4036 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 11:51:55.750868 master-0 kubenswrapper[4036]: I0319 11:51:55.750732 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:51:55.759145 master-0 kubenswrapper[4036]: E0319 11:51:55.759058 4036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 11:51:55.762809 master-0 kubenswrapper[4036]: I0319 11:51:55.762354 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:51:55.786127 master-0 kubenswrapper[4036]: I0319 11:51:55.786087 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:51:55.804903 master-0 kubenswrapper[4036]: I0319 11:51:55.804779 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:51:55.813442 master-0 kubenswrapper[4036]: I0319 11:51:55.813391 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:51:56.000696 master-0 kubenswrapper[4036]: I0319 11:51:56.000290 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:56.002395 master-0 kubenswrapper[4036]: I0319 11:51:56.002366 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:56.002457 master-0 kubenswrapper[4036]: I0319 11:51:56.002419 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:56.002457 master-0 kubenswrapper[4036]: I0319 11:51:56.002439 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:56.002526 master-0 kubenswrapper[4036]: I0319 11:51:56.002513 4036 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:51:56.003788 master-0 kubenswrapper[4036]: E0319 11:51:56.003736 4036 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 11:51:56.069534 master-0 kubenswrapper[4036]: W0319 11:51:56.069268 4036 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:56.069534 master-0 kubenswrapper[4036]: E0319 11:51:56.069368 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:56.126790 master-0 kubenswrapper[4036]: W0319 11:51:56.126584 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:56.126790 master-0 kubenswrapper[4036]: E0319 11:51:56.126796 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:56.145942 master-0 kubenswrapper[4036]: I0319 11:51:56.145870 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:56.370956 master-0 kubenswrapper[4036]: W0319 11:51:56.370812 4036 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd664a6d0d2a24360dee10612610f1b59.slice/crio-97aaa7810039a211ed9eb18ec086e48818b5f53c2190c323cb9dff4622d57e09 WatchSource:0}: Error finding container 97aaa7810039a211ed9eb18ec086e48818b5f53c2190c323cb9dff4622d57e09: Status 404 returned error can't find the container with id 97aaa7810039a211ed9eb18ec086e48818b5f53c2190c323cb9dff4622d57e09 Mar 19 11:51:56.372338 master-0 kubenswrapper[4036]: W0319 11:51:56.372287 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49fac1b46a11e49501805e891baae4a9.slice/crio-488f50284501a5ea1b55784b6ea985665fdedb86fee068aba655477316aea17a WatchSource:0}: Error finding container 488f50284501a5ea1b55784b6ea985665fdedb86fee068aba655477316aea17a: Status 404 returned error can't find the container with id 488f50284501a5ea1b55784b6ea985665fdedb86fee068aba655477316aea17a Mar 19 11:51:56.378526 master-0 kubenswrapper[4036]: I0319 11:51:56.378505 4036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:51:56.390983 master-0 kubenswrapper[4036]: W0319 11:51:56.390913 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1249822f86f23526277d165c0d5d3c19.slice/crio-2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459 WatchSource:0}: Error finding container 2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459: Status 404 returned error can't find the container with id 2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459 Mar 19 11:51:56.473347 master-0 kubenswrapper[4036]: W0319 11:51:56.473261 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 
192.168.32.10:6443: connect: connection refused Mar 19 11:51:56.473347 master-0 kubenswrapper[4036]: E0319 11:51:56.473352 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:56.560858 master-0 kubenswrapper[4036]: E0319 11:51:56.560758 4036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 19 11:51:56.655773 master-0 kubenswrapper[4036]: W0319 11:51:56.655647 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:56.655773 master-0 kubenswrapper[4036]: E0319 11:51:56.655757 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:56.804430 master-0 kubenswrapper[4036]: I0319 11:51:56.804354 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:56.806330 master-0 kubenswrapper[4036]: I0319 11:51:56.806269 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:56.806330 master-0 
kubenswrapper[4036]: I0319 11:51:56.806320 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:56.806330 master-0 kubenswrapper[4036]: I0319 11:51:56.806332 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:56.806519 master-0 kubenswrapper[4036]: I0319 11:51:56.806397 4036 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:51:56.807729 master-0 kubenswrapper[4036]: E0319 11:51:56.807615 4036 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 11:51:56.946893 master-0 kubenswrapper[4036]: W0319 11:51:56.946822 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f265536aba6292ead501bc9b49f327.slice/crio-6e2ac089de599dfd6512e3d47cbeeed8ac50e52fa04720c3256b653ca97af81a WatchSource:0}: Error finding container 6e2ac089de599dfd6512e3d47cbeeed8ac50e52fa04720c3256b653ca97af81a: Status 404 returned error can't find the container with id 6e2ac089de599dfd6512e3d47cbeeed8ac50e52fa04720c3256b653ca97af81a Mar 19 11:51:57.145720 master-0 kubenswrapper[4036]: I0319 11:51:57.145649 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:57.195267 master-0 kubenswrapper[4036]: I0319 11:51:57.195106 4036 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 11:51:57.196711 master-0 kubenswrapper[4036]: E0319 11:51:57.196656 4036 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:57.319494 master-0 kubenswrapper[4036]: I0319 11:51:57.319091 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"6e2ac089de599dfd6512e3d47cbeeed8ac50e52fa04720c3256b653ca97af81a"} Mar 19 11:51:57.320911 master-0 kubenswrapper[4036]: I0319 11:51:57.320797 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"eb1ab82ad9ee6b742e302a5269a9952a1cd4daf4667abd8007cfbd42d95d64c7"} Mar 19 11:51:57.321930 master-0 kubenswrapper[4036]: I0319 11:51:57.321895 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459"} Mar 19 11:51:57.322910 master-0 kubenswrapper[4036]: I0319 11:51:57.322866 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"97aaa7810039a211ed9eb18ec086e48818b5f53c2190c323cb9dff4622d57e09"} Mar 19 11:51:57.324100 master-0 kubenswrapper[4036]: I0319 11:51:57.324042 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" 
event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"488f50284501a5ea1b55784b6ea985665fdedb86fee068aba655477316aea17a"} Mar 19 11:51:57.689793 master-0 kubenswrapper[4036]: W0319 11:51:57.689730 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:57.690058 master-0 kubenswrapper[4036]: E0319 11:51:57.689808 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:57.804332 master-0 kubenswrapper[4036]: E0319 11:51:57.804149 4036 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189e3bdb076f1682 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.13915149 +0000 UTC m=+0.691572105,LastTimestamp:2026-03-19 11:51:55.13915149 +0000 UTC m=+0.691572105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:51:58.113726 master-0 kubenswrapper[4036]: W0319 11:51:58.113572 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:58.113726 master-0 kubenswrapper[4036]: E0319 11:51:58.113657 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:58.145455 master-0 kubenswrapper[4036]: I0319 11:51:58.145390 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:58.162690 master-0 kubenswrapper[4036]: E0319 11:51:58.162606 4036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 19 11:51:58.408700 master-0 kubenswrapper[4036]: I0319 11:51:58.407950 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:58.409392 master-0 kubenswrapper[4036]: I0319 11:51:58.409364 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:58.409456 master-0 kubenswrapper[4036]: I0319 11:51:58.409404 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:58.409456 master-0 kubenswrapper[4036]: I0319 11:51:58.409418 4036 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:58.409529 master-0 kubenswrapper[4036]: I0319 11:51:58.409486 4036 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:51:58.410553 master-0 kubenswrapper[4036]: E0319 11:51:58.410489 4036 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 11:51:58.839357 master-0 kubenswrapper[4036]: W0319 11:51:58.838847 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:58.839697 master-0 kubenswrapper[4036]: E0319 11:51:58.839373 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:59.146485 master-0 kubenswrapper[4036]: I0319 11:51:59.146437 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:59.241198 master-0 kubenswrapper[4036]: W0319 11:51:59.241147 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:51:59.241198 master-0 kubenswrapper[4036]: E0319 11:51:59.241204 
4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:51:59.330137 master-0 kubenswrapper[4036]: I0319 11:51:59.329980 4036 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="763a20d7b98822675af380cf6397cfba29e067a2eddf07fc09dc4a05e4d6fa6d" exitCode=0 Mar 19 11:51:59.330137 master-0 kubenswrapper[4036]: I0319 11:51:59.330057 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"763a20d7b98822675af380cf6397cfba29e067a2eddf07fc09dc4a05e4d6fa6d"} Mar 19 11:51:59.330381 master-0 kubenswrapper[4036]: I0319 11:51:59.330196 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:59.332084 master-0 kubenswrapper[4036]: I0319 11:51:59.332037 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:59.332084 master-0 kubenswrapper[4036]: I0319 11:51:59.332066 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:59.332084 master-0 kubenswrapper[4036]: I0319 11:51:59.332077 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:51:59.333848 master-0 kubenswrapper[4036]: I0319 11:51:59.333824 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" 
event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e"} Mar 19 11:51:59.333919 master-0 kubenswrapper[4036]: I0319 11:51:59.333849 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab"} Mar 19 11:51:59.333965 master-0 kubenswrapper[4036]: I0319 11:51:59.333930 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:51:59.334864 master-0 kubenswrapper[4036]: I0319 11:51:59.334826 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:51:59.334864 master-0 kubenswrapper[4036]: I0319 11:51:59.334863 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:51:59.334957 master-0 kubenswrapper[4036]: I0319 11:51:59.334875 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:00.145177 master-0 kubenswrapper[4036]: I0319 11:52:00.145114 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:52:00.337772 master-0 kubenswrapper[4036]: I0319 11:52:00.337727 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log" Mar 19 11:52:00.338389 master-0 kubenswrapper[4036]: I0319 11:52:00.338156 4036 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" 
containerID="eef578397b65d24f157c007acb286999506a3f26e6a2bb67331d172018c381cb" exitCode=1 Mar 19 11:52:00.338389 master-0 kubenswrapper[4036]: I0319 11:52:00.338261 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:00.338688 master-0 kubenswrapper[4036]: I0319 11:52:00.338674 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:00.338909 master-0 kubenswrapper[4036]: I0319 11:52:00.338886 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"eef578397b65d24f157c007acb286999506a3f26e6a2bb67331d172018c381cb"} Mar 19 11:52:00.339222 master-0 kubenswrapper[4036]: I0319 11:52:00.339206 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:00.339270 master-0 kubenswrapper[4036]: I0319 11:52:00.339230 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:00.339270 master-0 kubenswrapper[4036]: I0319 11:52:00.339238 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:00.339378 master-0 kubenswrapper[4036]: I0319 11:52:00.339333 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:00.339426 master-0 kubenswrapper[4036]: I0319 11:52:00.339386 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:00.339426 master-0 kubenswrapper[4036]: I0319 11:52:00.339399 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:00.339601 master-0 kubenswrapper[4036]: I0319 11:52:00.339586 
4036 scope.go:117] "RemoveContainer" containerID="eef578397b65d24f157c007acb286999506a3f26e6a2bb67331d172018c381cb" Mar 19 11:52:01.145829 master-0 kubenswrapper[4036]: I0319 11:52:01.145714 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:52:01.344315 master-0 kubenswrapper[4036]: I0319 11:52:01.344274 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log" Mar 19 11:52:01.344947 master-0 kubenswrapper[4036]: I0319 11:52:01.344799 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/0.log" Mar 19 11:52:01.346492 master-0 kubenswrapper[4036]: I0319 11:52:01.346450 4036 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="5f9bd5f4bb7946a7a98d266917b6484351aca514282a3c8fad2468816b1a00c3" exitCode=1 Mar 19 11:52:01.346548 master-0 kubenswrapper[4036]: I0319 11:52:01.346507 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"5f9bd5f4bb7946a7a98d266917b6484351aca514282a3c8fad2468816b1a00c3"} Mar 19 11:52:01.346584 master-0 kubenswrapper[4036]: I0319 11:52:01.346558 4036 scope.go:117] "RemoveContainer" containerID="eef578397b65d24f157c007acb286999506a3f26e6a2bb67331d172018c381cb" Mar 19 11:52:01.346683 master-0 kubenswrapper[4036]: I0319 11:52:01.346657 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:01.347348 master-0 kubenswrapper[4036]: 
I0319 11:52:01.347324 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:01.347399 master-0 kubenswrapper[4036]: I0319 11:52:01.347357 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:01.347399 master-0 kubenswrapper[4036]: I0319 11:52:01.347370 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:01.348033 master-0 kubenswrapper[4036]: I0319 11:52:01.347884 4036 scope.go:117] "RemoveContainer" containerID="5f9bd5f4bb7946a7a98d266917b6484351aca514282a3c8fad2468816b1a00c3" Mar 19 11:52:01.348082 master-0 kubenswrapper[4036]: E0319 11:52:01.348053 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19" Mar 19 11:52:01.364220 master-0 kubenswrapper[4036]: E0319 11:52:01.364167 4036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 19 11:52:01.385343 master-0 kubenswrapper[4036]: I0319 11:52:01.385287 4036 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 11:52:01.386748 master-0 kubenswrapper[4036]: E0319 11:52:01.386723 4036 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create 
certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:52:01.611955 master-0 kubenswrapper[4036]: I0319 11:52:01.611250 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:01.640378 master-0 kubenswrapper[4036]: I0319 11:52:01.640295 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:01.640378 master-0 kubenswrapper[4036]: I0319 11:52:01.640377 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:01.640378 master-0 kubenswrapper[4036]: I0319 11:52:01.640394 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:01.640728 master-0 kubenswrapper[4036]: I0319 11:52:01.640486 4036 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 11:52:01.641463 master-0 kubenswrapper[4036]: E0319 11:52:01.641421 4036 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 19 11:52:02.145157 master-0 kubenswrapper[4036]: I0319 11:52:02.145054 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:52:02.351217 master-0 kubenswrapper[4036]: I0319 11:52:02.351142 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log" Mar 19 
11:52:02.352119 master-0 kubenswrapper[4036]: I0319 11:52:02.352082 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:02.353095 master-0 kubenswrapper[4036]: I0319 11:52:02.353061 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:02.353205 master-0 kubenswrapper[4036]: I0319 11:52:02.353110 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:02.353205 master-0 kubenswrapper[4036]: I0319 11:52:02.353126 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:02.353561 master-0 kubenswrapper[4036]: I0319 11:52:02.353534 4036 scope.go:117] "RemoveContainer" containerID="5f9bd5f4bb7946a7a98d266917b6484351aca514282a3c8fad2468816b1a00c3" Mar 19 11:52:02.353754 master-0 kubenswrapper[4036]: E0319 11:52:02.353725 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19" Mar 19 11:52:02.431547 master-0 kubenswrapper[4036]: W0319 11:52:02.431437 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:52:02.431547 master-0 kubenswrapper[4036]: E0319 11:52:02.431529 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: 
Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:52:02.737080 master-0 kubenswrapper[4036]: W0319 11:52:02.736873 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:52:02.737080 master-0 kubenswrapper[4036]: E0319 11:52:02.736966 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:52:03.145186 master-0 kubenswrapper[4036]: I0319 11:52:03.145000 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:52:04.145867 master-0 kubenswrapper[4036]: I0319 11:52:04.145764 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:52:04.485824 master-0 kubenswrapper[4036]: W0319 11:52:04.485609 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:52:04.485824 master-0 
kubenswrapper[4036]: E0319 11:52:04.485718 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:52:04.677547 master-0 kubenswrapper[4036]: W0319 11:52:04.677394 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:52:04.677547 master-0 kubenswrapper[4036]: E0319 11:52:04.677510 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 19 11:52:05.145546 master-0 kubenswrapper[4036]: I0319 11:52:05.145483 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 19 11:52:05.298309 master-0 kubenswrapper[4036]: E0319 11:52:05.298256 4036 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 19 11:52:05.361121 master-0 kubenswrapper[4036]: I0319 11:52:05.361028 4036 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4" exitCode=0 Mar 19 11:52:05.361121 
master-0 kubenswrapper[4036]: I0319 11:52:05.361114 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerDied","Data":"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4"} Mar 19 11:52:05.361413 master-0 kubenswrapper[4036]: I0319 11:52:05.361244 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:05.362049 master-0 kubenswrapper[4036]: I0319 11:52:05.362010 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:05.362049 master-0 kubenswrapper[4036]: I0319 11:52:05.362041 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:05.362049 master-0 kubenswrapper[4036]: I0319 11:52:05.362050 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:05.363730 master-0 kubenswrapper[4036]: I0319 11:52:05.363692 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"a751b53c76eb83162f3d88044f06fc209b871d360365f0332303ca5b041a66eb"} Mar 19 11:52:05.364927 master-0 kubenswrapper[4036]: I0319 11:52:05.364898 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"79f33a58d544fe5d9dca133c52ef1b1ffeca07d3925205b6553e47537ece7b51"} Mar 19 11:52:05.365000 master-0 kubenswrapper[4036]: I0319 11:52:05.364972 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:05.365673 master-0 kubenswrapper[4036]: I0319 11:52:05.365618 4036 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:05.366136 master-0 kubenswrapper[4036]: I0319 11:52:05.366100 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:05.366136 master-0 kubenswrapper[4036]: I0319 11:52:05.366128 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:05.366242 master-0 kubenswrapper[4036]: I0319 11:52:05.366141 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:05.366692 master-0 kubenswrapper[4036]: I0319 11:52:05.366654 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:05.366692 master-0 kubenswrapper[4036]: I0319 11:52:05.366680 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:05.366692 master-0 kubenswrapper[4036]: I0319 11:52:05.366690 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:06.372254 master-0 kubenswrapper[4036]: I0319 11:52:06.371778 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072"} Mar 19 11:52:06.372254 master-0 kubenswrapper[4036]: I0319 11:52:06.371802 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:06.372792 master-0 kubenswrapper[4036]: I0319 11:52:06.372569 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:06.372792 master-0 kubenswrapper[4036]: 
I0319 11:52:06.372626 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:06.372792 master-0 kubenswrapper[4036]: I0319 11:52:06.372666 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:06.890805 master-0 kubenswrapper[4036]: I0319 11:52:06.889864 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:52:07.150426 master-0 kubenswrapper[4036]: I0319 11:52:07.150393 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:52:07.771268 master-0 kubenswrapper[4036]: E0319 11:52:07.771223 4036 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 11:52:07.811185 master-0 kubenswrapper[4036]: E0319 11:52:07.810775 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb076f1682 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.13915149 +0000 UTC 
m=+0.691572105,LastTimestamp:2026-03-19 11:51:55.13915149 +0000 UTC m=+0.691572105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.815475 master-0 kubenswrapper[4036]: E0319 11:52:07.815356 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab5ae06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194109446 +0000 UTC m=+0.746530061,LastTimestamp:2026-03-19 11:51:55.194109446 +0000 UTC m=+0.746530061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.840486 master-0 kubenswrapper[4036]: E0319 11:52:07.839878 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab636c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194144454 +0000 UTC m=+0.746565079,LastTimestamp:2026-03-19 11:51:55.194144454 +0000 UTC m=+0.746565079,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.855299 master-0 kubenswrapper[4036]: E0319 11:52:07.855151 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab68189 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194163593 +0000 UTC m=+0.746584218,LastTimestamp:2026-03-19 11:51:55.194163593 +0000 UTC m=+0.746584218,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.860146 master-0 kubenswrapper[4036]: E0319 11:52:07.860038 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb10ea424a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.29821857 +0000 UTC m=+0.850639155,LastTimestamp:2026-03-19 11:51:55.29821857 +0000 UTC m=+0.850639155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.866280 master-0 
kubenswrapper[4036]: E0319 11:52:07.866185 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab5ae06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab5ae06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194109446 +0000 UTC m=+0.746530061,LastTimestamp:2026-03-19 11:51:55.395057556 +0000 UTC m=+0.947478171,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.884857 master-0 kubenswrapper[4036]: E0319 11:52:07.884253 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab636c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab636c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194144454 +0000 UTC m=+0.746565079,LastTimestamp:2026-03-19 11:51:55.395090947 +0000 UTC m=+0.947511552,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.903672 master-0 kubenswrapper[4036]: E0319 11:52:07.900361 4036 event.go:359] "Server 
rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab68189\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab68189 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194163593 +0000 UTC m=+0.746584218,LastTimestamp:2026-03-19 11:51:55.395107178 +0000 UTC m=+0.947527773,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.913816 master-0 kubenswrapper[4036]: E0319 11:52:07.912948 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab5ae06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab5ae06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194109446 +0000 UTC m=+0.746530061,LastTimestamp:2026-03-19 11:51:55.413593928 +0000 UTC m=+0.966014523,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.928740 master-0 kubenswrapper[4036]: E0319 11:52:07.926140 4036 event.go:359] "Server rejected event (will not retry!)" err="events 
\"master-0.189e3bdb0ab636c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab636c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194144454 +0000 UTC m=+0.746565079,LastTimestamp:2026-03-19 11:51:55.413695113 +0000 UTC m=+0.966115708,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.939368 master-0 kubenswrapper[4036]: E0319 11:52:07.939082 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab68189\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab68189 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194163593 +0000 UTC m=+0.746584218,LastTimestamp:2026-03-19 11:51:55.413767476 +0000 UTC m=+0.966188061,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.949060 master-0 kubenswrapper[4036]: E0319 11:52:07.948910 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab5ae06\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab5ae06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194109446 +0000 UTC m=+0.746530061,LastTimestamp:2026-03-19 11:51:55.415882206 +0000 UTC m=+0.968302791,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.960654 master-0 kubenswrapper[4036]: E0319 11:52:07.957436 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab636c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab636c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194144454 +0000 UTC m=+0.746565079,LastTimestamp:2026-03-19 11:51:55.415894016 +0000 UTC m=+0.968314601,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.963159 master-0 kubenswrapper[4036]: E0319 11:52:07.962068 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab68189\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab68189 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194163593 +0000 UTC m=+0.746584218,LastTimestamp:2026-03-19 11:51:55.415903587 +0000 UTC m=+0.968324172,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.966164 master-0 kubenswrapper[4036]: E0319 11:52:07.966083 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab5ae06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab5ae06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194109446 +0000 UTC m=+0.746530061,LastTimestamp:2026-03-19 11:51:55.415949189 +0000 UTC m=+0.968369784,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.970321 master-0 kubenswrapper[4036]: E0319 11:52:07.970239 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab636c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab636c6 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194144454 +0000 UTC m=+0.746565079,LastTimestamp:2026-03-19 11:51:55.415961299 +0000 UTC m=+0.968381904,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.977322 master-0 kubenswrapper[4036]: E0319 11:52:07.977196 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab68189\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab68189 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194163593 +0000 UTC m=+0.746584218,LastTimestamp:2026-03-19 11:51:55.415997941 +0000 UTC m=+0.968418536,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.983581 master-0 kubenswrapper[4036]: E0319 11:52:07.983447 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab5ae06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab5ae06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194109446 +0000 UTC m=+0.746530061,LastTimestamp:2026-03-19 11:51:55.417066031 +0000 UTC m=+0.969486616,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.990697 master-0 kubenswrapper[4036]: E0319 11:52:07.990545 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab5ae06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab5ae06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194109446 +0000 UTC m=+0.746530061,LastTimestamp:2026-03-19 11:51:55.417088412 +0000 UTC m=+0.969509007,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:07.995978 master-0 kubenswrapper[4036]: E0319 11:52:07.995692 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab636c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab636c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194144454 +0000 UTC m=+0.746565079,LastTimestamp:2026-03-19 11:51:55.417104193 +0000 UTC m=+0.969524778,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:08.001799 master-0 kubenswrapper[4036]: E0319 11:52:08.001659 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab68189\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab68189 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194163593 +0000 UTC m=+0.746584218,LastTimestamp:2026-03-19 11:51:55.417114393 +0000 UTC m=+0.969534978,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:08.006116 master-0 kubenswrapper[4036]: E0319 11:52:08.005982 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab636c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab636c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194144454 +0000 UTC m=+0.746565079,LastTimestamp:2026-03-19 11:51:55.417162786 +0000 UTC m=+0.969583391,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:08.010415 master-0 kubenswrapper[4036]: E0319 11:52:08.010340 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab68189\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab68189 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194163593 +0000 UTC m=+0.746584218,LastTimestamp:2026-03-19 11:51:55.417175966 +0000 UTC m=+0.969596551,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:08.015621 master-0 kubenswrapper[4036]: E0319 11:52:08.015551 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab5ae06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab5ae06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194109446 +0000 UTC m=+0.746530061,LastTimestamp:2026-03-19 11:51:55.417775065 +0000 UTC m=+0.970195650,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:08.019379 master-0 kubenswrapper[4036]: E0319 11:52:08.019296 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189e3bdb0ab636c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189e3bdb0ab636c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:55.194144454 +0000 UTC m=+0.746565079,LastTimestamp:2026-03-19 11:51:55.417790435 +0000 UTC m=+0.970211020,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:08.025134 master-0 kubenswrapper[4036]: E0319 11:52:08.024904 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bdb514d7039 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:56.378460217 +0000 UTC m=+1.930880812,LastTimestamp:2026-03-19 11:51:56.378460217 +0000 UTC m=+1.930880812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.029409 master-0 kubenswrapper[4036]: E0319 11:52:08.029295 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bdb514e740a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:56.37852673 +0000 UTC m=+1.930947355,LastTimestamp:2026-03-19 11:51:56.37852673 +0000 UTC m=+1.930947355,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.032952 master-0 kubenswrapper[4036]: E0319 11:52:08.032849 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdb522f844e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:56.393276494 +0000 UTC m=+1.945697099,LastTimestamp:2026-03-19 11:51:56.393276494 +0000 UTC m=+1.945697099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.037822 master-0 kubenswrapper[4036]: E0319 11:52:08.037443 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3bdb60e3a46a kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:56.639962218 +0000 UTC m=+2.192382803,LastTimestamp:2026-03-19 11:51:56.639962218 +0000 UTC m=+2.192382803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.041961 master-0 kubenswrapper[4036]: I0319 11:52:08.041928 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:08.043281 master-0 kubenswrapper[4036]: E0319 11:52:08.042956 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bdb7352e501 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:56.949243137 +0000 UTC m=+2.501663732,LastTimestamp:2026-03-19 11:51:56.949243137 +0000 UTC m=+2.501663732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.043988 master-0 kubenswrapper[4036]: I0319 11:52:08.043538 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:08.043988 master-0 kubenswrapper[4036]: I0319 11:52:08.043573 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:08.043988 master-0 kubenswrapper[4036]: I0319 11:52:08.043585 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:08.043988 master-0 kubenswrapper[4036]: I0319 11:52:08.043686 4036 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:52:08.049347 master-0 kubenswrapper[4036]: E0319 11:52:08.049323 4036 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 19 11:52:08.049694 master-0 kubenswrapper[4036]: E0319 11:52:08.049563 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdbdf5e129e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" in 2.368s (2.368s including waiting). Image size: 465090934 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:58.761915038 +0000 UTC m=+4.314335623,LastTimestamp:2026-03-19 11:51:58.761915038 +0000 UTC m=+4.314335623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.053818 master-0 kubenswrapper[4036]: E0319 11:52:08.053412 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bdbe08e505f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\" in 2.403s (2.403s including waiting). Image size: 529326739 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:58.781853791 +0000 UTC m=+4.334274386,LastTimestamp:2026-03-19 11:51:58.781853791 +0000 UTC m=+4.334274386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.057769 master-0 kubenswrapper[4036]: E0319 11:52:08.057684 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdbec75a02f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:58.981562415 +0000 UTC m=+4.533983000,LastTimestamp:2026-03-19 11:51:58.981562415 +0000 UTC m=+4.533983000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.061842 master-0 kubenswrapper[4036]: E0319 11:52:08.061588 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bdbeca5fd0a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:58.984731914 +0000 UTC m=+4.537152499,LastTimestamp:2026-03-19 11:51:58.984731914 +0000 UTC m=+4.537152499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.066518 master-0 kubenswrapper[4036]: E0319 11:52:08.066401 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdbed628b8b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:58.997089163 +0000 UTC m=+4.549509748,LastTimestamp:2026-03-19 11:51:58.997089163 +0000 UTC m=+4.549509748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.071312 master-0 kubenswrapper[4036]: E0319 11:52:08.071153 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bdbedb6f0f5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.002620149 +0000 UTC m=+4.555040734,LastTimestamp:2026-03-19 11:51:59.002620149 +0000 UTC m=+4.555040734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.075398 master-0 kubenswrapper[4036]: E0319 11:52:08.075328 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bdbede40386 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.005574022 +0000 UTC m=+4.557994607,LastTimestamp:2026-03-19 11:51:59.005574022 +0000 UTC m=+4.557994607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.079916 master-0 kubenswrapper[4036]: E0319 11:52:08.079781 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bdbf9db3d16 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.206325526 +0000 UTC m=+4.758746111,LastTimestamp:2026-03-19 11:51:59.206325526 +0000 UTC m=+4.758746111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.084253 master-0 kubenswrapper[4036]: E0319 11:52:08.084112 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189e3bdbfad7f0fd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:d664a6d0d2a24360dee10612610f1b59,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.222886653 +0000 UTC m=+4.775307238,LastTimestamp:2026-03-19 11:51:59.222886653 +0000 UTC m=+4.775307238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.090958 master-0 kubenswrapper[4036]: E0319 11:52:08.088739 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc018184f7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.334663415 +0000 UTC m=+4.887084000,LastTimestamp:2026-03-19 11:51:59.334663415 +0000 UTC m=+4.887084000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.094988 master-0 kubenswrapper[4036]: E0319 11:52:08.094870 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc0d0764e3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.527986403 +0000 UTC m=+5.080406988,LastTimestamp:2026-03-19 11:51:59.527986403 +0000 UTC m=+5.080406988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.100416 master-0 kubenswrapper[4036]: E0319 11:52:08.099713 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc0dab5bbb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.538731963 +0000 UTC m=+5.091152548,LastTimestamp:2026-03-19 11:51:59.538731963 +0000 UTC m=+5.091152548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.106146 master-0 kubenswrapper[4036]: E0319 11:52:08.105970 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bdc018184f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc018184f7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.334663415 +0000 UTC m=+4.887084000,LastTimestamp:2026-03-19 11:52:00.344602733 +0000 UTC m=+5.897023318,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.113225 master-0 kubenswrapper[4036]: E0319 11:52:08.113099 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bdc0d0764e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc0d0764e3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.527986403 +0000 UTC m=+5.080406988,LastTimestamp:2026-03-19 11:52:00.550219464 +0000 UTC m=+6.102640059,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.118192 master-0 kubenswrapper[4036]: E0319 11:52:08.118069 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bdc0dab5bbb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc0dab5bbb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.538731963 +0000 UTC m=+5.091152548,LastTimestamp:2026-03-19 11:52:00.563557466 +0000 UTC m=+6.115978051,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.122729 master-0 kubenswrapper[4036]: E0319 11:52:08.122601 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc7982e200 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:01.348018688 +0000 UTC m=+6.900439273,LastTimestamp:2026-03-19 11:52:01.348018688 +0000 UTC m=+6.900439273,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.129151 master-0 kubenswrapper[4036]: E0319 11:52:08.129011 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bdc7982e200\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc7982e200 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:01.348018688 +0000 UTC m=+6.900439273,LastTimestamp:2026-03-19 11:52:02.353692446 +0000 UTC m=+7.906113031,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.133780 master-0 kubenswrapper[4036]: E0319 11:52:08.133541 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3bdd50092506 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 8.307s (8.307s including waiting). Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:04.947141894 +0000 UTC m=+10.499562469,LastTimestamp:2026-03-19 11:52:04.947141894 +0000 UTC m=+10.499562469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.138036 master-0 kubenswrapper[4036]: E0319 11:52:08.137901 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bdd5091175d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 8.577s (8.577s including waiting). Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:04.956051293 +0000 UTC m=+10.508471878,LastTimestamp:2026-03-19 11:52:04.956051293 +0000 UTC m=+10.508471878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.145228 master-0 kubenswrapper[4036]: I0319 11:52:08.145188 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:52:08.145228 master-0 kubenswrapper[4036]: E0319 11:52:08.145135 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bdd52eb1915 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" in 8.046s (8.046s including waiting). Image size: 943841779 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:04.995504405 +0000 UTC m=+10.547925030,LastTimestamp:2026-03-19 11:52:04.995504405 +0000 UTC m=+10.547925030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.150617 master-0 kubenswrapper[4036]: E0319 11:52:08.149885 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3bdd5c895ebd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.156871869 +0000 UTC m=+10.709292464,LastTimestamp:2026-03-19 11:52:05.156871869 +0000 UTC m=+10.709292464,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.156303 master-0 kubenswrapper[4036]: E0319 11:52:08.156169 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bdd5cd64800 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.16191232 +0000 UTC m=+10.714332905,LastTimestamp:2026-03-19 11:52:05.16191232 +0000 UTC m=+10.714332905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.163316 master-0 kubenswrapper[4036]: E0319 11:52:08.162724 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189e3bdd5d36d022 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:c83737980b9ee109184b1d78e942cf36,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.168238626 +0000 UTC m=+10.720659231,LastTimestamp:2026-03-19 11:52:05.168238626 +0000 UTC m=+10.720659231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.168739 master-0 kubenswrapper[4036]: E0319 11:52:08.168554 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bdd5d969c06 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.174516742 +0000 UTC m=+10.726937327,LastTimestamp:2026-03-19 11:52:05.174516742 +0000 UTC m=+10.726937327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.174231 master-0 kubenswrapper[4036]: E0319 11:52:08.174020 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bdd5fe75148 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.213360456 +0000 UTC m=+10.765781041,LastTimestamp:2026-03-19 11:52:05.213360456 +0000 UTC m=+10.765781041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.178705 master-0 kubenswrapper[4036]: E0319 11:52:08.178573 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bdd607ba19c kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.223080348 +0000 UTC m=+10.775500933,LastTimestamp:2026-03-19 11:52:05.223080348 +0000 UTC m=+10.775500933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.183507 master-0 kubenswrapper[4036]: E0319 11:52:08.183357 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bdd608c6bf8 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.224180728 +0000 UTC m=+10.776601313,LastTimestamp:2026-03-19 11:52:05.224180728 +0000 UTC m=+10.776601313,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.187503 master-0 kubenswrapper[4036]: E0319 11:52:08.187400 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bdd68f9a5ba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.365556666 +0000 UTC m=+10.917977251,LastTimestamp:2026-03-19 11:52:05.365556666 +0000 UTC m=+10.917977251,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.192596 master-0 kubenswrapper[4036]: E0319 11:52:08.191811 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bdd73424713 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.538088723 +0000 UTC m=+11.090509308,LastTimestamp:2026-03-19 11:52:05.538088723 +0000 UTC m=+11.090509308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.196360 master-0 kubenswrapper[4036]: E0319 11:52:08.196136 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bdd73e9c417 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.549065239 +0000 UTC m=+11.101485824,LastTimestamp:2026-03-19 11:52:05.549065239 +0000 UTC m=+11.101485824,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:08.199656 master-0 kubenswrapper[4036]: E0319 11:52:08.199521 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bdd74019fd7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] []
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:05.550628823 +0000 UTC m=+11.103049418,LastTimestamp:2026-03-19 11:52:05.550628823 +0000 UTC m=+11.103049418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:09.011834 master-0 kubenswrapper[4036]: E0319 11:52:09.011297 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bde41f21fa2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" in 3.455s (3.455s including waiting). 
Image size: 514984269 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:09.005719458 +0000 UTC m=+14.558140043,LastTimestamp:2026-03-19 11:52:09.005719458 +0000 UTC m=+14.558140043,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:09.031089 master-0 kubenswrapper[4036]: E0319 11:52:09.030972 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bde43212778 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\" in 3.801s (3.801s including waiting). 
Image size: 505246690 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:09.025578872 +0000 UTC m=+14.577999457,LastTimestamp:2026-03-19 11:52:09.025578872 +0000 UTC m=+14.577999457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:09.149796 master-0 kubenswrapper[4036]: I0319 11:52:09.149742 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:52:09.182925 master-0 kubenswrapper[4036]: E0319 11:52:09.182772 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bde4c319f7e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:09.177653118 +0000 UTC m=+14.730073703,LastTimestamp:2026-03-19 11:52:09.177653118 +0000 UTC m=+14.730073703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:09.190325 master-0 kubenswrapper[4036]: E0319 11:52:09.190198 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bde4ca98079 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:09.185509497 +0000 UTC m=+14.737930082,LastTimestamp:2026-03-19 11:52:09.185509497 +0000 UTC m=+14.737930082,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:09.195350 master-0 kubenswrapper[4036]: E0319 11:52:09.195154 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189e3bde4cb18cc4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:49fac1b46a11e49501805e891baae4a9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:09.186036932 +0000 UTC m=+14.738457517,LastTimestamp:2026-03-19 11:52:09.186036932 +0000 UTC m=+14.738457517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:09.202489 master-0 
kubenswrapper[4036]: E0319 11:52:09.202311 4036 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189e3bde4d5dfc43 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:46f265536aba6292ead501bc9b49f327,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:09.197337667 +0000 UTC m=+14.749758252,LastTimestamp:2026-03-19 11:52:09.197337667 +0000 UTC m=+14.749758252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:09.385502 master-0 kubenswrapper[4036]: I0319 11:52:09.385439 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:09.385502 master-0 kubenswrapper[4036]: I0319 11:52:09.385419 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"1dadbe52e08f9953877bf3adbfada05dc57d5eba9ca59d9f0aac897d40194638"} Mar 19 11:52:09.386331 master-0 kubenswrapper[4036]: I0319 11:52:09.386287 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:09.386331 master-0 kubenswrapper[4036]: I0319 11:52:09.386324 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:09.386331 master-0 kubenswrapper[4036]: I0319 
11:52:09.386333 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:09.387826 master-0 kubenswrapper[4036]: I0319 11:52:09.387794 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885"} Mar 19 11:52:09.387890 master-0 kubenswrapper[4036]: I0319 11:52:09.387851 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:09.388613 master-0 kubenswrapper[4036]: I0319 11:52:09.388577 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:09.388680 master-0 kubenswrapper[4036]: I0319 11:52:09.388664 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:09.388725 master-0 kubenswrapper[4036]: I0319 11:52:09.388691 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:09.496209 master-0 kubenswrapper[4036]: I0319 11:52:09.495988 4036 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 11:52:09.522839 master-0 kubenswrapper[4036]: I0319 11:52:09.522734 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:52:09.523099 master-0 kubenswrapper[4036]: I0319 11:52:09.522894 4036 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 11:52:10.150770 master-0 kubenswrapper[4036]: I0319 11:52:10.150726 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:52:10.391187 master-0 kubenswrapper[4036]: I0319 11:52:10.391117 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:10.391187 master-0 kubenswrapper[4036]: I0319 11:52:10.391179 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:10.392182 master-0 kubenswrapper[4036]: I0319 11:52:10.392147 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:10.392260 master-0 kubenswrapper[4036]: I0319 11:52:10.392193 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:10.392260 master-0 kubenswrapper[4036]: I0319 11:52:10.392207 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:10.392260 master-0 kubenswrapper[4036]: I0319 11:52:10.392166 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:10.392371 master-0 kubenswrapper[4036]: I0319 11:52:10.392272 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:10.392371 master-0 kubenswrapper[4036]: I0319 11:52:10.392286 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:10.565423 master-0 kubenswrapper[4036]: W0319 11:52:10.565282 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 19 11:52:10.565423 master-0 kubenswrapper[4036]: 
E0319 11:52:10.565362 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 11:52:11.150046 master-0 kubenswrapper[4036]: I0319 11:52:11.149963 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:52:11.392939 master-0 kubenswrapper[4036]: I0319 11:52:11.392877 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:11.393861 master-0 kubenswrapper[4036]: I0319 11:52:11.393741 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:11.393861 master-0 kubenswrapper[4036]: I0319 11:52:11.393786 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:11.393861 master-0 kubenswrapper[4036]: I0319 11:52:11.393807 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:12.151236 master-0 kubenswrapper[4036]: I0319 11:52:12.151152 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:52:12.454734 master-0 kubenswrapper[4036]: I0319 11:52:12.454553 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:12.455308 master-0 
kubenswrapper[4036]: I0319 11:52:12.454853 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:12.456450 master-0 kubenswrapper[4036]: I0319 11:52:12.456383 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:12.456525 master-0 kubenswrapper[4036]: I0319 11:52:12.456467 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:12.456525 master-0 kubenswrapper[4036]: I0319 11:52:12.456488 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:13.150064 master-0 kubenswrapper[4036]: I0319 11:52:13.150001 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:52:13.312238 master-0 kubenswrapper[4036]: I0319 11:52:13.311579 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:13.313537 master-0 kubenswrapper[4036]: I0319 11:52:13.313469 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:13.313537 master-0 kubenswrapper[4036]: I0319 11:52:13.313544 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:13.313859 master-0 kubenswrapper[4036]: I0319 11:52:13.313568 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:13.314298 master-0 kubenswrapper[4036]: I0319 11:52:13.314259 4036 scope.go:117] "RemoveContainer" containerID="5f9bd5f4bb7946a7a98d266917b6484351aca514282a3c8fad2468816b1a00c3" Mar 19 11:52:13.324126 master-0 
kubenswrapper[4036]: E0319 11:52:13.323956 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bdc018184f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc018184f7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.334663415 +0000 UTC m=+4.887084000,LastTimestamp:2026-03-19 11:52:13.317731778 +0000 UTC m=+18.870152393,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:13.543851 master-0 kubenswrapper[4036]: E0319 11:52:13.543691 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bdc0d0764e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc0d0764e3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.527986403 +0000 UTC m=+5.080406988,LastTimestamp:2026-03-19 11:52:13.538155232 +0000 UTC m=+19.090575817,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:13.560721 master-0 kubenswrapper[4036]: E0319 11:52:13.560572 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bdc0dab5bbb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc0dab5bbb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:51:59.538731963 +0000 UTC m=+5.091152548,LastTimestamp:2026-03-19 11:52:13.551920696 +0000 UTC m=+19.104341281,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:52:13.589555 master-0 kubenswrapper[4036]: W0319 11:52:13.589505 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User 
"system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 19 11:52:13.589668 master-0 kubenswrapper[4036]: E0319 11:52:13.589567 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 11:52:13.878541 master-0 kubenswrapper[4036]: I0319 11:52:13.878474 4036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:13.879033 master-0 kubenswrapper[4036]: I0319 11:52:13.878659 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:13.881303 master-0 kubenswrapper[4036]: I0319 11:52:13.880505 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 11:52:13.881303 master-0 kubenswrapper[4036]: I0319 11:52:13.880550 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 11:52:13.881303 master-0 kubenswrapper[4036]: I0319 11:52:13.880561 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 11:52:13.883258 master-0 kubenswrapper[4036]: I0319 11:52:13.883233 4036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:52:14.022327 master-0 kubenswrapper[4036]: W0319 11:52:14.022259 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 19 11:52:14.022327 master-0 
kubenswrapper[4036]: E0319 11:52:14.022324 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 11:52:14.149972 master-0 kubenswrapper[4036]: I0319 11:52:14.149922 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 11:52:14.403873 master-0 kubenswrapper[4036]: I0319 11:52:14.403763 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 11:52:14.404385 master-0 kubenswrapper[4036]: I0319 11:52:14.404307 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/1.log" Mar 19 11:52:14.406087 master-0 kubenswrapper[4036]: I0319 11:52:14.404618 4036 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="1596d61ab954fe05c17045a5f0656bd06f879c1529f23b7eb004e3ce610f3ceb" exitCode=1 Mar 19 11:52:14.406087 master-0 kubenswrapper[4036]: I0319 11:52:14.404768 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 11:52:14.406087 master-0 kubenswrapper[4036]: I0319 11:52:14.405526 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"1596d61ab954fe05c17045a5f0656bd06f879c1529f23b7eb004e3ce610f3ceb"}
Mar 19 11:52:14.406087 master-0 kubenswrapper[4036]: I0319 11:52:14.405569 4036 scope.go:117] "RemoveContainer" containerID="5f9bd5f4bb7946a7a98d266917b6484351aca514282a3c8fad2468816b1a00c3"
Mar 19 11:52:14.406087 master-0 kubenswrapper[4036]: I0319 11:52:14.405693 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:14.407579 master-0 kubenswrapper[4036]: I0319 11:52:14.406751 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:14.407579 master-0 kubenswrapper[4036]: I0319 11:52:14.406777 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:14.407579 master-0 kubenswrapper[4036]: I0319 11:52:14.406789 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:14.407579 master-0 kubenswrapper[4036]: I0319 11:52:14.407063 4036 scope.go:117] "RemoveContainer" containerID="1596d61ab954fe05c17045a5f0656bd06f879c1529f23b7eb004e3ce610f3ceb"
Mar 19 11:52:14.407579 master-0 kubenswrapper[4036]: I0319 11:52:14.407196 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:14.407579 master-0 kubenswrapper[4036]: I0319 11:52:14.407209 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:14.407579 master-0 kubenswrapper[4036]: I0319 11:52:14.407219 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:14.408612 master-0 kubenswrapper[4036]: E0319 11:52:14.408461 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19"
Mar 19 11:52:14.410584 master-0 kubenswrapper[4036]: I0319 11:52:14.409975 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 19 11:52:14.414334 master-0 kubenswrapper[4036]: E0319 11:52:14.414085 4036 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189e3bdc7982e200\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189e3bdc7982e200 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:1249822f86f23526277d165c0d5d3c19,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:52:01.348018688 +0000 UTC m=+6.900439273,LastTimestamp:2026-03-19 11:52:14.40842801 +0000 UTC m=+19.960848595,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:52:14.737182 master-0 kubenswrapper[4036]: I0319 11:52:14.737045 4036 csr.go:261] certificate signing request csr-l9xn7 is approved, waiting to be issued
Mar 19 11:52:14.777668 master-0 kubenswrapper[4036]: E0319 11:52:14.777520 4036 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 19 11:52:15.051023 master-0 kubenswrapper[4036]: I0319 11:52:15.050800 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:15.052490 master-0 kubenswrapper[4036]: I0319 11:52:15.052239 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:15.052490 master-0 kubenswrapper[4036]: I0319 11:52:15.052293 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:15.052490 master-0 kubenswrapper[4036]: I0319 11:52:15.052312 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:15.052490 master-0 kubenswrapper[4036]: I0319 11:52:15.052391 4036 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:52:15.058211 master-0 kubenswrapper[4036]: E0319 11:52:15.058113 4036 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 19 11:52:15.149425 master-0 kubenswrapper[4036]: I0319 11:52:15.149381 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:52:15.298432 master-0 kubenswrapper[4036]: E0319 11:52:15.298392 4036 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 11:52:15.409915 master-0 kubenswrapper[4036]: I0319 11:52:15.409838 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log"
Mar 19 11:52:15.410529 master-0 kubenswrapper[4036]: I0319 11:52:15.410488 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:15.411350 master-0 kubenswrapper[4036]: I0319 11:52:15.411310 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:15.411410 master-0 kubenswrapper[4036]: I0319 11:52:15.411366 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:15.411410 master-0 kubenswrapper[4036]: I0319 11:52:15.411385 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:15.851507 master-0 kubenswrapper[4036]: I0319 11:52:15.851307 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:52:15.852076 master-0 kubenswrapper[4036]: I0319 11:52:15.851609 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:15.853271 master-0 kubenswrapper[4036]: I0319 11:52:15.853215 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:15.853271 master-0 kubenswrapper[4036]: I0319 11:52:15.853271 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:15.853376 master-0 kubenswrapper[4036]: I0319 11:52:15.853281 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:16.150419 master-0 kubenswrapper[4036]: I0319 11:52:16.150363 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:52:16.242909 master-0 kubenswrapper[4036]: W0319 11:52:16.242818 4036 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 19 11:52:16.242909 master-0 kubenswrapper[4036]: E0319 11:52:16.242894 4036 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 19 11:52:17.076089 master-0 kubenswrapper[4036]: I0319 11:52:17.076007 4036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:52:17.076788 master-0 kubenswrapper[4036]: I0319 11:52:17.076290 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:17.078009 master-0 kubenswrapper[4036]: I0319 11:52:17.077943 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:17.078092 master-0 kubenswrapper[4036]: I0319 11:52:17.078022 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:17.078092 master-0 kubenswrapper[4036]: I0319 11:52:17.078037 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:17.081136 master-0 kubenswrapper[4036]: I0319 11:52:17.081081 4036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:52:17.151746 master-0 kubenswrapper[4036]: I0319 11:52:17.151676 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:52:17.244947 master-0 kubenswrapper[4036]: I0319 11:52:17.244856 4036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:52:17.249266 master-0 kubenswrapper[4036]: I0319 11:52:17.249204 4036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:52:17.416119 master-0 kubenswrapper[4036]: I0319 11:52:17.416017 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:17.418036 master-0 kubenswrapper[4036]: I0319 11:52:17.418006 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:17.418188 master-0 kubenswrapper[4036]: I0319 11:52:17.418172 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:17.418296 master-0 kubenswrapper[4036]: I0319 11:52:17.418276 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:17.421247 master-0 kubenswrapper[4036]: I0319 11:52:17.421231 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:52:18.150363 master-0 kubenswrapper[4036]: I0319 11:52:18.150294 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:52:18.418925 master-0 kubenswrapper[4036]: I0319 11:52:18.418762 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:18.420048 master-0 kubenswrapper[4036]: I0319 11:52:18.419947 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:18.420048 master-0 kubenswrapper[4036]: I0319 11:52:18.420013 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:18.420048 master-0 kubenswrapper[4036]: I0319 11:52:18.420027 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:19.150053 master-0 kubenswrapper[4036]: I0319 11:52:19.150011 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:52:19.421398 master-0 kubenswrapper[4036]: I0319 11:52:19.421277 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:19.423103 master-0 kubenswrapper[4036]: I0319 11:52:19.423062 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:19.423103 master-0 kubenswrapper[4036]: I0319 11:52:19.423098 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:19.423103 master-0 kubenswrapper[4036]: I0319 11:52:19.423108 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:19.426113 master-0 kubenswrapper[4036]: I0319 11:52:19.426027 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:52:20.150096 master-0 kubenswrapper[4036]: I0319 11:52:20.149977 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:52:20.424126 master-0 kubenswrapper[4036]: I0319 11:52:20.423939 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:20.425006 master-0 kubenswrapper[4036]: I0319 11:52:20.424957 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:20.425055 master-0 kubenswrapper[4036]: I0319 11:52:20.425024 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:20.425055 master-0 kubenswrapper[4036]: I0319 11:52:20.425045 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:21.150248 master-0 kubenswrapper[4036]: I0319 11:52:21.150155 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:52:21.784841 master-0 kubenswrapper[4036]: E0319 11:52:21.784732 4036 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 19 11:52:22.058981 master-0 kubenswrapper[4036]: I0319 11:52:22.058768 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:22.060221 master-0 kubenswrapper[4036]: I0319 11:52:22.060161 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:22.060221 master-0 kubenswrapper[4036]: I0319 11:52:22.060204 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:22.060221 master-0 kubenswrapper[4036]: I0319 11:52:22.060214 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:22.060551 master-0 kubenswrapper[4036]: I0319 11:52:22.060266 4036 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:52:22.065454 master-0 kubenswrapper[4036]: E0319 11:52:22.065404 4036 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 19 11:52:22.150768 master-0 kubenswrapper[4036]: I0319 11:52:22.150680 4036 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 11:52:22.396263 master-0 kubenswrapper[4036]: I0319 11:52:22.396220 4036 csr.go:257] certificate signing request csr-l9xn7 is issued
Mar 19 11:52:23.016082 master-0 kubenswrapper[4036]: I0319 11:52:23.016013 4036 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 19 11:52:23.179306 master-0 kubenswrapper[4036]: I0319 11:52:23.179237 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:23.195089 master-0 kubenswrapper[4036]: I0319 11:52:23.195033 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:23.264057 master-0 kubenswrapper[4036]: I0319 11:52:23.263956 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:23.398301 master-0 kubenswrapper[4036]: I0319 11:52:23.398175 4036 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 11:43:57 +0000 UTC, rotation deadline is 2026-03-20 08:54:46.294018881 +0000 UTC
Mar 19 11:52:23.398301 master-0 kubenswrapper[4036]: I0319 11:52:23.398285 4036 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 21h2m22.895739439s for next certificate rotation
Mar 19 11:52:23.770419 master-0 kubenswrapper[4036]: I0319 11:52:23.770297 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:23.770419 master-0 kubenswrapper[4036]: E0319 11:52:23.770342 4036 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 19 11:52:23.800757 master-0 kubenswrapper[4036]: I0319 11:52:23.800693 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:23.815330 master-0 kubenswrapper[4036]: I0319 11:52:23.815273 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:23.881413 master-0 kubenswrapper[4036]: I0319 11:52:23.881353 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:24.152507 master-0 kubenswrapper[4036]: I0319 11:52:24.152434 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:24.152507 master-0 kubenswrapper[4036]: E0319 11:52:24.152486 4036 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 19 11:52:24.254859 master-0 kubenswrapper[4036]: I0319 11:52:24.254775 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:24.270193 master-0 kubenswrapper[4036]: I0319 11:52:24.270147 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:24.325340 master-0 kubenswrapper[4036]: I0319 11:52:24.325270 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:24.583605 master-0 kubenswrapper[4036]: I0319 11:52:24.583475 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:24.583605 master-0 kubenswrapper[4036]: E0319 11:52:24.583514 4036 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 19 11:52:25.170582 master-0 kubenswrapper[4036]: I0319 11:52:25.170518 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:25.186113 master-0 kubenswrapper[4036]: I0319 11:52:25.186091 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:25.253086 master-0 kubenswrapper[4036]: I0319 11:52:25.253007 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:25.298921 master-0 kubenswrapper[4036]: E0319 11:52:25.298877 4036 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 19 11:52:25.312538 master-0 kubenswrapper[4036]: I0319 11:52:25.312469 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:25.314149 master-0 kubenswrapper[4036]: I0319 11:52:25.314125 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:25.314215 master-0 kubenswrapper[4036]: I0319 11:52:25.314162 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:25.314215 master-0 kubenswrapper[4036]: I0319 11:52:25.314174 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:25.314547 master-0 kubenswrapper[4036]: I0319 11:52:25.314519 4036 scope.go:117] "RemoveContainer" containerID="1596d61ab954fe05c17045a5f0656bd06f879c1529f23b7eb004e3ce610f3ceb"
Mar 19 11:52:25.314720 master-0 kubenswrapper[4036]: E0319 11:52:25.314695 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(1249822f86f23526277d165c0d5d3c19)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="1249822f86f23526277d165c0d5d3c19"
Mar 19 11:52:25.510131 master-0 kubenswrapper[4036]: I0319 11:52:25.510007 4036 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 19 11:52:25.510131 master-0 kubenswrapper[4036]: E0319 11:52:25.510048 4036 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 19 11:52:28.790322 master-0 kubenswrapper[4036]: E0319 11:52:28.790238 4036 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0"
Mar 19 11:52:29.066201 master-0 kubenswrapper[4036]: I0319 11:52:29.066054 4036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:52:29.067412 master-0 kubenswrapper[4036]: I0319 11:52:29.067373 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:52:29.067412 master-0 kubenswrapper[4036]: I0319 11:52:29.067413 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:52:29.067571 master-0 kubenswrapper[4036]: I0319 11:52:29.067425 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:52:29.067571 master-0 kubenswrapper[4036]: I0319 11:52:29.067473 4036 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:52:29.073727 master-0 kubenswrapper[4036]: I0319 11:52:29.073690 4036 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 19 11:52:29.073944 master-0 kubenswrapper[4036]: E0319 11:52:29.073928 4036 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
Mar 19 11:52:29.086276 master-0 kubenswrapper[4036]: E0319 11:52:29.086211 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:29.172829 master-0 kubenswrapper[4036]: I0319 11:52:29.172771 4036 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 19 11:52:29.183189 master-0 kubenswrapper[4036]: I0319 11:52:29.183159 4036 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 19 11:52:29.187291 master-0 kubenswrapper[4036]: E0319 11:52:29.187247 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:29.288114 master-0 kubenswrapper[4036]: E0319 11:52:29.288001 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:29.388501 master-0 kubenswrapper[4036]: E0319 11:52:29.388333 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:29.489407 master-0 kubenswrapper[4036]: E0319 11:52:29.489333 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:29.589534 master-0 kubenswrapper[4036]: E0319 11:52:29.589456 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:29.690604 master-0 kubenswrapper[4036]: E0319 11:52:29.690511 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:29.791053 master-0 kubenswrapper[4036]: E0319 11:52:29.790927 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:29.892001 master-0 kubenswrapper[4036]: E0319 11:52:29.891918 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:29.993156 master-0 kubenswrapper[4036]: E0319 11:52:29.992938 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:30.093394 master-0 kubenswrapper[4036]: E0319 11:52:30.093305 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:30.193881 master-0 kubenswrapper[4036]: E0319 11:52:30.193813 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:30.294846 master-0 kubenswrapper[4036]: E0319 11:52:30.294600 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:30.395706 master-0 kubenswrapper[4036]: E0319 11:52:30.395587 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:30.496249 master-0 kubenswrapper[4036]: E0319 11:52:30.496166 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:30.597526 master-0 kubenswrapper[4036]: E0319 11:52:30.597356 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:30.698676 master-0 kubenswrapper[4036]: E0319 11:52:30.698549 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:30.799559 master-0 kubenswrapper[4036]: E0319 11:52:30.799450 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:30.900483 master-0 kubenswrapper[4036]: E0319 11:52:30.900384 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:31.001524 master-0 kubenswrapper[4036]: E0319 11:52:31.001441 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:31.090321 master-0 kubenswrapper[4036]: I0319 11:52:31.090237 4036 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 11:52:31.101968 master-0 kubenswrapper[4036]: E0319 11:52:31.101895 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:31.202488 master-0 kubenswrapper[4036]: E0319 11:52:31.202312 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:31.303620 master-0 kubenswrapper[4036]: E0319 11:52:31.303520 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:31.404657 master-0 kubenswrapper[4036]: E0319 11:52:31.404538 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:31.505480 master-0 kubenswrapper[4036]: E0319 11:52:31.505317 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:31.605811 master-0 kubenswrapper[4036]: E0319 11:52:31.605677 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:31.706919 master-0 kubenswrapper[4036]: E0319 11:52:31.706809 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:31.807860 master-0 kubenswrapper[4036]: E0319 11:52:31.807601 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:31.908818 master-0 kubenswrapper[4036]: E0319 11:52:31.908619 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:32.009670 master-0 kubenswrapper[4036]: E0319 11:52:32.009575 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:32.111110 master-0 kubenswrapper[4036]: E0319 11:52:32.110918 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:32.212080 master-0 kubenswrapper[4036]: E0319 11:52:32.211976 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:32.313053 master-0 kubenswrapper[4036]: E0319 11:52:32.312954 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:32.414280 master-0 kubenswrapper[4036]: E0319 11:52:32.414139 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:32.515187 master-0 kubenswrapper[4036]: E0319 11:52:32.515055 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:32.616105 master-0 kubenswrapper[4036]: E0319 11:52:32.616003 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:32.716492 master-0 kubenswrapper[4036]: E0319 11:52:32.716295 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:32.751936 master-0 kubenswrapper[4036]: I0319 11:52:32.751848 4036 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 11:52:32.817284 master-0 kubenswrapper[4036]: E0319 11:52:32.817171 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:32.918162 master-0 kubenswrapper[4036]: E0319 11:52:32.918081 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:33.018997 master-0 kubenswrapper[4036]: E0319 11:52:33.018818 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:33.120029 master-0 kubenswrapper[4036]: E0319 11:52:33.119939 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:33.221246 master-0 kubenswrapper[4036]: E0319 11:52:33.221136 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:33.321444 master-0 kubenswrapper[4036]: E0319 11:52:33.321249 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:33.422168 master-0 kubenswrapper[4036]: E0319 11:52:33.422082 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:33.522961 master-0 kubenswrapper[4036]: E0319 11:52:33.522709 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:33.623537 master-0 kubenswrapper[4036]: E0319 11:52:33.623394 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:33.724496 master-0 kubenswrapper[4036]: E0319 11:52:33.724409 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:33.825207 master-0 kubenswrapper[4036]: E0319 11:52:33.825117 4036 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 19 11:52:33.875853 master-0 kubenswrapper[4036]: I0319 11:52:33.875452 4036 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 11:52:34.210803 master-0 kubenswrapper[4036]: I0319 11:52:34.210701 4036 apiserver.go:52] "Watching apiserver"
Mar 19 11:52:34.361413 master-0 kubenswrapper[4036]: I0319 11:52:34.361305 4036 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 11:52:34.361735 master-0 kubenswrapper[4036]: I0319 11:52:34.361584 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc","openshift-network-operator/network-operator-7bd846bfc4-5qtx7","assisted-installer/assisted-installer-controller-r6tsg"]
Mar 19 11:52:34.362221 master-0 kubenswrapper[4036]: I0319 11:52:34.362176 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7"
Mar 19 11:52:34.362300 master-0 kubenswrapper[4036]: I0319 11:52:34.362270 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:52:34.362777 master-0 kubenswrapper[4036]: I0319 11:52:34.362315 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6tsg"
Mar 19 11:52:34.368466 master-0 kubenswrapper[4036]: I0319 11:52:34.368391 4036 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret"
Mar 19 11:52:34.368718 master-0 kubenswrapper[4036]: I0319 11:52:34.368495 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 11:52:34.368888 master-0 kubenswrapper[4036]: I0319 11:52:34.368855 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt"
Mar 19 11:52:34.369285 master-0 kubenswrapper[4036]: I0319 11:52:34.369233 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config"
Mar 19 11:52:34.369394 master-0 kubenswrapper[4036]: I0319 11:52:34.369347 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt"
Mar 19 11:52:34.369892 master-0 kubenswrapper[4036]: I0319 11:52:34.369792 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 11:52:34.369987 master-0 kubenswrapper[4036]: I0319 11:52:34.369951 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 11:52:34.370075 master-0 kubenswrapper[4036]: I0319 11:52:34.370060 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 11:52:34.370315 master-0 kubenswrapper[4036]: I0319 11:52:34.370287 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 11:52:34.372582 master-0 kubenswrapper[4036]: I0319 11:52:34.372554 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 11:52:34.456112 master-0 kubenswrapper[4036]: I0319 11:52:34.455788 4036 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 19 11:52:34.539983 master-0 kubenswrapper[4036]: I0319 11:52:34.539767 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-sno-bootstrap-files\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg"
Mar 19 11:52:34.539983 master-0 kubenswrapper[4036]: I0319 11:52:34.539849 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:52:34.539983 master-0 kubenswrapper[4036]: I0319 11:52:34.539888 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-var-run-resolv-conf\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg"
Mar 19 11:52:34.539983 master-0 kubenswrapper[4036]: I0319 11:52:34.539922 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-resolv-conf\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.540353 master-0 kubenswrapper[4036]: I0319 11:52:34.540059 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4ms4\" (UniqueName: \"kubernetes.io/projected/b0133cb5-e425-40af-a051-71b2362a89c3-kube-api-access-w4ms4\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:52:34.540353 master-0 kubenswrapper[4036]: I0319 11:52:34.540125 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzfwq\" (UniqueName: \"kubernetes.io/projected/2452a909-6aa1-4750-9399-5945d380427a-kube-api-access-dzfwq\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.540353 master-0 kubenswrapper[4036]: I0319 11:52:34.540169 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.540353 master-0 kubenswrapper[4036]: I0319 11:52:34.540218 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b0133cb5-e425-40af-a051-71b2362a89c3-host-etc-kube\") pod \"network-operator-7bd846bfc4-5qtx7\" 
(UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:52:34.540353 master-0 kubenswrapper[4036]: I0319 11:52:34.540246 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-ca-bundle\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.540353 master-0 kubenswrapper[4036]: I0319 11:52:34.540270 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.540353 master-0 kubenswrapper[4036]: I0319 11:52:34.540320 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f846d9-1b3d-4b9d-b3e0-90a238e02640-kube-api-access\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.540621 master-0 kubenswrapper[4036]: I0319 11:52:34.540378 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0133cb5-e425-40af-a051-71b2362a89c3-metrics-tls\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:52:34.540621 master-0 kubenswrapper[4036]: I0319 11:52:34.540410 
4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60f846d9-1b3d-4b9d-b3e0-90a238e02640-service-ca\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.641585 master-0 kubenswrapper[4036]: I0319 11:52:34.641483 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4ms4\" (UniqueName: \"kubernetes.io/projected/b0133cb5-e425-40af-a051-71b2362a89c3-kube-api-access-w4ms4\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:52:34.641585 master-0 kubenswrapper[4036]: I0319 11:52:34.641569 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzfwq\" (UniqueName: \"kubernetes.io/projected/2452a909-6aa1-4750-9399-5945d380427a-kube-api-access-dzfwq\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.641585 master-0 kubenswrapper[4036]: I0319 11:52:34.641594 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b0133cb5-e425-40af-a051-71b2362a89c3-host-etc-kube\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:52:34.642101 master-0 kubenswrapper[4036]: I0319 11:52:34.641918 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-ca-bundle\") pod \"assisted-installer-controller-r6tsg\" 
(UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.642152 master-0 kubenswrapper[4036]: I0319 11:52:34.642111 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b0133cb5-e425-40af-a051-71b2362a89c3-host-etc-kube\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:52:34.642194 master-0 kubenswrapper[4036]: I0319 11:52:34.642113 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-ca-bundle\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.642236 master-0 kubenswrapper[4036]: I0319 11:52:34.642164 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.642284 master-0 kubenswrapper[4036]: I0319 11:52:34.642256 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.642322 master-0 kubenswrapper[4036]: I0319 11:52:34.642300 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f846d9-1b3d-4b9d-b3e0-90a238e02640-kube-api-access\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.642394 master-0 kubenswrapper[4036]: I0319 11:52:34.642351 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0133cb5-e425-40af-a051-71b2362a89c3-metrics-tls\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:52:34.642394 master-0 kubenswrapper[4036]: I0319 11:52:34.642324 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.642470 master-0 kubenswrapper[4036]: I0319 11:52:34.642406 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60f846d9-1b3d-4b9d-b3e0-90a238e02640-service-ca\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.642470 master-0 kubenswrapper[4036]: I0319 11:52:34.642448 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " 
pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.642536 master-0 kubenswrapper[4036]: I0319 11:52:34.642481 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-var-run-resolv-conf\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.642536 master-0 kubenswrapper[4036]: I0319 11:52:34.642509 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-resolv-conf\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.642536 master-0 kubenswrapper[4036]: I0319 11:52:34.642532 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-sno-bootstrap-files\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.642664 master-0 kubenswrapper[4036]: I0319 11:52:34.642594 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-sno-bootstrap-files\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.642868 master-0 kubenswrapper[4036]: E0319 11:52:34.642820 4036 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret 
"cluster-version-operator-serving-cert" not found Mar 19 11:52:34.642922 master-0 kubenswrapper[4036]: I0319 11:52:34.642872 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.642961 master-0 kubenswrapper[4036]: E0319 11:52:34.642926 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:35.142897682 +0000 UTC m=+40.695318267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:34.642961 master-0 kubenswrapper[4036]: I0319 11:52:34.642933 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-var-run-resolv-conf\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.643053 master-0 kubenswrapper[4036]: I0319 11:52:34.642976 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-resolv-conf\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " 
pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.643836 master-0 kubenswrapper[4036]: I0319 11:52:34.643800 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60f846d9-1b3d-4b9d-b3e0-90a238e02640-service-ca\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.644231 master-0 kubenswrapper[4036]: I0319 11:52:34.644159 4036 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 11:52:34.650550 master-0 kubenswrapper[4036]: I0319 11:52:34.650083 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0133cb5-e425-40af-a051-71b2362a89c3-metrics-tls\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:52:34.720204 master-0 kubenswrapper[4036]: I0319 11:52:34.719894 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f846d9-1b3d-4b9d-b3e0-90a238e02640-kube-api-access\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:34.720204 master-0 kubenswrapper[4036]: I0319 11:52:34.720159 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzfwq\" (UniqueName: \"kubernetes.io/projected/2452a909-6aa1-4750-9399-5945d380427a-kube-api-access-dzfwq\") pod \"assisted-installer-controller-r6tsg\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " 
pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:34.720530 master-0 kubenswrapper[4036]: I0319 11:52:34.720336 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4ms4\" (UniqueName: \"kubernetes.io/projected/b0133cb5-e425-40af-a051-71b2362a89c3-kube-api-access-w4ms4\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:52:34.980171 master-0 kubenswrapper[4036]: I0319 11:52:34.980105 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:52:35.001654 master-0 kubenswrapper[4036]: I0319 11:52:35.001550 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:35.014483 master-0 kubenswrapper[4036]: W0319 11:52:35.014416 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2452a909_6aa1_4750_9399_5945d380427a.slice/crio-cefa044e9278fa0f7e0364a8c119ab9567645a4d070ede851a0064c55299d2ff WatchSource:0}: Error finding container cefa044e9278fa0f7e0364a8c119ab9567645a4d070ede851a0064c55299d2ff: Status 404 returned error can't find the container with id cefa044e9278fa0f7e0364a8c119ab9567645a4d070ede851a0064c55299d2ff Mar 19 11:52:35.146969 master-0 kubenswrapper[4036]: I0319 11:52:35.146600 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:35.146969 master-0 kubenswrapper[4036]: E0319 11:52:35.146817 4036 
secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:35.146969 master-0 kubenswrapper[4036]: E0319 11:52:35.146887 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:36.146865862 +0000 UTC m=+41.699286447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:35.459922 master-0 kubenswrapper[4036]: I0319 11:52:35.459829 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-r6tsg" event={"ID":"2452a909-6aa1-4750-9399-5945d380427a","Type":"ContainerStarted","Data":"cefa044e9278fa0f7e0364a8c119ab9567645a4d070ede851a0064c55299d2ff"} Mar 19 11:52:35.460888 master-0 kubenswrapper[4036]: I0319 11:52:35.460802 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerStarted","Data":"166b56b9c6c1434919d567d8df963a4c549223c203f79070beac456f5b57a166"} Mar 19 11:52:35.589931 master-0 kubenswrapper[4036]: I0319 11:52:35.589707 4036 csr.go:261] certificate signing request csr-2r4dp is approved, waiting to be issued Mar 19 11:52:35.612336 master-0 kubenswrapper[4036]: I0319 11:52:35.612227 4036 csr.go:257] certificate signing request csr-2r4dp is issued Mar 19 11:52:36.153455 master-0 kubenswrapper[4036]: I0319 11:52:36.153364 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:36.154485 master-0 kubenswrapper[4036]: E0319 11:52:36.153517 4036 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:36.154485 master-0 kubenswrapper[4036]: E0319 11:52:36.153577 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:38.153561309 +0000 UTC m=+43.705981894 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:36.613618 master-0 kubenswrapper[4036]: I0319 11:52:36.613416 4036 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 11:43:57 +0000 UTC, rotation deadline is 2026-03-20 07:22:09.941546829 +0000 UTC Mar 19 11:52:36.613618 master-0 kubenswrapper[4036]: I0319 11:52:36.613480 4036 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 19h29m33.328070289s for next certificate rotation Mar 19 11:52:37.614253 master-0 kubenswrapper[4036]: I0319 11:52:37.614183 4036 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 11:43:57 +0000 UTC, rotation deadline is 2026-03-20 05:24:25.309331326 +0000 UTC Mar 19 11:52:37.614253 master-0 kubenswrapper[4036]: I0319 11:52:37.614225 4036 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h31m47.695109716s for next certificate rotation Mar 19 11:52:37.827067 master-0 kubenswrapper[4036]: I0319 11:52:37.827007 4036 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 11:52:38.170085 master-0 kubenswrapper[4036]: I0319 11:52:38.170026 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:38.171046 master-0 kubenswrapper[4036]: E0319 11:52:38.170197 4036 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:38.171046 master-0 kubenswrapper[4036]: E0319 11:52:38.170274 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:42.170254704 +0000 UTC m=+47.722675289 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:40.480439 master-0 kubenswrapper[4036]: I0319 11:52:40.480367 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerStarted","Data":"daecc00cd0269edf3ea7180e1cd52281d3c5b230d8b9db4f7b6d733857fc8257"} Mar 19 11:52:40.483280 master-0 kubenswrapper[4036]: I0319 11:52:40.483138 4036 generic.go:334] "Generic (PLEG): container finished" podID="2452a909-6aa1-4750-9399-5945d380427a" containerID="ec93bf01fde32bcc52dde2502ea534c23c521da7002fa0839646b010fe2c3b0c" exitCode=0 Mar 19 11:52:40.483280 master-0 kubenswrapper[4036]: I0319 11:52:40.483192 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-r6tsg" event={"ID":"2452a909-6aa1-4750-9399-5945d380427a","Type":"ContainerDied","Data":"ec93bf01fde32bcc52dde2502ea534c23c521da7002fa0839646b010fe2c3b0c"} Mar 19 11:52:41.155154 master-0 kubenswrapper[4036]: I0319 11:52:41.153319 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Mar 19 11:52:41.155154 master-0 kubenswrapper[4036]: I0319 11:52:41.153860 4036 scope.go:117] "RemoveContainer" containerID="1596d61ab954fe05c17045a5f0656bd06f879c1529f23b7eb004e3ce610f3ceb" Mar 19 11:52:41.178858 master-0 kubenswrapper[4036]: I0319 11:52:41.178778 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" podStartSLOduration=6.16432456 podStartE2EDuration="11.178758541s" podCreationTimestamp="2026-03-19 11:52:30 +0000 UTC" 
firstStartedPulling="2026-03-19 11:52:35.003071898 +0000 UTC m=+40.555492483" lastFinishedPulling="2026-03-19 11:52:40.017505879 +0000 UTC m=+45.569926464" observedRunningTime="2026-03-19 11:52:41.178138303 +0000 UTC m=+46.730558908" watchObservedRunningTime="2026-03-19 11:52:41.178758541 +0000 UTC m=+46.731179126" Mar 19 11:52:41.487914 master-0 kubenswrapper[4036]: I0319 11:52:41.487775 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log" Mar 19 11:52:41.490987 master-0 kubenswrapper[4036]: I0319 11:52:41.490926 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"4d3040ecd2a7e633dfd078caf5a95302a7ae4f555f4e48340660df3b62be5004"} Mar 19 11:52:41.510072 master-0 kubenswrapper[4036]: I0319 11:52:41.509389 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=1.509347611 podStartE2EDuration="1.509347611s" podCreationTimestamp="2026-03-19 11:52:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:52:41.50931904 +0000 UTC m=+47.061739635" watchObservedRunningTime="2026-03-19 11:52:41.509347611 +0000 UTC m=+47.061768196" Mar 19 11:52:41.517375 master-0 kubenswrapper[4036]: I0319 11:52:41.517292 4036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:41.704185 master-0 kubenswrapper[4036]: I0319 11:52:41.703891 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-ca-bundle\") pod \"2452a909-6aa1-4750-9399-5945d380427a\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " Mar 19 11:52:41.704185 master-0 kubenswrapper[4036]: I0319 11:52:41.703979 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzfwq\" (UniqueName: \"kubernetes.io/projected/2452a909-6aa1-4750-9399-5945d380427a-kube-api-access-dzfwq\") pod \"2452a909-6aa1-4750-9399-5945d380427a\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " Mar 19 11:52:41.704185 master-0 kubenswrapper[4036]: I0319 11:52:41.703998 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-var-run-resolv-conf\") pod \"2452a909-6aa1-4750-9399-5945d380427a\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " Mar 19 11:52:41.704185 master-0 kubenswrapper[4036]: I0319 11:52:41.704015 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-sno-bootstrap-files\") pod \"2452a909-6aa1-4750-9399-5945d380427a\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " Mar 19 11:52:41.704185 master-0 kubenswrapper[4036]: I0319 11:52:41.704029 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-resolv-conf\") pod \"2452a909-6aa1-4750-9399-5945d380427a\" (UID: \"2452a909-6aa1-4750-9399-5945d380427a\") " Mar 19 11:52:41.704185 master-0 
kubenswrapper[4036]: I0319 11:52:41.704193 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "2452a909-6aa1-4750-9399-5945d380427a" (UID: "2452a909-6aa1-4750-9399-5945d380427a"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:41.704185 master-0 kubenswrapper[4036]: I0319 11:52:41.704245 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "2452a909-6aa1-4750-9399-5945d380427a" (UID: "2452a909-6aa1-4750-9399-5945d380427a"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:41.704981 master-0 kubenswrapper[4036]: I0319 11:52:41.704799 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "2452a909-6aa1-4750-9399-5945d380427a" (UID: "2452a909-6aa1-4750-9399-5945d380427a"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:41.705647 master-0 kubenswrapper[4036]: I0319 11:52:41.705523 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "2452a909-6aa1-4750-9399-5945d380427a" (UID: "2452a909-6aa1-4750-9399-5945d380427a"). InnerVolumeSpecName "sno-bootstrap-files". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:52:41.709263 master-0 kubenswrapper[4036]: I0319 11:52:41.709179 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2452a909-6aa1-4750-9399-5945d380427a-kube-api-access-dzfwq" (OuterVolumeSpecName: "kube-api-access-dzfwq") pod "2452a909-6aa1-4750-9399-5945d380427a" (UID: "2452a909-6aa1-4750-9399-5945d380427a"). InnerVolumeSpecName "kube-api-access-dzfwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:52:41.805305 master-0 kubenswrapper[4036]: I0319 11:52:41.805162 4036 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 19 11:52:41.805305 master-0 kubenswrapper[4036]: I0319 11:52:41.805206 4036 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 11:52:41.805305 master-0 kubenswrapper[4036]: I0319 11:52:41.805216 4036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzfwq\" (UniqueName: \"kubernetes.io/projected/2452a909-6aa1-4750-9399-5945d380427a-kube-api-access-dzfwq\") on node \"master-0\" DevicePath \"\"" Mar 19 11:52:41.805305 master-0 kubenswrapper[4036]: I0319 11:52:41.805226 4036 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Mar 19 11:52:41.805305 master-0 kubenswrapper[4036]: I0319 11:52:41.805235 4036 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2452a909-6aa1-4750-9399-5945d380427a-host-resolv-conf\") on node \"master-0\" DevicePath 
\"\"" Mar 19 11:52:42.208003 master-0 kubenswrapper[4036]: I0319 11:52:42.207538 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:42.208301 master-0 kubenswrapper[4036]: E0319 11:52:42.207760 4036 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:42.208301 master-0 kubenswrapper[4036]: E0319 11:52:42.208192 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:50.208155261 +0000 UTC m=+55.760575836 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:42.493670 master-0 kubenswrapper[4036]: I0319 11:52:42.493461 4036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6tsg" Mar 19 11:52:42.494574 master-0 kubenswrapper[4036]: I0319 11:52:42.493448 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-r6tsg" event={"ID":"2452a909-6aa1-4750-9399-5945d380427a","Type":"ContainerDied","Data":"cefa044e9278fa0f7e0364a8c119ab9567645a4d070ede851a0064c55299d2ff"} Mar 19 11:52:42.494574 master-0 kubenswrapper[4036]: I0319 11:52:42.493733 4036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cefa044e9278fa0f7e0364a8c119ab9567645a4d070ede851a0064c55299d2ff" Mar 19 11:52:43.066728 master-0 kubenswrapper[4036]: I0319 11:52:43.066593 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-bl5nl"] Mar 19 11:52:43.066728 master-0 kubenswrapper[4036]: E0319 11:52:43.066742 4036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2452a909-6aa1-4750-9399-5945d380427a" containerName="assisted-installer-controller" Mar 19 11:52:43.066728 master-0 kubenswrapper[4036]: I0319 11:52:43.066762 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="2452a909-6aa1-4750-9399-5945d380427a" containerName="assisted-installer-controller" Mar 19 11:52:43.066728 master-0 kubenswrapper[4036]: I0319 11:52:43.066798 4036 memory_manager.go:354] "RemoveStaleState removing state" podUID="2452a909-6aa1-4750-9399-5945d380427a" containerName="assisted-installer-controller" Mar 19 11:52:43.067572 master-0 kubenswrapper[4036]: I0319 11:52:43.067083 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-bl5nl" Mar 19 11:52:43.217626 master-0 kubenswrapper[4036]: I0319 11:52:43.217528 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75jh\" (UniqueName: \"kubernetes.io/projected/6568c544-a539-4d50-89f0-1705a5f63411-kube-api-access-z75jh\") pod \"mtu-prober-bl5nl\" (UID: \"6568c544-a539-4d50-89f0-1705a5f63411\") " pod="openshift-network-operator/mtu-prober-bl5nl" Mar 19 11:52:43.318622 master-0 kubenswrapper[4036]: I0319 11:52:43.318388 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z75jh\" (UniqueName: \"kubernetes.io/projected/6568c544-a539-4d50-89f0-1705a5f63411-kube-api-access-z75jh\") pod \"mtu-prober-bl5nl\" (UID: \"6568c544-a539-4d50-89f0-1705a5f63411\") " pod="openshift-network-operator/mtu-prober-bl5nl" Mar 19 11:52:43.342959 master-0 kubenswrapper[4036]: I0319 11:52:43.342866 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75jh\" (UniqueName: \"kubernetes.io/projected/6568c544-a539-4d50-89f0-1705a5f63411-kube-api-access-z75jh\") pod \"mtu-prober-bl5nl\" (UID: \"6568c544-a539-4d50-89f0-1705a5f63411\") " pod="openshift-network-operator/mtu-prober-bl5nl" Mar 19 11:52:43.385623 master-0 kubenswrapper[4036]: I0319 11:52:43.385482 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-bl5nl" Mar 19 11:52:43.497286 master-0 kubenswrapper[4036]: I0319 11:52:43.496922 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-bl5nl" event={"ID":"6568c544-a539-4d50-89f0-1705a5f63411","Type":"ContainerStarted","Data":"64eb4d527ff6202b4280c70b2f4cef7a3669bc686804b7cb047a8352c3664aff"} Mar 19 11:52:44.503462 master-0 kubenswrapper[4036]: I0319 11:52:44.503359 4036 generic.go:334] "Generic (PLEG): container finished" podID="6568c544-a539-4d50-89f0-1705a5f63411" containerID="f2ca0a192f7dfe6f6eb1479534201a334d6844184e4ca66097e40894d0ccce20" exitCode=0 Mar 19 11:52:44.503462 master-0 kubenswrapper[4036]: I0319 11:52:44.503443 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-bl5nl" event={"ID":"6568c544-a539-4d50-89f0-1705a5f63411","Type":"ContainerDied","Data":"f2ca0a192f7dfe6f6eb1479534201a334d6844184e4ca66097e40894d0ccce20"} Mar 19 11:52:45.519701 master-0 kubenswrapper[4036]: I0319 11:52:45.519653 4036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-bl5nl" Mar 19 11:52:45.532968 master-0 kubenswrapper[4036]: I0319 11:52:45.532918 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z75jh\" (UniqueName: \"kubernetes.io/projected/6568c544-a539-4d50-89f0-1705a5f63411-kube-api-access-z75jh\") pod \"6568c544-a539-4d50-89f0-1705a5f63411\" (UID: \"6568c544-a539-4d50-89f0-1705a5f63411\") " Mar 19 11:52:45.536424 master-0 kubenswrapper[4036]: I0319 11:52:45.536373 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6568c544-a539-4d50-89f0-1705a5f63411-kube-api-access-z75jh" (OuterVolumeSpecName: "kube-api-access-z75jh") pod "6568c544-a539-4d50-89f0-1705a5f63411" (UID: "6568c544-a539-4d50-89f0-1705a5f63411"). 
InnerVolumeSpecName "kube-api-access-z75jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:52:45.633234 master-0 kubenswrapper[4036]: I0319 11:52:45.633140 4036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z75jh\" (UniqueName: \"kubernetes.io/projected/6568c544-a539-4d50-89f0-1705a5f63411-kube-api-access-z75jh\") on node \"master-0\" DevicePath \"\"" Mar 19 11:52:46.509443 master-0 kubenswrapper[4036]: I0319 11:52:46.509388 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-bl5nl" event={"ID":"6568c544-a539-4d50-89f0-1705a5f63411","Type":"ContainerDied","Data":"64eb4d527ff6202b4280c70b2f4cef7a3669bc686804b7cb047a8352c3664aff"} Mar 19 11:52:46.509902 master-0 kubenswrapper[4036]: I0319 11:52:46.509855 4036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64eb4d527ff6202b4280c70b2f4cef7a3669bc686804b7cb047a8352c3664aff" Mar 19 11:52:46.509983 master-0 kubenswrapper[4036]: I0319 11:52:46.509446 4036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-bl5nl" Mar 19 11:52:48.076721 master-0 kubenswrapper[4036]: I0319 11:52:48.076468 4036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-bl5nl"] Mar 19 11:52:48.082665 master-0 kubenswrapper[4036]: I0319 11:52:48.082587 4036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-bl5nl"] Mar 19 11:52:49.316537 master-0 kubenswrapper[4036]: I0319 11:52:49.316452 4036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6568c544-a539-4d50-89f0-1705a5f63411" path="/var/lib/kubelet/pods/6568c544-a539-4d50-89f0-1705a5f63411/volumes" Mar 19 11:52:50.266978 master-0 kubenswrapper[4036]: I0319 11:52:50.266926 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:52:50.267326 master-0 kubenswrapper[4036]: E0319 11:52:50.267144 4036 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:50.267453 master-0 kubenswrapper[4036]: E0319 11:52:50.267441 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:06.267422873 +0000 UTC m=+71.819843458 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:52:52.980147 master-0 kubenswrapper[4036]: I0319 11:52:52.980069 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-p5wll"] Mar 19 11:52:52.980843 master-0 kubenswrapper[4036]: E0319 11:52:52.980170 4036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6568c544-a539-4d50-89f0-1705a5f63411" containerName="prober" Mar 19 11:52:52.980843 master-0 kubenswrapper[4036]: I0319 11:52:52.980184 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6568c544-a539-4d50-89f0-1705a5f63411" containerName="prober" Mar 19 11:52:52.980843 master-0 kubenswrapper[4036]: I0319 11:52:52.980211 4036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6568c544-a539-4d50-89f0-1705a5f63411" containerName="prober" Mar 19 11:52:52.980843 master-0 kubenswrapper[4036]: I0319 11:52:52.980453 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.983230 master-0 kubenswrapper[4036]: I0319 11:52:52.983204 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 11:52:52.983453 master-0 kubenswrapper[4036]: I0319 11:52:52.983428 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 11:52:52.983592 master-0 kubenswrapper[4036]: I0319 11:52:52.983548 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 11:52:52.983706 master-0 kubenswrapper[4036]: I0319 11:52:52.983687 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 11:52:52.988100 master-0 kubenswrapper[4036]: I0319 11:52:52.988066 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-netns\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988201 master-0 kubenswrapper[4036]: I0319 11:52:52.988101 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-multus\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988201 master-0 kubenswrapper[4036]: I0319 11:52:52.988126 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-system-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " 
pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988201 master-0 kubenswrapper[4036]: I0319 11:52:52.988147 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-bin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988201 master-0 kubenswrapper[4036]: I0319 11:52:52.988165 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-hostroot\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988201 master-0 kubenswrapper[4036]: I0319 11:52:52.988183 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-multus-certs\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988201 master-0 kubenswrapper[4036]: I0319 11:52:52.988200 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-os-release\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988423 master-0 kubenswrapper[4036]: I0319 11:52:52.988255 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-k8s-cni-cncf-io\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") 
" pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988423 master-0 kubenswrapper[4036]: I0319 11:52:52.988275 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-kubelet\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988423 master-0 kubenswrapper[4036]: I0319 11:52:52.988292 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-conf-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988423 master-0 kubenswrapper[4036]: I0319 11:52:52.988311 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cnibin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988423 master-0 kubenswrapper[4036]: I0319 11:52:52.988330 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-etc-kubernetes\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988423 master-0 kubenswrapper[4036]: I0319 11:52:52.988350 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " 
pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988586 master-0 kubenswrapper[4036]: I0319 11:52:52.988440 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-daemon-config\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988586 master-0 kubenswrapper[4036]: I0319 11:52:52.988487 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kf8c\" (UniqueName: \"kubernetes.io/projected/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-kube-api-access-9kf8c\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988586 master-0 kubenswrapper[4036]: I0319 11:52:52.988539 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cni-binary-copy\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:52.988586 master-0 kubenswrapper[4036]: I0319 11:52:52.988566 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-socket-dir-parent\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.089092 master-0 kubenswrapper[4036]: I0319 11:52:53.089033 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-kubelet\") pod \"multus-p5wll\" (UID: 
\"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.089428 master-0 kubenswrapper[4036]: I0319 11:52:53.089413 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-conf-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.089509 master-0 kubenswrapper[4036]: I0319 11:52:53.089496 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-etc-kubernetes\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.089588 master-0 kubenswrapper[4036]: I0319 11:52:53.089575 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cnibin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.089702 master-0 kubenswrapper[4036]: I0319 11:52:53.089688 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.089786 master-0 kubenswrapper[4036]: I0319 11:52:53.089774 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-daemon-config\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.089855 master-0 
kubenswrapper[4036]: I0319 11:52:53.089843 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kf8c\" (UniqueName: \"kubernetes.io/projected/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-kube-api-access-9kf8c\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.089924 master-0 kubenswrapper[4036]: I0319 11:52:53.089912 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cni-binary-copy\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090014 master-0 kubenswrapper[4036]: I0319 11:52:53.089982 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-socket-dir-parent\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090125 master-0 kubenswrapper[4036]: I0319 11:52:53.090072 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-socket-dir-parent\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090173 master-0 kubenswrapper[4036]: I0319 11:52:53.089497 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-conf-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090173 master-0 kubenswrapper[4036]: I0319 11:52:53.089881 4036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090173 master-0 kubenswrapper[4036]: I0319 11:52:53.089172 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-kubelet\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090284 master-0 kubenswrapper[4036]: I0319 11:52:53.089714 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cnibin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090284 master-0 kubenswrapper[4036]: I0319 11:52:53.089522 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-etc-kubernetes\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090284 master-0 kubenswrapper[4036]: I0319 11:52:53.090086 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-multus\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090284 master-0 kubenswrapper[4036]: I0319 11:52:53.090255 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-netns\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090391 master-0 kubenswrapper[4036]: I0319 11:52:53.090296 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-system-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090391 master-0 kubenswrapper[4036]: I0319 11:52:53.090318 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-netns\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090391 master-0 kubenswrapper[4036]: I0319 11:52:53.090327 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-os-release\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090391 master-0 kubenswrapper[4036]: I0319 11:52:53.090370 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-system-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090501 master-0 kubenswrapper[4036]: I0319 11:52:53.090398 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-bin\") pod \"multus-p5wll\" (UID: 
\"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090501 master-0 kubenswrapper[4036]: I0319 11:52:53.090405 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-os-release\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090501 master-0 kubenswrapper[4036]: I0319 11:52:53.090418 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-hostroot\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090501 master-0 kubenswrapper[4036]: I0319 11:52:53.090438 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-multus-certs\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090501 master-0 kubenswrapper[4036]: I0319 11:52:53.090452 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-bin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090501 master-0 kubenswrapper[4036]: I0319 11:52:53.090460 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-k8s-cni-cncf-io\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090501 
master-0 kubenswrapper[4036]: I0319 11:52:53.090497 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-k8s-cni-cncf-io\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090714 master-0 kubenswrapper[4036]: I0319 11:52:53.090498 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-hostroot\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090714 master-0 kubenswrapper[4036]: I0319 11:52:53.090523 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-multus-certs\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090714 master-0 kubenswrapper[4036]: I0319 11:52:53.090705 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cni-binary-copy\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090793 master-0 kubenswrapper[4036]: I0319 11:52:53.090706 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-daemon-config\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.090854 master-0 kubenswrapper[4036]: I0319 11:52:53.090839 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-multus\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.110811 master-0 kubenswrapper[4036]: I0319 11:52:53.110742 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kf8c\" (UniqueName: \"kubernetes.io/projected/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-kube-api-access-9kf8c\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.211304 master-0 kubenswrapper[4036]: I0319 11:52:53.211235 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-n5wqf"] Mar 19 11:52:53.211955 master-0 kubenswrapper[4036]: I0319 11:52:53.211922 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.214318 master-0 kubenswrapper[4036]: I0319 11:52:53.214287 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 11:52:53.214976 master-0 kubenswrapper[4036]: I0319 11:52:53.214927 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 11:52:53.291983 master-0 kubenswrapper[4036]: I0319 11:52:53.291617 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.291983 master-0 kubenswrapper[4036]: I0319 11:52:53.291709 4036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxm2c\" (UniqueName: \"kubernetes.io/projected/c479beae-6d34-4d49-81c8-39f6a53ff6ca-kube-api-access-xxm2c\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.291983 master-0 kubenswrapper[4036]: I0319 11:52:53.291751 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-os-release\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.291983 master-0 kubenswrapper[4036]: I0319 11:52:53.291774 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.291983 master-0 kubenswrapper[4036]: I0319 11:52:53.291806 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cnibin\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.291983 master-0 kubenswrapper[4036]: I0319 11:52:53.291831 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: 
\"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.292406 master-0 kubenswrapper[4036]: I0319 11:52:53.291963 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.292406 master-0 kubenswrapper[4036]: I0319 11:52:53.292050 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-system-cni-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.293750 master-0 kubenswrapper[4036]: I0319 11:52:53.293676 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-p5wll" Mar 19 11:52:53.305789 master-0 kubenswrapper[4036]: W0319 11:52:53.305716 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fed1dcc_6fc0_41ef_8236_e095dca4ddcc.slice/crio-46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b WatchSource:0}: Error finding container 46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b: Status 404 returned error can't find the container with id 46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.393475 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cnibin\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.393591 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.393626 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.393682 4036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-system-cni-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.393729 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.393759 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxm2c\" (UniqueName: \"kubernetes.io/projected/c479beae-6d34-4d49-81c8-39f6a53ff6ca-kube-api-access-xxm2c\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.393783 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.393810 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-os-release\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " 
pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.393895 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-os-release\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.393495 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cnibin\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.394243 master-0 kubenswrapper[4036]: I0319 11:52:53.394151 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.395836 master-0 kubenswrapper[4036]: I0319 11:52:53.395788 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.395986 master-0 kubenswrapper[4036]: I0319 11:52:53.395915 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-system-cni-dir\") pod 
\"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.396313 master-0 kubenswrapper[4036]: I0319 11:52:53.396253 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.397017 master-0 kubenswrapper[4036]: I0319 11:52:53.396967 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.412365 master-0 kubenswrapper[4036]: I0319 11:52:53.412309 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxm2c\" (UniqueName: \"kubernetes.io/projected/c479beae-6d34-4d49-81c8-39f6a53ff6ca-kube-api-access-xxm2c\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.525042 master-0 kubenswrapper[4036]: I0319 11:52:53.524921 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:52:53.533853 master-0 kubenswrapper[4036]: I0319 11:52:53.533801 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p5wll" event={"ID":"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc","Type":"ContainerStarted","Data":"46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b"} Mar 19 11:52:53.539180 master-0 kubenswrapper[4036]: W0319 11:52:53.539080 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc479beae_6d34_4d49_81c8_39f6a53ff6ca.slice/crio-cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379 WatchSource:0}: Error finding container cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379: Status 404 returned error can't find the container with id cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379 Mar 19 11:52:53.937693 master-0 kubenswrapper[4036]: I0319 11:52:53.937622 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-nkd77"] Mar 19 11:52:53.938192 master-0 kubenswrapper[4036]: I0319 11:52:53.938170 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:52:53.938279 master-0 kubenswrapper[4036]: E0319 11:52:53.938252 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:52:53.998620 master-0 kubenswrapper[4036]: I0319 11:52:53.998543 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgr4k\" (UniqueName: \"kubernetes.io/projected/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-kube-api-access-mgr4k\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:52:53.998620 master-0 kubenswrapper[4036]: I0319 11:52:53.998603 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:52:54.099532 master-0 kubenswrapper[4036]: I0319 11:52:54.099439 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgr4k\" (UniqueName: \"kubernetes.io/projected/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-kube-api-access-mgr4k\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:52:54.099532 master-0 kubenswrapper[4036]: I0319 11:52:54.099518 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:52:54.100279 master-0 kubenswrapper[4036]: E0319 11:52:54.100251 4036 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Mar 19 11:52:54.100418 master-0 kubenswrapper[4036]: E0319 11:52:54.100327 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:54.600309967 +0000 UTC m=+60.152730552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:54.132368 master-0 kubenswrapper[4036]: I0319 11:52:54.132303 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgr4k\" (UniqueName: \"kubernetes.io/projected/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-kube-api-access-mgr4k\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:52:54.537259 master-0 kubenswrapper[4036]: I0319 11:52:54.537187 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerStarted","Data":"cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379"} Mar 19 11:52:54.604776 master-0 kubenswrapper[4036]: I0319 11:52:54.604705 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:52:54.605030 master-0 kubenswrapper[4036]: E0319 11:52:54.604961 4036 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:54.605124 master-0 kubenswrapper[4036]: E0319 11:52:54.605098 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:55.60507298 +0000 UTC m=+61.157493565 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:55.615251 master-0 kubenswrapper[4036]: I0319 11:52:55.614901 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:52:55.619488 master-0 kubenswrapper[4036]: E0319 11:52:55.619433 4036 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:55.619673 master-0 kubenswrapper[4036]: E0319 11:52:55.619563 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:52:57.619532212 +0000 UTC m=+63.171952797 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:56.312201 master-0 kubenswrapper[4036]: I0319 11:52:56.312133 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:52:56.312586 master-0 kubenswrapper[4036]: E0319 11:52:56.312313 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:52:57.549133 master-0 kubenswrapper[4036]: I0319 11:52:57.548902 4036 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="7e880bfa7477300228e9f011e131c05d9e3d415f1d3d007729e1d9e7b578b4cc" exitCode=0 Mar 19 11:52:57.549133 master-0 kubenswrapper[4036]: I0319 11:52:57.548965 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"7e880bfa7477300228e9f011e131c05d9e3d415f1d3d007729e1d9e7b578b4cc"} Mar 19 11:52:57.633655 master-0 kubenswrapper[4036]: I0319 11:52:57.633527 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 
11:52:57.634018 master-0 kubenswrapper[4036]: E0319 11:52:57.633854 4036 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:57.634018 master-0 kubenswrapper[4036]: E0319 11:52:57.633981 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:01.633955015 +0000 UTC m=+67.186375600 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:52:58.312311 master-0 kubenswrapper[4036]: I0319 11:52:58.312251 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:52:58.312614 master-0 kubenswrapper[4036]: E0319 11:52:58.312438 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:00.312342 master-0 kubenswrapper[4036]: I0319 11:53:00.312209 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:00.313377 master-0 kubenswrapper[4036]: E0319 11:53:00.312390 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:01.666562 master-0 kubenswrapper[4036]: I0319 11:53:01.666488 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:01.667764 master-0 kubenswrapper[4036]: E0319 11:53:01.666657 4036 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:53:01.667764 master-0 kubenswrapper[4036]: E0319 11:53:01.666740 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:09.66672141 +0000 UTC m=+75.219141995 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:53:02.312558 master-0 kubenswrapper[4036]: I0319 11:53:02.312069 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:02.312960 master-0 kubenswrapper[4036]: E0319 11:53:02.312664 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:04.312044 master-0 kubenswrapper[4036]: I0319 11:53:04.311955 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:04.313559 master-0 kubenswrapper[4036]: E0319 11:53:04.312177 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:05.339377 master-0 kubenswrapper[4036]: I0319 11:53:05.339319 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"] Mar 19 11:53:05.339965 master-0 kubenswrapper[4036]: I0319 11:53:05.339731 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 11:53:05.343529 master-0 kubenswrapper[4036]: I0319 11:53:05.342281 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 11:53:05.343529 master-0 kubenswrapper[4036]: I0319 11:53:05.342340 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 11:53:05.343529 master-0 kubenswrapper[4036]: I0319 11:53:05.342564 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 11:53:05.343529 master-0 kubenswrapper[4036]: I0319 11:53:05.342622 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 11:53:05.350106 master-0 kubenswrapper[4036]: I0319 11:53:05.349844 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 11:53:05.399596 master-0 kubenswrapper[4036]: I0319 11:53:05.399396 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4khlg\" (UniqueName: \"kubernetes.io/projected/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-kube-api-access-4khlg\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 11:53:05.399596 master-0 kubenswrapper[4036]: I0319 11:53:05.399487 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 
11:53:05.399596 master-0 kubenswrapper[4036]: I0319 11:53:05.399517 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-env-overrides\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.403353 master-0 kubenswrapper[4036]: I0319 11:53:05.403288 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.504211 master-0 kubenswrapper[4036]: I0319 11:53:05.504125 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.504579 master-0 kubenswrapper[4036]: I0319 11:53:05.504555 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khlg\" (UniqueName: \"kubernetes.io/projected/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-kube-api-access-4khlg\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.504741 master-0 kubenswrapper[4036]: I0319 11:53:05.504720 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.504887 master-0 kubenswrapper[4036]: I0319 11:53:05.504871 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-env-overrides\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.505684 master-0 kubenswrapper[4036]: I0319 11:53:05.505665 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-env-overrides\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.506106 master-0 kubenswrapper[4036]: I0319 11:53:05.506044 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.509761 master-0 kubenswrapper[4036]: I0319 11:53:05.509718 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.531537 master-0 kubenswrapper[4036]: I0319 11:53:05.531462 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khlg\" (UniqueName: \"kubernetes.io/projected/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-kube-api-access-4khlg\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.550066 master-0 kubenswrapper[4036]: I0319 11:53:05.550007 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sr4bh"]
Mar 19 11:53:05.550898 master-0 kubenswrapper[4036]: I0319 11:53:05.550866 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.552919 master-0 kubenswrapper[4036]: I0319 11:53:05.552862 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 11:53:05.553452 master-0 kubenswrapper[4036]: I0319 11:53:05.553281 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 11:53:05.606059 master-0 kubenswrapper[4036]: I0319 11:53:05.605884 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-netns\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606059 master-0 kubenswrapper[4036]: I0319 11:53:05.605950 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-env-overrides\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606059 master-0 kubenswrapper[4036]: I0319 11:53:05.605971 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-etc-openvswitch\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606059 master-0 kubenswrapper[4036]: I0319 11:53:05.605990 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-var-lib-openvswitch\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606059 master-0 kubenswrapper[4036]: I0319 11:53:05.606011 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e058de7-df45-4cf4-a90a-149f90f891d6-ovn-node-metrics-cert\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606059 master-0 kubenswrapper[4036]: I0319 11:53:05.606031 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-slash\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606059 master-0 kubenswrapper[4036]: I0319 11:53:05.606067 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-openvswitch\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606084 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-node-log\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606107 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-bin\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606132 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-systemd-units\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606159 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606186 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-ovn\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606203 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-config\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606221 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm6xn\" (UniqueName: \"kubernetes.io/projected/6e058de7-df45-4cf4-a90a-149f90f891d6-kube-api-access-hm6xn\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606239 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-kubelet\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606292 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-log-socket\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606314 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606337 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-netd\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606354 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-systemd\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.606465 master-0 kubenswrapper[4036]: I0319 11:53:05.606374 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-script-lib\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.663078 master-0 kubenswrapper[4036]: I0319 11:53:05.662982 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:53:05.707577 master-0 kubenswrapper[4036]: I0319 11:53:05.707523 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-systemd\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.708993 master-0 kubenswrapper[4036]: I0319 11:53:05.707894 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-script-lib\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.708993 master-0 kubenswrapper[4036]: I0319 11:53:05.708812 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-script-lib\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.708993 master-0 kubenswrapper[4036]: I0319 11:53:05.708812 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-netns\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.708993 master-0 kubenswrapper[4036]: I0319 11:53:05.708858 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-env-overrides\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.708993 master-0 kubenswrapper[4036]: I0319 11:53:05.708855 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-netns\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.708993 master-0 kubenswrapper[4036]: I0319 11:53:05.708878 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-etc-openvswitch\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.708993 master-0 kubenswrapper[4036]: I0319 11:53:05.708911 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-var-lib-openvswitch\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.708993 master-0 kubenswrapper[4036]: I0319 11:53:05.707729 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-systemd\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709053 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-var-lib-openvswitch\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709102 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e058de7-df45-4cf4-a90a-149f90f891d6-ovn-node-metrics-cert\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709133 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-slash\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709173 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-openvswitch\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709191 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-node-log\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709181 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-etc-openvswitch\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709225 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-slash\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709242 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-openvswitch\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709268 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-node-log\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709309 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-bin\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709342 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-systemd-units\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709364 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709400 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-ovn\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709420 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-config\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709460 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm6xn\" (UniqueName: \"kubernetes.io/projected/6e058de7-df45-4cf4-a90a-149f90f891d6-kube-api-access-hm6xn\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709484 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-kubelet\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.709512 master-0 kubenswrapper[4036]: I0319 11:53:05.709519 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-log-socket\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709543 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709573 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-netd\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709663 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-netd\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709695 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-bin\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709718 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-systemd-units\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709729 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-env-overrides\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709756 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709779 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-kubelet\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709801 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-log-socket\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709820 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-ovn\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.710098 master-0 kubenswrapper[4036]: I0319 11:53:05.709815 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.711863 master-0 kubenswrapper[4036]: I0319 11:53:05.710426 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-config\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.713036 master-0 kubenswrapper[4036]: I0319 11:53:05.712848 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e058de7-df45-4cf4-a90a-149f90f891d6-ovn-node-metrics-cert\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.741322 master-0 kubenswrapper[4036]: I0319 11:53:05.741259 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm6xn\" (UniqueName: \"kubernetes.io/projected/6e058de7-df45-4cf4-a90a-149f90f891d6-kube-api-access-hm6xn\") pod \"ovnkube-node-sr4bh\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:05.864264 master-0 kubenswrapper[4036]: I0319 11:53:05.863781 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:06.312070 master-0 kubenswrapper[4036]: I0319 11:53:06.312014 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:53:06.312349 master-0 kubenswrapper[4036]: E0319 11:53:06.312167 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989"
Mar 19 11:53:06.318059 master-0 kubenswrapper[4036]: I0319 11:53:06.315381 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:53:06.318059 master-0 kubenswrapper[4036]: E0319 11:53:06.315691 4036 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 11:53:06.318059 master-0 kubenswrapper[4036]: E0319 11:53:06.315771 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:38.315744623 +0000 UTC m=+103.868165208 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found
Mar 19 11:53:07.453486 master-0 kubenswrapper[4036]: W0319 11:53:07.453378 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e058de7_df45_4cf4_a90a_149f90f891d6.slice/crio-87a26785d0f440b4da4ede0f74f0ae1d183f1c3c8fb96a9d3d4bbedaebbb47ec WatchSource:0}: Error finding container 87a26785d0f440b4da4ede0f74f0ae1d183f1c3c8fb96a9d3d4bbedaebbb47ec: Status 404 returned error can't find the container with id 87a26785d0f440b4da4ede0f74f0ae1d183f1c3c8fb96a9d3d4bbedaebbb47ec
Mar 19 11:53:07.576372 master-0 kubenswrapper[4036]: I0319 11:53:07.576269 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" event={"ID":"518e1c6f-7f46-4dda-84a3-49fec9fa1abe","Type":"ContainerStarted","Data":"05ec088c819ae1eeb5148aff6d70b93f3b441541316091762e2b98d617be204f"}
Mar 19 11:53:07.577875 master-0 kubenswrapper[4036]: I0319 11:53:07.577831 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerStarted","Data":"87a26785d0f440b4da4ede0f74f0ae1d183f1c3c8fb96a9d3d4bbedaebbb47ec"}
Mar 19 11:53:08.312399 master-0 kubenswrapper[4036]: I0319 11:53:08.312258 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:53:08.312621 master-0 kubenswrapper[4036]: E0319 11:53:08.312409 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989"
Mar 19 11:53:08.541206 master-0 kubenswrapper[4036]: I0319 11:53:08.541094 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-tz4qg"]
Mar 19 11:53:08.541930 master-0 kubenswrapper[4036]: I0319 11:53:08.541575 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:08.541930 master-0 kubenswrapper[4036]: E0319 11:53:08.541672 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9"
Mar 19 11:53:08.581914 master-0 kubenswrapper[4036]: I0319 11:53:08.581783 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" event={"ID":"518e1c6f-7f46-4dda-84a3-49fec9fa1abe","Type":"ContainerStarted","Data":"9285536db09b88ca539506b964929ed4bab9e9f5af82602c25ae73597e6b0465"}
Mar 19 11:53:08.584064 master-0 kubenswrapper[4036]: I0319 11:53:08.584036 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p5wll" event={"ID":"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc","Type":"ContainerStarted","Data":"4672e75211f24f87471dd8f82717ac1ac6c96e914525bedb7525f49512ca1f3b"}
Mar 19 11:53:08.587379 master-0 kubenswrapper[4036]: I0319 11:53:08.587327 4036 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="57e2b9e5df1e4dbc4f600dfa7e9acded55e04fe34916a6250fcf8bdcd8821c9c" exitCode=0
Mar 19 11:53:08.587379 master-0 kubenswrapper[4036]: I0319 11:53:08.587368 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"57e2b9e5df1e4dbc4f600dfa7e9acded55e04fe34916a6250fcf8bdcd8821c9c"}
Mar 19 11:53:08.605572 master-0 kubenswrapper[4036]: I0319 11:53:08.605479 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-p5wll" podStartSLOduration=2.372718714 podStartE2EDuration="16.605452934s" podCreationTimestamp="2026-03-19 11:52:52 +0000 UTC" firstStartedPulling="2026-03-19 11:52:53.308428207 +0000 UTC m=+58.860848792" lastFinishedPulling="2026-03-19 11:53:07.541162427 +0000 UTC m=+73.093583012" observedRunningTime="2026-03-19 11:53:08.604891329 +0000 UTC m=+74.157311924" watchObservedRunningTime="2026-03-19 11:53:08.605452934 +0000 UTC m=+74.157873519"
Mar 19 11:53:08.637924 master-0 kubenswrapper[4036]: I0319 11:53:08.637854 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:08.738615 master-0 kubenswrapper[4036]: I0319 11:53:08.738562 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:08.756934 master-0 kubenswrapper[4036]: E0319 11:53:08.756894 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 11:53:08.756934 master-0 kubenswrapper[4036]: E0319 11:53:08.756929 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 11:53:08.756934 master-0 kubenswrapper[4036]: E0319 11:53:08.756947 4036 projected.go:194] Error preparing data for projected volume kube-api-access-q9qkq for pod openshift-network-diagnostics/network-check-target-tz4qg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 11:53:08.757204 master-0 kubenswrapper[4036]: E0319 11:53:08.757015 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq podName:65b629a3-ea10-4517-b762-b18b22fab2a9 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:09.256998359 +0000 UTC m=+74.809418934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-q9qkq" (UniqueName: "kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq") pod "network-check-target-tz4qg" (UID: "65b629a3-ea10-4517-b762-b18b22fab2a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 11:53:09.343038 master-0 kubenswrapper[4036]: I0319 11:53:09.342970 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:09.343326 master-0 kubenswrapper[4036]: E0319 11:53:09.343209 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 11:53:09.343326 master-0 kubenswrapper[4036]: E0319 11:53:09.343241 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 11:53:09.343326 master-0 kubenswrapper[4036]: E0319 11:53:09.343257 4036 projected.go:194] Error preparing data for projected volume kube-api-access-q9qkq for pod openshift-network-diagnostics/network-check-target-tz4qg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19
11:53:09.343424 master-0 kubenswrapper[4036]: E0319 11:53:09.343331 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq podName:65b629a3-ea10-4517-b762-b18b22fab2a9 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:10.343308311 +0000 UTC m=+75.895728896 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-q9qkq" (UniqueName: "kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq") pod "network-check-target-tz4qg" (UID: "65b629a3-ea10-4517-b762-b18b22fab2a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:53:09.747527 master-0 kubenswrapper[4036]: I0319 11:53:09.747443 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:09.748256 master-0 kubenswrapper[4036]: E0319 11:53:09.747692 4036 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:53:09.748256 master-0 kubenswrapper[4036]: E0319 11:53:09.747770 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:25.747751284 +0000 UTC m=+91.300171869 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:53:10.312763 master-0 kubenswrapper[4036]: I0319 11:53:10.312521 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:10.312763 master-0 kubenswrapper[4036]: I0319 11:53:10.312558 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:10.312763 master-0 kubenswrapper[4036]: E0319 11:53:10.312731 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:10.313194 master-0 kubenswrapper[4036]: E0319 11:53:10.312877 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:10.353390 master-0 kubenswrapper[4036]: I0319 11:53:10.353291 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:10.353697 master-0 kubenswrapper[4036]: E0319 11:53:10.353573 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:53:10.353697 master-0 kubenswrapper[4036]: E0319 11:53:10.353655 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:53:10.353697 master-0 kubenswrapper[4036]: E0319 11:53:10.353674 4036 projected.go:194] Error preparing data for projected volume kube-api-access-q9qkq for pod openshift-network-diagnostics/network-check-target-tz4qg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:53:10.353825 master-0 kubenswrapper[4036]: E0319 11:53:10.353792 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq podName:65b629a3-ea10-4517-b762-b18b22fab2a9 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:12.353743877 +0000 UTC m=+77.906164512 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-q9qkq" (UniqueName: "kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq") pod "network-check-target-tz4qg" (UID: "65b629a3-ea10-4517-b762-b18b22fab2a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:53:11.142632 master-0 kubenswrapper[4036]: I0319 11:53:11.140418 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-47j68"] Mar 19 11:53:11.142632 master-0 kubenswrapper[4036]: I0319 11:53:11.141445 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.144746 master-0 kubenswrapper[4036]: I0319 11:53:11.144663 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 11:53:11.145402 master-0 kubenswrapper[4036]: I0319 11:53:11.144950 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 11:53:11.145402 master-0 kubenswrapper[4036]: I0319 11:53:11.144948 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 11:53:11.145832 master-0 kubenswrapper[4036]: I0319 11:53:11.145714 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 11:53:11.146869 master-0 kubenswrapper[4036]: I0319 11:53:11.146829 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 11:53:11.159908 master-0 kubenswrapper[4036]: I0319 11:53:11.159702 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-ovnkube-identity-cm\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.159908 master-0 kubenswrapper[4036]: I0319 11:53:11.159755 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8mds\" (UniqueName: \"kubernetes.io/projected/51e2c1d1-a50f-4a0d-8273-440e699b3907-kube-api-access-j8mds\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.159908 master-0 kubenswrapper[4036]: I0319 11:53:11.159798 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-env-overrides\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.159908 master-0 kubenswrapper[4036]: I0319 11:53:11.159827 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51e2c1d1-a50f-4a0d-8273-440e699b3907-webhook-cert\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.264262 master-0 kubenswrapper[4036]: I0319 11:53:11.260659 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-ovnkube-identity-cm\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " 
pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.264262 master-0 kubenswrapper[4036]: I0319 11:53:11.260833 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8mds\" (UniqueName: \"kubernetes.io/projected/51e2c1d1-a50f-4a0d-8273-440e699b3907-kube-api-access-j8mds\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.264262 master-0 kubenswrapper[4036]: I0319 11:53:11.260874 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-env-overrides\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.264262 master-0 kubenswrapper[4036]: I0319 11:53:11.260913 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51e2c1d1-a50f-4a0d-8273-440e699b3907-webhook-cert\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.264262 master-0 kubenswrapper[4036]: I0319 11:53:11.263382 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-env-overrides\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.264262 master-0 kubenswrapper[4036]: I0319 11:53:11.264190 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-ovnkube-identity-cm\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.270748 master-0 kubenswrapper[4036]: I0319 11:53:11.270437 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51e2c1d1-a50f-4a0d-8273-440e699b3907-webhook-cert\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.282953 master-0 kubenswrapper[4036]: I0319 11:53:11.282885 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8mds\" (UniqueName: \"kubernetes.io/projected/51e2c1d1-a50f-4a0d-8273-440e699b3907-kube-api-access-j8mds\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.458715 master-0 kubenswrapper[4036]: I0319 11:53:11.458571 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:53:11.471941 master-0 kubenswrapper[4036]: W0319 11:53:11.471898 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e2c1d1_a50f_4a0d_8273_440e699b3907.slice/crio-a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f WatchSource:0}: Error finding container a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f: Status 404 returned error can't find the container with id a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f Mar 19 11:53:11.599561 master-0 kubenswrapper[4036]: I0319 11:53:11.599499 4036 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="20f4956468238d356cba6c4e8281a37386fede658fd1e93306b89426a80c8130" exitCode=0 Mar 19 11:53:11.599870 master-0 kubenswrapper[4036]: I0319 11:53:11.599584 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"20f4956468238d356cba6c4e8281a37386fede658fd1e93306b89426a80c8130"} Mar 19 11:53:11.602176 master-0 kubenswrapper[4036]: I0319 11:53:11.602144 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerStarted","Data":"a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f"} Mar 19 11:53:12.312027 master-0 kubenswrapper[4036]: I0319 11:53:12.311955 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:12.312876 master-0 kubenswrapper[4036]: I0319 11:53:12.312012 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:12.312876 master-0 kubenswrapper[4036]: E0319 11:53:12.312232 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:12.312876 master-0 kubenswrapper[4036]: E0319 11:53:12.312383 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:12.330202 master-0 kubenswrapper[4036]: W0319 11:53:12.330146 4036 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 19 11:53:12.330514 master-0 kubenswrapper[4036]: I0319 11:53:12.330305 
4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 19 11:53:12.371137 master-0 kubenswrapper[4036]: I0319 11:53:12.371014 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:12.371441 master-0 kubenswrapper[4036]: E0319 11:53:12.371288 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:53:12.371441 master-0 kubenswrapper[4036]: E0319 11:53:12.371341 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:53:12.371441 master-0 kubenswrapper[4036]: E0319 11:53:12.371361 4036 projected.go:194] Error preparing data for projected volume kube-api-access-q9qkq for pod openshift-network-diagnostics/network-check-target-tz4qg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:53:12.371441 master-0 kubenswrapper[4036]: E0319 11:53:12.371440 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq podName:65b629a3-ea10-4517-b762-b18b22fab2a9 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:16.371418304 +0000 UTC m=+81.923838889 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-q9qkq" (UniqueName: "kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq") pod "network-check-target-tz4qg" (UID: "65b629a3-ea10-4517-b762-b18b22fab2a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:53:14.312283 master-0 kubenswrapper[4036]: I0319 11:53:14.312207 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:14.313288 master-0 kubenswrapper[4036]: I0319 11:53:14.312283 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:14.313288 master-0 kubenswrapper[4036]: E0319 11:53:14.312370 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:14.313288 master-0 kubenswrapper[4036]: E0319 11:53:14.312468 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:16.035087 master-0 kubenswrapper[4036]: I0319 11:53:16.034949 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=4.034904177 podStartE2EDuration="4.034904177s" podCreationTimestamp="2026-03-19 11:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:16.03465423 +0000 UTC m=+81.587074825" watchObservedRunningTime="2026-03-19 11:53:16.034904177 +0000 UTC m=+81.587324762" Mar 19 11:53:16.312457 master-0 kubenswrapper[4036]: I0319 11:53:16.312226 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:16.312457 master-0 kubenswrapper[4036]: I0319 11:53:16.312409 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:16.312788 master-0 kubenswrapper[4036]: E0319 11:53:16.312542 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:16.312788 master-0 kubenswrapper[4036]: E0319 11:53:16.312737 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:16.407649 master-0 kubenswrapper[4036]: I0319 11:53:16.407156 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:16.407649 master-0 kubenswrapper[4036]: E0319 11:53:16.407339 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:53:16.407649 master-0 kubenswrapper[4036]: E0319 11:53:16.407359 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:53:16.407649 master-0 kubenswrapper[4036]: E0319 11:53:16.407372 4036 projected.go:194] Error preparing data for projected volume kube-api-access-q9qkq for pod openshift-network-diagnostics/network-check-target-tz4qg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:53:16.407649 master-0 kubenswrapper[4036]: E0319 11:53:16.407432 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq podName:65b629a3-ea10-4517-b762-b18b22fab2a9 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:24.407417784 +0000 UTC m=+89.959838369 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-q9qkq" (UniqueName: "kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq") pod "network-check-target-tz4qg" (UID: "65b629a3-ea10-4517-b762-b18b22fab2a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:53:17.620440 master-0 kubenswrapper[4036]: I0319 11:53:17.620341 4036 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="ff56157a2c5ab60201123fe865c3891754204ecbb4e8ba10184f66ac03510978" exitCode=0 Mar 19 11:53:17.620440 master-0 kubenswrapper[4036]: I0319 11:53:17.620393 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"ff56157a2c5ab60201123fe865c3891754204ecbb4e8ba10184f66ac03510978"} Mar 19 11:53:18.312235 master-0 kubenswrapper[4036]: I0319 11:53:18.312164 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:18.312235 master-0 kubenswrapper[4036]: I0319 11:53:18.312224 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:18.312544 master-0 kubenswrapper[4036]: E0319 11:53:18.312417 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:18.312759 master-0 kubenswrapper[4036]: E0319 11:53:18.312674 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:18.484899 master-0 kubenswrapper[4036]: I0319 11:53:18.484242 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 19 11:53:20.124006 master-0 kubenswrapper[4036]: I0319 11:53:20.123756 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=2.123731828 podStartE2EDuration="2.123731828s" podCreationTimestamp="2026-03-19 11:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:20.123345977 +0000 UTC m=+85.675766572" watchObservedRunningTime="2026-03-19 11:53:20.123731828 +0000 UTC m=+85.676152413" Mar 19 11:53:20.312128 master-0 kubenswrapper[4036]: I0319 11:53:20.312055 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:20.312439 master-0 kubenswrapper[4036]: I0319 11:53:20.312130 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:20.312439 master-0 kubenswrapper[4036]: E0319 11:53:20.312208 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:20.316744 master-0 kubenswrapper[4036]: E0319 11:53:20.312297 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:22.312573 master-0 kubenswrapper[4036]: I0319 11:53:22.312508 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:22.313425 master-0 kubenswrapper[4036]: I0319 11:53:22.312601 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:22.313425 master-0 kubenswrapper[4036]: E0319 11:53:22.312703 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:22.313425 master-0 kubenswrapper[4036]: E0319 11:53:22.312784 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:22.324774 master-0 kubenswrapper[4036]: I0319 11:53:22.324673 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 19 11:53:24.311804 master-0 kubenswrapper[4036]: I0319 11:53:24.311733 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:24.311804 master-0 kubenswrapper[4036]: I0319 11:53:24.311803 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:24.312685 master-0 kubenswrapper[4036]: E0319 11:53:24.311889 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:24.312685 master-0 kubenswrapper[4036]: E0319 11:53:24.312090 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:24.413180 master-0 kubenswrapper[4036]: I0319 11:53:24.413099 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:24.413507 master-0 kubenswrapper[4036]: E0319 11:53:24.413350 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 11:53:24.413507 master-0 kubenswrapper[4036]: E0319 11:53:24.413437 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 11:53:24.413507 master-0 kubenswrapper[4036]: E0319 11:53:24.413453 4036 projected.go:194] Error preparing data for projected volume kube-api-access-q9qkq for pod openshift-network-diagnostics/network-check-target-tz4qg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:53:24.413721 master-0 kubenswrapper[4036]: E0319 11:53:24.413532 4036 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq podName:65b629a3-ea10-4517-b762-b18b22fab2a9 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:40.413514361 +0000 UTC m=+105.965934946 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-q9qkq" (UniqueName: "kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq") pod "network-check-target-tz4qg" (UID: "65b629a3-ea10-4517-b762-b18b22fab2a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 11:53:25.338808 master-0 kubenswrapper[4036]: I0319 11:53:25.338731 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 11:53:25.825778 master-0 kubenswrapper[4036]: I0319 11:53:25.825350 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:25.825778 master-0 kubenswrapper[4036]: E0319 11:53:25.825545 4036 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:53:25.825778 master-0 kubenswrapper[4036]: E0319 11:53:25.825621 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:57.825599395 +0000 UTC m=+123.378019980 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 11:53:26.312560 master-0 kubenswrapper[4036]: I0319 11:53:26.312471 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:26.312945 master-0 kubenswrapper[4036]: E0319 11:53:26.312649 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:26.313180 master-0 kubenswrapper[4036]: I0319 11:53:26.313136 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:26.313291 master-0 kubenswrapper[4036]: E0319 11:53:26.313269 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:27.001418 master-0 kubenswrapper[4036]: I0319 11:53:27.001334 4036 generic.go:334] "Generic (PLEG): container finished" podID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerID="153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0" exitCode=0 Mar 19 11:53:27.002570 master-0 kubenswrapper[4036]: I0319 11:53:27.001509 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerDied","Data":"153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0"} Mar 19 11:53:27.003753 master-0 kubenswrapper[4036]: I0319 11:53:27.003711 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" event={"ID":"518e1c6f-7f46-4dda-84a3-49fec9fa1abe","Type":"ContainerStarted","Data":"f7738588625037912affdf5435640d17413e9cd9036722fe4bd36750bce1dc90"} Mar 19 11:53:27.025531 master-0 kubenswrapper[4036]: I0319 11:53:27.025326 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=5.025273643 podStartE2EDuration="5.025273643s" podCreationTimestamp="2026-03-19 11:53:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:25.340011111 +0000 UTC m=+90.892431706" watchObservedRunningTime="2026-03-19 11:53:27.025273643 +0000 UTC m=+92.577694238" Mar 19 11:53:27.055809 master-0 kubenswrapper[4036]: I0319 11:53:27.048712 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=2.048687886 podStartE2EDuration="2.048687886s" podCreationTimestamp="2026-03-19 11:53:25 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:27.044478791 +0000 UTC m=+92.596899396" watchObservedRunningTime="2026-03-19 11:53:27.048687886 +0000 UTC m=+92.601108471" Mar 19 11:53:28.012790 master-0 kubenswrapper[4036]: I0319 11:53:28.012219 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerStarted","Data":"fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1"} Mar 19 11:53:28.012790 master-0 kubenswrapper[4036]: I0319 11:53:28.012760 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerStarted","Data":"c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844"} Mar 19 11:53:28.012790 master-0 kubenswrapper[4036]: I0319 11:53:28.012774 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerStarted","Data":"1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f"} Mar 19 11:53:28.012790 master-0 kubenswrapper[4036]: I0319 11:53:28.012785 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerStarted","Data":"c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541"} Mar 19 11:53:28.012790 master-0 kubenswrapper[4036]: I0319 11:53:28.012799 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerStarted","Data":"c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4"} Mar 19 11:53:28.012790 master-0 kubenswrapper[4036]: I0319 11:53:28.012810 
4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerStarted","Data":"b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562"} Mar 19 11:53:28.312505 master-0 kubenswrapper[4036]: I0319 11:53:28.311833 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:28.312505 master-0 kubenswrapper[4036]: E0319 11:53:28.312501 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:28.313180 master-0 kubenswrapper[4036]: I0319 11:53:28.313148 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:28.313264 master-0 kubenswrapper[4036]: E0319 11:53:28.313233 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:30.311850 master-0 kubenswrapper[4036]: I0319 11:53:30.311790 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:30.312570 master-0 kubenswrapper[4036]: I0319 11:53:30.311935 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:30.312570 master-0 kubenswrapper[4036]: E0319 11:53:30.311949 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:30.312570 master-0 kubenswrapper[4036]: E0319 11:53:30.312115 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:31.022747 master-0 kubenswrapper[4036]: I0319 11:53:31.022694 4036 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="c0d30e67fd289a28538ef92cb1f3d25d05768acb2306cf21bc6d90958cdbdf09" exitCode=0 Mar 19 11:53:31.023024 master-0 kubenswrapper[4036]: I0319 11:53:31.022776 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"c0d30e67fd289a28538ef92cb1f3d25d05768acb2306cf21bc6d90958cdbdf09"} Mar 19 11:53:31.027861 master-0 kubenswrapper[4036]: I0319 11:53:31.027812 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerStarted","Data":"d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a"} Mar 19 11:53:31.048311 
master-0 kubenswrapper[4036]: I0319 11:53:31.048207 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" podStartSLOduration=6.897680544 podStartE2EDuration="26.048175292s" podCreationTimestamp="2026-03-19 11:53:05 +0000 UTC" firstStartedPulling="2026-03-19 11:53:07.676198618 +0000 UTC m=+73.228619203" lastFinishedPulling="2026-03-19 11:53:26.826693366 +0000 UTC m=+92.379113951" observedRunningTime="2026-03-19 11:53:27.071164824 +0000 UTC m=+92.623585419" watchObservedRunningTime="2026-03-19 11:53:31.048175292 +0000 UTC m=+96.600595877" Mar 19 11:53:32.033842 master-0 kubenswrapper[4036]: I0319 11:53:32.033754 4036 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="61ceef49c4b3e3c92de2f92ba07df5cc9f3437ed0a986fcdd8cd9a85e9afbe78" exitCode=0 Mar 19 11:53:32.033842 master-0 kubenswrapper[4036]: I0319 11:53:32.033832 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"61ceef49c4b3e3c92de2f92ba07df5cc9f3437ed0a986fcdd8cd9a85e9afbe78"} Mar 19 11:53:32.036252 master-0 kubenswrapper[4036]: I0319 11:53:32.036221 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerStarted","Data":"52b485f2005783105cdf9d2569a5312d319a7c33f7fd721f2c4707505569003a"} Mar 19 11:53:32.036328 master-0 kubenswrapper[4036]: I0319 11:53:32.036259 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerStarted","Data":"02574594c73f86272f0873c80450519dc1df034fda7734dbd2138fffd21a3c3b"} Mar 19 11:53:32.072073 master-0 kubenswrapper[4036]: I0319 
11:53:32.071957 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-47j68" podStartSLOduration=1.539670927 podStartE2EDuration="21.071926465s" podCreationTimestamp="2026-03-19 11:53:11 +0000 UTC" firstStartedPulling="2026-03-19 11:53:11.475185625 +0000 UTC m=+77.027606210" lastFinishedPulling="2026-03-19 11:53:31.007441163 +0000 UTC m=+96.559861748" observedRunningTime="2026-03-19 11:53:32.071862733 +0000 UTC m=+97.624283338" watchObservedRunningTime="2026-03-19 11:53:32.071926465 +0000 UTC m=+97.624347060" Mar 19 11:53:32.312376 master-0 kubenswrapper[4036]: I0319 11:53:32.312200 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:32.312376 master-0 kubenswrapper[4036]: I0319 11:53:32.312266 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:32.312627 master-0 kubenswrapper[4036]: E0319 11:53:32.312384 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:32.312627 master-0 kubenswrapper[4036]: E0319 11:53:32.312478 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:33.043398 master-0 kubenswrapper[4036]: I0319 11:53:33.043331 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerStarted","Data":"30addaa4982d26bdfb258a1bbf60fe2e386d618f7ada4f3702760789c65504d1"} Mar 19 11:53:33.051316 master-0 kubenswrapper[4036]: I0319 11:53:33.051279 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerStarted","Data":"8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637"} Mar 19 11:53:33.051692 master-0 kubenswrapper[4036]: I0319 11:53:33.051657 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" Mar 19 11:53:33.051782 master-0 kubenswrapper[4036]: I0319 11:53:33.051708 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" Mar 19 11:53:33.065368 master-0 kubenswrapper[4036]: I0319 11:53:33.065253 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" podStartSLOduration=3.338951239 podStartE2EDuration="40.065221112s" podCreationTimestamp="2026-03-19 11:52:53 +0000 UTC" firstStartedPulling="2026-03-19 11:52:53.541982258 +0000 UTC m=+59.094402843" lastFinishedPulling="2026-03-19 11:53:30.268252141 +0000 UTC m=+95.820672716" observedRunningTime="2026-03-19 11:53:33.061486349 +0000 UTC m=+98.613906944" watchObservedRunningTime="2026-03-19 11:53:33.065221112 +0000 UTC m=+98.617641697" Mar 19 11:53:33.136476 master-0 kubenswrapper[4036]: I0319 11:53:33.136412 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" Mar 19 11:53:33.163792 master-0 kubenswrapper[4036]: I0319 11:53:33.163714 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" podStartSLOduration=8.815379194 podStartE2EDuration="28.163685278s" podCreationTimestamp="2026-03-19 11:53:05 +0000 UTC" firstStartedPulling="2026-03-19 11:53:07.457656592 +0000 UTC m=+73.010077177" lastFinishedPulling="2026-03-19 11:53:26.805962676 +0000 UTC m=+92.358383261" observedRunningTime="2026-03-19 11:53:33.137962511 +0000 UTC m=+98.690383116" watchObservedRunningTime="2026-03-19 11:53:33.163685278 +0000 UTC m=+98.716105863" Mar 19 11:53:34.060041 master-0 kubenswrapper[4036]: I0319 11:53:34.059973 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" Mar 19 11:53:34.086373 master-0 kubenswrapper[4036]: I0319 11:53:34.086301 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" Mar 19 11:53:34.312025 master-0 kubenswrapper[4036]: I0319 11:53:34.311871 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:34.312221 master-0 kubenswrapper[4036]: E0319 11:53:34.312054 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:34.312349 master-0 kubenswrapper[4036]: I0319 11:53:34.312309 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:34.312555 master-0 kubenswrapper[4036]: E0319 11:53:34.312529 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:34.850771 master-0 kubenswrapper[4036]: I0319 11:53:34.850716 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tz4qg"] Mar 19 11:53:34.890516 master-0 kubenswrapper[4036]: I0319 11:53:34.890429 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nkd77"] Mar 19 11:53:35.062950 master-0 kubenswrapper[4036]: I0319 11:53:35.062872 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:35.063460 master-0 kubenswrapper[4036]: I0319 11:53:35.062977 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:35.063460 master-0 kubenswrapper[4036]: E0319 11:53:35.063100 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:35.063861 master-0 kubenswrapper[4036]: E0319 11:53:35.063819 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:37.182519 master-0 kubenswrapper[4036]: I0319 11:53:37.181419 4036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sr4bh"] Mar 19 11:53:37.182519 master-0 kubenswrapper[4036]: I0319 11:53:37.181890 4036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovn-controller" containerID="cri-o://b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562" gracePeriod=30 Mar 19 11:53:37.182519 master-0 kubenswrapper[4036]: I0319 11:53:37.181985 4036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="nbdb" containerID="cri-o://fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1" gracePeriod=30 Mar 19 11:53:37.182519 master-0 kubenswrapper[4036]: I0319 11:53:37.182045 4036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="kube-rbac-proxy-node" containerID="cri-o://c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541" gracePeriod=30 Mar 19 11:53:37.182519 master-0 kubenswrapper[4036]: I0319 11:53:37.182015 4036 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f" gracePeriod=30 Mar 19 11:53:37.182519 master-0 kubenswrapper[4036]: I0319 11:53:37.182102 4036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovn-acl-logging" containerID="cri-o://c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4" gracePeriod=30 Mar 19 11:53:37.182519 master-0 kubenswrapper[4036]: I0319 11:53:37.182045 4036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="northd" containerID="cri-o://c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844" gracePeriod=30 Mar 19 11:53:37.182519 master-0 kubenswrapper[4036]: I0319 11:53:37.182252 4036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="sbdb" containerID="cri-o://d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a" gracePeriod=30 Mar 19 11:53:37.197236 master-0 kubenswrapper[4036]: I0319 11:53:37.197151 4036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovnkube-controller" containerID="cri-o://8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637" gracePeriod=30 Mar 19 11:53:37.199698 master-0 kubenswrapper[4036]: I0319 11:53:37.199086 4036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" 
podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovnkube-controller" probeResult="failure" output="" Mar 19 11:53:37.312710 master-0 kubenswrapper[4036]: I0319 11:53:37.312488 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:37.312710 master-0 kubenswrapper[4036]: I0319 11:53:37.312549 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:37.312710 master-0 kubenswrapper[4036]: E0319 11:53:37.312690 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:37.313086 master-0 kubenswrapper[4036]: E0319 11:53:37.312824 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989"
Mar 19 11:53:38.078093 master-0 kubenswrapper[4036]: I0319 11:53:38.077775 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/kube-rbac-proxy-ovn-metrics/0.log"
Mar 19 11:53:38.078562 master-0 kubenswrapper[4036]: I0319 11:53:38.078531 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/kube-rbac-proxy-node/0.log"
Mar 19 11:53:38.078916 master-0 kubenswrapper[4036]: I0319 11:53:38.078883 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/ovn-acl-logging/0.log"
Mar 19 11:53:38.079549 master-0 kubenswrapper[4036]: I0319 11:53:38.079500 4036 generic.go:334] "Generic (PLEG): container finished" podID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerID="fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1" exitCode=0
Mar 19 11:53:38.079549 master-0 kubenswrapper[4036]: I0319 11:53:38.079531 4036 generic.go:334] "Generic (PLEG): container finished" podID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerID="c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844" exitCode=0
Mar 19 11:53:38.079549 master-0 kubenswrapper[4036]: I0319 11:53:38.079538 4036 generic.go:334] "Generic (PLEG): container finished" podID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerID="1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f" exitCode=143
Mar 19 11:53:38.079549 master-0 kubenswrapper[4036]: I0319 11:53:38.079545 4036 generic.go:334] "Generic (PLEG): container finished" podID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerID="c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541" exitCode=143
Mar 19 11:53:38.079549 master-0 kubenswrapper[4036]: I0319 11:53:38.079553 4036 generic.go:334] "Generic (PLEG): container finished" podID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerID="c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4" exitCode=143
Mar 19 11:53:38.079755 master-0 kubenswrapper[4036]: I0319 11:53:38.079577 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerDied","Data":"fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1"}
Mar 19 11:53:38.079755 master-0 kubenswrapper[4036]: I0319 11:53:38.079609 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerDied","Data":"c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844"}
Mar 19 11:53:38.079755 master-0 kubenswrapper[4036]: I0319 11:53:38.079620 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerDied","Data":"1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f"}
Mar 19 11:53:38.079755 master-0 kubenswrapper[4036]: I0319 11:53:38.079648 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerDied","Data":"c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541"}
Mar 19 11:53:38.079755 master-0 kubenswrapper[4036]: I0319 11:53:38.079659 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerDied","Data":"c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4"}
Mar 19 11:53:38.413686 master-0 kubenswrapper[4036]: I0319 11:53:38.413586 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:53:38.414394 master-0 kubenswrapper[4036]: E0319 11:53:38.413875 4036 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 11:53:38.414394 master-0 kubenswrapper[4036]: E0319 11:53:38.414001 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:42.413969044 +0000 UTC m=+167.966389799 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found
Mar 19 11:53:39.087183 master-0 kubenswrapper[4036]: I0319 11:53:39.087015 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/kube-rbac-proxy-ovn-metrics/0.log"
Mar 19 11:53:39.087499 master-0 kubenswrapper[4036]: I0319 11:53:39.087472 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/kube-rbac-proxy-node/0.log"
Mar 19 11:53:39.088227 master-0 kubenswrapper[4036]: I0319 11:53:39.088189 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/ovn-acl-logging/0.log"
Mar 19 11:53:39.088764 master-0 kubenswrapper[4036]: I0319 11:53:39.088736 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/ovn-controller/0.log"
Mar 19 11:53:39.089260 master-0 kubenswrapper[4036]: I0319 11:53:39.089206 4036 generic.go:334] "Generic (PLEG): container finished" podID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerID="d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a" exitCode=0
Mar 19 11:53:39.089260 master-0 kubenswrapper[4036]: I0319 11:53:39.089239 4036 generic.go:334] "Generic (PLEG): container finished" podID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerID="b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562" exitCode=143
Mar 19 11:53:39.089372 master-0 kubenswrapper[4036]: I0319 11:53:39.089278 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerDied","Data":"d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a"}
Mar 19 11:53:39.089372 master-0 kubenswrapper[4036]: I0319 11:53:39.089319 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerDied","Data":"b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562"}
Mar 19 11:53:39.312660 master-0 kubenswrapper[4036]: I0319 11:53:39.312378 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:39.312660 master-0 kubenswrapper[4036]: I0319 11:53:39.312385 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:53:39.312660 master-0 kubenswrapper[4036]: E0319 11:53:39.312530 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9"
Mar 19 11:53:39.312886 master-0 kubenswrapper[4036]: E0319 11:53:39.312812 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989"
Mar 19 11:53:39.692353 master-0 kubenswrapper[4036]: I0319 11:53:39.692298 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/ovnkube-controller/0.log"
Mar 19 11:53:39.693880 master-0 kubenswrapper[4036]: I0319 11:53:39.693846 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/kube-rbac-proxy-ovn-metrics/0.log"
Mar 19 11:53:39.694354 master-0 kubenswrapper[4036]: I0319 11:53:39.694316 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/kube-rbac-proxy-node/0.log"
Mar 19 11:53:39.694774 master-0 kubenswrapper[4036]: I0319 11:53:39.694740 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/ovn-acl-logging/0.log"
Mar 19 11:53:39.695511 master-0 kubenswrapper[4036]: I0319 11:53:39.695470 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/ovn-controller/0.log"
Mar 19 11:53:39.695986 master-0 kubenswrapper[4036]: I0319 11:53:39.695953 4036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.828906 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-var-lib-openvswitch\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.828965 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-netd\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829008 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-script-lib\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829060 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-systemd-units\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829085 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm6xn\" (UniqueName: \"kubernetes.io/projected/6e058de7-df45-4cf4-a90a-149f90f891d6-kube-api-access-hm6xn\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829078 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829107 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-node-log\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829225 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-config\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829270 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-env-overrides\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829322 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-log-socket\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829350 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-etc-openvswitch\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829573 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-ovn\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829600 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-kubelet\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829135 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829148 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-node-log" (OuterVolumeSpecName: "node-log") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829179 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.830838 master-0 kubenswrapper[4036]: I0319 11:53:39.829726 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-log-socket" (OuterVolumeSpecName: "log-socket") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.829800 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.829753 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.829834 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e058de7-df45-4cf4-a90a-149f90f891d6-ovn-node-metrics-cert\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.829810 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.829911 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.829871 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-ovn-kubernetes\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.830005 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-slash\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.830030 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-openvswitch\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.830060 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.830064 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-slash" (OuterVolumeSpecName: "host-slash") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.830063 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.830084 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-systemd\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.830136 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-netns\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.830162 4036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-bin\") pod \"6e058de7-df45-4cf4-a90a-149f90f891d6\" (UID: \"6e058de7-df45-4cf4-a90a-149f90f891d6\") "
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.830423 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:53:39.831452 master-0 kubenswrapper[4036]: I0319 11:53:39.830451 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830431 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830509 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830485 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830708 4036 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-etc-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830726 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830738 4036 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830827 4036 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-kubelet\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830844 4036 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830862 4036 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-slash\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830875 4036 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830887 4036 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-bin\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830904 4036 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-run-netns\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830917 4036 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830932 4036 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-host-cni-netd\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830943 4036 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-systemd-units\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830956 4036 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830968 4036 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-ovnkube-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830979 4036 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-node-log\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.830991 4036 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e058de7-df45-4cf4-a90a-149f90f891d6-env-overrides\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.832030 master-0 kubenswrapper[4036]: I0319 11:53:39.831001 4036 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-log-socket\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.835968 master-0 kubenswrapper[4036]: I0319 11:53:39.835899 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e058de7-df45-4cf4-a90a-149f90f891d6-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:53:39.836586 master-0 kubenswrapper[4036]: I0319 11:53:39.836530 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e058de7-df45-4cf4-a90a-149f90f891d6-kube-api-access-hm6xn" (OuterVolumeSpecName: "kube-api-access-hm6xn") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "kube-api-access-hm6xn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:53:39.837675 master-0 kubenswrapper[4036]: I0319 11:53:39.837625 4036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6e058de7-df45-4cf4-a90a-149f90f891d6" (UID: "6e058de7-df45-4cf4-a90a-149f90f891d6"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:53:39.931389 master-0 kubenswrapper[4036]: I0319 11:53:39.931285 4036 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-systemd\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.931389 master-0 kubenswrapper[4036]: I0319 11:53:39.931355 4036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm6xn\" (UniqueName: \"kubernetes.io/projected/6e058de7-df45-4cf4-a90a-149f90f891d6-kube-api-access-hm6xn\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.931389 master-0 kubenswrapper[4036]: I0319 11:53:39.931372 4036 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e058de7-df45-4cf4-a90a-149f90f891d6-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:39.931389 master-0 kubenswrapper[4036]: I0319 11:53:39.931382 4036 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e058de7-df45-4cf4-a90a-149f90f891d6-run-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 19 11:53:40.098433 master-0 kubenswrapper[4036]: I0319 11:53:40.098133 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/ovnkube-controller/0.log"
Mar 19 11:53:40.114022 master-0 kubenswrapper[4036]: I0319 11:53:40.113912 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/kube-rbac-proxy-ovn-metrics/0.log"
Mar 19 11:53:40.114622 master-0 kubenswrapper[4036]: I0319 11:53:40.114591 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/kube-rbac-proxy-node/0.log"
Mar 19 11:53:40.115109 master-0 kubenswrapper[4036]: I0319 11:53:40.115077 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/ovn-acl-logging/0.log"
Mar 19 11:53:40.115772 master-0 kubenswrapper[4036]: I0319 11:53:40.115741 4036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sr4bh_6e058de7-df45-4cf4-a90a-149f90f891d6/ovn-controller/0.log"
Mar 19 11:53:40.116212 master-0 kubenswrapper[4036]: I0319 11:53:40.116157 4036 generic.go:334] "Generic (PLEG): container finished" podID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerID="8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637" exitCode=1
Mar 19 11:53:40.116212 master-0 kubenswrapper[4036]: I0319 11:53:40.116202 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerDied","Data":"8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637"}
Mar 19 11:53:40.116359 master-0 kubenswrapper[4036]: I0319 11:53:40.116234 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh" event={"ID":"6e058de7-df45-4cf4-a90a-149f90f891d6","Type":"ContainerDied","Data":"87a26785d0f440b4da4ede0f74f0ae1d183f1c3c8fb96a9d3d4bbedaebbb47ec"}
Mar 19 11:53:40.116359 master-0 kubenswrapper[4036]: I0319 11:53:40.116256 4036 scope.go:117] "RemoveContainer" containerID="8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637"
Mar 19 11:53:40.116359 master-0 kubenswrapper[4036]: I0319 11:53:40.116329 4036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sr4bh"
Mar 19 11:53:40.131913 master-0 kubenswrapper[4036]: I0319 11:53:40.131873 4036 scope.go:117] "RemoveContainer" containerID="d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a"
Mar 19 11:53:40.144742 master-0 kubenswrapper[4036]: I0319 11:53:40.144681 4036 scope.go:117] "RemoveContainer" containerID="fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1"
Mar 19 11:53:40.155685 master-0 kubenswrapper[4036]: I0319 11:53:40.155657 4036 scope.go:117] "RemoveContainer" containerID="c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844"
Mar 19 11:53:40.165927 master-0 kubenswrapper[4036]: I0319 11:53:40.165794 4036 scope.go:117] "RemoveContainer" containerID="1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f"
Mar 19 11:53:40.175732 master-0 kubenswrapper[4036]: I0319 11:53:40.175694 4036 scope.go:117] "RemoveContainer" containerID="c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541"
Mar 19 11:53:40.185083 master-0 kubenswrapper[4036]: I0319 11:53:40.184829 4036 scope.go:117] "RemoveContainer" containerID="c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4"
Mar 19 11:53:40.194733 master-0 kubenswrapper[4036]: I0319 11:53:40.194680 4036 scope.go:117] "RemoveContainer" containerID="b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562"
Mar 19 11:53:40.204283 master-0 kubenswrapper[4036]: I0319 11:53:40.204241 4036 scope.go:117] "RemoveContainer" containerID="153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0"
Mar 19 11:53:40.434766 master-0 kubenswrapper[4036]: I0319 11:53:40.434627 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:40.435041 master-0 kubenswrapper[4036]: E0319 11:53:40.434914 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 11:53:40.435041 master-0 kubenswrapper[4036]: E0319 11:53:40.434958 4036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 11:53:40.435041 master-0 kubenswrapper[4036]: E0319 11:53:40.434983 4036 projected.go:194] Error preparing data for projected volume kube-api-access-q9qkq for pod openshift-network-diagnostics/network-check-target-tz4qg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 11:53:40.435146 master-0 kubenswrapper[4036]: E0319 11:53:40.435091 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq podName:65b629a3-ea10-4517-b762-b18b22fab2a9 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:12.435058453 +0000 UTC m=+137.987479068 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-q9qkq" (UniqueName: "kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq") pod "network-check-target-tz4qg" (UID: "65b629a3-ea10-4517-b762-b18b22fab2a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 11:53:40.767736 master-0 kubenswrapper[4036]: I0319 11:53:40.767086 4036 scope.go:117] "RemoveContainer" containerID="8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637"
Mar 19 11:53:40.768553 master-0 kubenswrapper[4036]: E0319 11:53:40.768011 4036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637\": container with ID starting with 8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637 not found: ID does not exist" containerID="8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637"
Mar 19 11:53:40.768553 master-0 kubenswrapper[4036]: I0319 11:53:40.768095 4036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637"} err="failed to get container status \"8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637\": rpc error: code = NotFound desc = could not find container \"8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637\": container with ID starting with 8da3824ef86b1bd5bd8853633f296f89e412edd13b080a82dc704a802dcb6637 not found: ID does not exist"
Mar 19 11:53:40.768553 master-0 kubenswrapper[4036]: I0319 11:53:40.768186 4036 scope.go:117] "RemoveContainer" containerID="d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a"
Mar 19 11:53:40.768941 master-0 kubenswrapper[4036]: E0319 11:53:40.768871 4036 log.go:32] "ContainerStatus from runtime service
failed" err="rpc error: code = NotFound desc = could not find container \"d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a\": container with ID starting with d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a not found: ID does not exist" containerID="d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a" Mar 19 11:53:40.768941 master-0 kubenswrapper[4036]: I0319 11:53:40.768925 4036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a"} err="failed to get container status \"d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a\": rpc error: code = NotFound desc = could not find container \"d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a\": container with ID starting with d4f8be2a0cb722a2c7dd408d18457832031f5aeb25f90754bff0b7d9b700ae2a not found: ID does not exist" Mar 19 11:53:40.769139 master-0 kubenswrapper[4036]: I0319 11:53:40.768965 4036 scope.go:117] "RemoveContainer" containerID="fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1" Mar 19 11:53:40.769422 master-0 kubenswrapper[4036]: E0319 11:53:40.769353 4036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1\": container with ID starting with fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1 not found: ID does not exist" containerID="fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1" Mar 19 11:53:40.769535 master-0 kubenswrapper[4036]: I0319 11:53:40.769409 4036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1"} err="failed to get container status \"fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1\": rpc error: code = NotFound 
desc = could not find container \"fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1\": container with ID starting with fb38914a5ecb8977b49fc001984fb4d37b5afd6e96e2fa21315865892b7ddcb1 not found: ID does not exist" Mar 19 11:53:40.769535 master-0 kubenswrapper[4036]: I0319 11:53:40.769456 4036 scope.go:117] "RemoveContainer" containerID="c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844" Mar 19 11:53:40.770001 master-0 kubenswrapper[4036]: E0319 11:53:40.769952 4036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844\": container with ID starting with c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844 not found: ID does not exist" containerID="c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844" Mar 19 11:53:40.770001 master-0 kubenswrapper[4036]: I0319 11:53:40.769978 4036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844"} err="failed to get container status \"c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844\": rpc error: code = NotFound desc = could not find container \"c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844\": container with ID starting with c0b0cde864a604b7303ca95c07aaff2e00207d089e417579c66630ed36bbb844 not found: ID does not exist" Mar 19 11:53:40.770001 master-0 kubenswrapper[4036]: I0319 11:53:40.769993 4036 scope.go:117] "RemoveContainer" containerID="1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f" Mar 19 11:53:40.770330 master-0 kubenswrapper[4036]: E0319 11:53:40.770280 4036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f\": container with ID starting with 
1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f not found: ID does not exist" containerID="1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f" Mar 19 11:53:40.770430 master-0 kubenswrapper[4036]: I0319 11:53:40.770333 4036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f"} err="failed to get container status \"1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f\": rpc error: code = NotFound desc = could not find container \"1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f\": container with ID starting with 1b9e5c22d9999987a7c1082b49b0fd97950086ac228e64588e35cec6db92ad6f not found: ID does not exist" Mar 19 11:53:40.770430 master-0 kubenswrapper[4036]: I0319 11:53:40.770370 4036 scope.go:117] "RemoveContainer" containerID="c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541" Mar 19 11:53:40.770905 master-0 kubenswrapper[4036]: E0319 11:53:40.770858 4036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541\": container with ID starting with c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541 not found: ID does not exist" containerID="c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541" Mar 19 11:53:40.770905 master-0 kubenswrapper[4036]: I0319 11:53:40.770882 4036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541"} err="failed to get container status \"c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541\": rpc error: code = NotFound desc = could not find container \"c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541\": container with ID starting with 
c3d9e23be174a819a2c0bca82048665a036432023f18c5ea49116e129dcfa541 not found: ID does not exist" Mar 19 11:53:40.770905 master-0 kubenswrapper[4036]: I0319 11:53:40.770896 4036 scope.go:117] "RemoveContainer" containerID="c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4" Mar 19 11:53:40.771256 master-0 kubenswrapper[4036]: E0319 11:53:40.771188 4036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4\": container with ID starting with c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4 not found: ID does not exist" containerID="c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4" Mar 19 11:53:40.771367 master-0 kubenswrapper[4036]: I0319 11:53:40.771247 4036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4"} err="failed to get container status \"c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4\": rpc error: code = NotFound desc = could not find container \"c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4\": container with ID starting with c860e7d1cafe1159e27f1e049fcff33146e855faecb6c98b90815290286f4da4 not found: ID does not exist" Mar 19 11:53:40.771367 master-0 kubenswrapper[4036]: I0319 11:53:40.771287 4036 scope.go:117] "RemoveContainer" containerID="b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562" Mar 19 11:53:40.771586 master-0 kubenswrapper[4036]: E0319 11:53:40.771556 4036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562\": container with ID starting with b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562 not found: ID does not exist" 
containerID="b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562" Mar 19 11:53:40.771586 master-0 kubenswrapper[4036]: I0319 11:53:40.771578 4036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562"} err="failed to get container status \"b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562\": rpc error: code = NotFound desc = could not find container \"b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562\": container with ID starting with b16b23f73b5fe6c5ae200ba1f4e53a67d331f4489bd6b97c007d7bf9ec8a9562 not found: ID does not exist" Mar 19 11:53:40.771791 master-0 kubenswrapper[4036]: I0319 11:53:40.771595 4036 scope.go:117] "RemoveContainer" containerID="153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0" Mar 19 11:53:40.771875 master-0 kubenswrapper[4036]: E0319 11:53:40.771804 4036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0\": container with ID starting with 153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0 not found: ID does not exist" containerID="153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0" Mar 19 11:53:40.771875 master-0 kubenswrapper[4036]: I0319 11:53:40.771825 4036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0"} err="failed to get container status \"153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0\": rpc error: code = NotFound desc = could not find container \"153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0\": container with ID starting with 153900e0b00673984fd4e970d0b2c86e5007022e9b7ea674706e0ac907e763f0 not found: ID does not exist" Mar 19 11:53:41.312157 master-0 
kubenswrapper[4036]: I0319 11:53:41.312086 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:53:41.312408 master-0 kubenswrapper[4036]: I0319 11:53:41.312254 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:53:41.312492 master-0 kubenswrapper[4036]: E0319 11:53:41.312400 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" Mar 19 11:53:41.312575 master-0 kubenswrapper[4036]: E0319 11:53:41.312526 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: I0319 11:53:42.373427 4036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sr4bh"] Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: I0319 11:53:42.373506 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hm6q6"] Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: E0319 11:53:42.373674 4036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="kube-rbac-proxy-node" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: I0319 11:53:42.373692 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="kube-rbac-proxy-node" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: E0319 11:53:42.373704 4036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: I0319 11:53:42.373715 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: E0319 11:53:42.373725 4036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="nbdb" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: I0319 11:53:42.373734 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="nbdb" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: E0319 11:53:42.373747 4036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="sbdb" Mar 19 11:53:42.373801 master-0 
kubenswrapper[4036]: I0319 11:53:42.373756 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="sbdb" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: E0319 11:53:42.373769 4036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="kubecfg-setup" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: I0319 11:53:42.373778 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="kubecfg-setup" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: E0319 11:53:42.373787 4036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovn-controller" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: I0319 11:53:42.373798 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovn-controller" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: E0319 11:53:42.373808 4036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovnkube-controller" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: I0319 11:53:42.373817 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovnkube-controller" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: E0319 11:53:42.373826 4036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="northd" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: I0319 11:53:42.373835 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="northd" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: E0319 11:53:42.373847 4036 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovn-acl-logging" Mar 19 11:53:42.373801 master-0 kubenswrapper[4036]: I0319 11:53:42.373857 4036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovn-acl-logging" Mar 19 11:53:42.376329 master-0 kubenswrapper[4036]: I0319 11:53:42.376117 4036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="nbdb" Mar 19 11:53:42.376329 master-0 kubenswrapper[4036]: I0319 11:53:42.376145 4036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="sbdb" Mar 19 11:53:42.376329 master-0 kubenswrapper[4036]: I0319 11:53:42.376154 4036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovn-controller" Mar 19 11:53:42.376329 master-0 kubenswrapper[4036]: I0319 11:53:42.376164 4036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="northd" Mar 19 11:53:42.376329 master-0 kubenswrapper[4036]: I0319 11:53:42.376181 4036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 11:53:42.376329 master-0 kubenswrapper[4036]: I0319 11:53:42.376189 4036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="kube-rbac-proxy-node" Mar 19 11:53:42.376329 master-0 kubenswrapper[4036]: I0319 11:53:42.376198 4036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovn-acl-logging" Mar 19 11:53:42.376329 master-0 kubenswrapper[4036]: I0319 11:53:42.376206 4036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" containerName="ovnkube-controller" Mar 19 
11:53:42.384362 master-0 kubenswrapper[4036]: I0319 11:53:42.384068 4036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sr4bh"] Mar 19 11:53:42.384362 master-0 kubenswrapper[4036]: I0319 11:53:42.384262 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.387706 master-0 kubenswrapper[4036]: I0319 11:53:42.387626 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 11:53:42.388870 master-0 kubenswrapper[4036]: I0319 11:53:42.388835 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 11:53:42.451428 master-0 kubenswrapper[4036]: I0319 11:53:42.451334 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-env-overrides\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451428 master-0 kubenswrapper[4036]: I0319 11:53:42.451399 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-slash\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451428 master-0 kubenswrapper[4036]: I0319 11:53:42.451434 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-kubelet\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 
11:53:42.451428 master-0 kubenswrapper[4036]: I0319 11:53:42.451452 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bptbc\" (UniqueName: \"kubernetes.io/projected/e617e515-a899-41e8-9961-df68e49da4d3-kube-api-access-bptbc\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451480 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-script-lib\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451524 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-log-socket\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451544 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-config\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451569 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e617e515-a899-41e8-9961-df68e49da4d3-ovn-node-metrics-cert\") pod \"ovnkube-node-hm6q6\" (UID: 
\"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451618 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-etc-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451656 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451675 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-systemd-units\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451695 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-var-lib-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451719 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-bin\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451786 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-netd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.451885 master-0 kubenswrapper[4036]: I0319 11:53:42.451864 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.452175 master-0 kubenswrapper[4036]: I0319 11:53:42.451900 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.452175 master-0 kubenswrapper[4036]: I0319 11:53:42.451940 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-netns\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:53:42.452175 master-0 kubenswrapper[4036]: I0319 11:53:42.451958 4036 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-systemd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.452175 master-0 kubenswrapper[4036]: I0319 11:53:42.451975 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-node-log\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.452175 master-0 kubenswrapper[4036]: I0319 11:53:42.451992 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-ovn\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.552447 master-0 kubenswrapper[4036]: I0319 11:53:42.552362 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.552447 master-0 kubenswrapper[4036]: I0319 11:53:42.552435 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.552872 master-0 kubenswrapper[4036]: I0319 11:53:42.552586 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.552872 master-0 kubenswrapper[4036]: I0319 11:53:42.552680 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-netns\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.552872 master-0 kubenswrapper[4036]: I0319 11:53:42.552712 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.552872 master-0 kubenswrapper[4036]: I0319 11:53:42.552728 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-systemd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.552872 master-0 kubenswrapper[4036]: I0319 11:53:42.552798 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-netns\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.552872 master-0 kubenswrapper[4036]: I0319 11:53:42.552859 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-node-log\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553124 master-0 kubenswrapper[4036]: I0319 11:53:42.552903 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-ovn\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553124 master-0 kubenswrapper[4036]: I0319 11:53:42.552931 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-node-log\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553124 master-0 kubenswrapper[4036]: I0319 11:53:42.552965 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-ovn\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553124 master-0 kubenswrapper[4036]: I0319 11:53:42.553001 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-env-overrides\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553124 master-0 kubenswrapper[4036]: I0319 11:53:42.553029 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-kubelet\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553124 master-0 kubenswrapper[4036]: I0319 11:53:42.553058 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-slash\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553334 master-0 kubenswrapper[4036]: I0319 11:53:42.553071 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-systemd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553334 master-0 kubenswrapper[4036]: I0319 11:53:42.553266 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bptbc\" (UniqueName: \"kubernetes.io/projected/e617e515-a899-41e8-9961-df68e49da4d3-kube-api-access-bptbc\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553334 master-0 kubenswrapper[4036]: I0319 11:53:42.553309 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-script-lib\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553447 master-0 kubenswrapper[4036]: I0319 11:53:42.553351 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-log-socket\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553447 master-0 kubenswrapper[4036]: I0319 11:53:42.553379 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-config\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553447 master-0 kubenswrapper[4036]: I0319 11:53:42.553410 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e617e515-a899-41e8-9961-df68e49da4d3-ovn-node-metrics-cert\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553447 master-0 kubenswrapper[4036]: I0319 11:53:42.553443 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-etc-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553596 master-0 kubenswrapper[4036]: I0319 11:53:42.553476 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553596 master-0 kubenswrapper[4036]: I0319 11:53:42.553510 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-systemd-units\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553596 master-0 kubenswrapper[4036]: I0319 11:53:42.553536 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-var-lib-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553596 master-0 kubenswrapper[4036]: I0319 11:53:42.553571 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-bin\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553596 master-0 kubenswrapper[4036]: I0319 11:53:42.553597 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-netd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553816 master-0 kubenswrapper[4036]: I0319 11:53:42.553682 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-netd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553816 master-0 kubenswrapper[4036]: I0319 11:53:42.553732 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-etc-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553816 master-0 kubenswrapper[4036]: I0319 11:53:42.553776 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553816 master-0 kubenswrapper[4036]: I0319 11:53:42.553814 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-systemd-units\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553962 master-0 kubenswrapper[4036]: I0319 11:53:42.553847 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-var-lib-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553962 master-0 kubenswrapper[4036]: I0319 11:53:42.553881 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-bin\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553962 master-0 kubenswrapper[4036]: I0319 11:53:42.553097 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-slash\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.553962 master-0 kubenswrapper[4036]: I0319 11:53:42.553119 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-kubelet\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.554106 master-0 kubenswrapper[4036]: I0319 11:53:42.553978 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-env-overrides\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.554106 master-0 kubenswrapper[4036]: I0319 11:53:42.554035 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-log-socket\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.554819 master-0 kubenswrapper[4036]: I0319 11:53:42.554781 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-script-lib\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.554819 master-0 kubenswrapper[4036]: I0319 11:53:42.554800 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-config\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.557022 master-0 kubenswrapper[4036]: I0319 11:53:42.556970 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e617e515-a899-41e8-9961-df68e49da4d3-ovn-node-metrics-cert\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.573329 master-0 kubenswrapper[4036]: I0319 11:53:42.573292 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bptbc\" (UniqueName: \"kubernetes.io/projected/e617e515-a899-41e8-9961-df68e49da4d3-kube-api-access-bptbc\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.712150 master-0 kubenswrapper[4036]: I0319 11:53:42.712073 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:42.726511 master-0 kubenswrapper[4036]: W0319 11:53:42.726472 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode617e515_a899_41e8_9961_df68e49da4d3.slice/crio-33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae WatchSource:0}: Error finding container 33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae: Status 404 returned error can't find the container with id 33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae
Mar 19 11:53:43.127674 master-0 kubenswrapper[4036]: I0319 11:53:43.127481 4036 generic.go:334] "Generic (PLEG): container finished" podID="e617e515-a899-41e8-9961-df68e49da4d3" containerID="ffac3bdf0b92a0122fbb8d6a02eb01304705a3c19a78b9c431bf0840fcd9389f" exitCode=0
Mar 19 11:53:43.127674 master-0 kubenswrapper[4036]: I0319 11:53:43.127541 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerDied","Data":"ffac3bdf0b92a0122fbb8d6a02eb01304705a3c19a78b9c431bf0840fcd9389f"}
Mar 19 11:53:43.127674 master-0 kubenswrapper[4036]: I0319 11:53:43.127586 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae"}
Mar 19 11:53:43.312072 master-0 kubenswrapper[4036]: I0319 11:53:43.311897 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:43.312163 master-0 kubenswrapper[4036]: I0319 11:53:43.311988 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:53:43.312211 master-0 kubenswrapper[4036]: E0319 11:53:43.312167 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9"
Mar 19 11:53:43.312317 master-0 kubenswrapper[4036]: E0319 11:53:43.312273 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989"
Mar 19 11:53:43.315776 master-0 kubenswrapper[4036]: I0319 11:53:43.315742 4036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e058de7-df45-4cf4-a90a-149f90f891d6" path="/var/lib/kubelet/pods/6e058de7-df45-4cf4-a90a-149f90f891d6/volumes"
Mar 19 11:53:44.136757 master-0 kubenswrapper[4036]: I0319 11:53:44.136682 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"6eb559eb0a3ed0b8b690c290eaee82218aa692c28e9cd60b46d871d2d5f31da3"}
Mar 19 11:53:44.136757 master-0 kubenswrapper[4036]: I0319 11:53:44.136745 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"894668843cc379ef3af14af02bd8810c94cf8d250ae2116e3028573643bd5152"}
Mar 19 11:53:44.136757 master-0 kubenswrapper[4036]: I0319 11:53:44.136757 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"fbffa3403c0c76ed26f94a9db8da7c5c7eba0d12a27cdac70fa8e9b94ea5d1ad"}
Mar 19 11:53:44.136757 master-0 kubenswrapper[4036]: I0319 11:53:44.136766 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"77d9ff79f7e20893fe69fd62607ea644dde99bd686f4f6e583b7ceec2d7aec30"}
Mar 19 11:53:44.136757 master-0 kubenswrapper[4036]: I0319 11:53:44.136777 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"2ed0f3afa93a58287d466e23ce923764ef467afba9f298045ac789afa5a06a3a"}
Mar 19 11:53:44.136757 master-0 kubenswrapper[4036]: I0319 11:53:44.136787 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"78ffa2b3d97ad72d0c1fb4664edd94ccca3e077568d34392a2813e8d87310b58"}
Mar 19 11:53:45.313057 master-0 kubenswrapper[4036]: I0319 11:53:45.312947 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:53:45.313057 master-0 kubenswrapper[4036]: I0319 11:53:45.313047 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:45.314198 master-0 kubenswrapper[4036]: E0319 11:53:45.313241 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989"
Mar 19 11:53:45.314198 master-0 kubenswrapper[4036]: E0319 11:53:45.313502 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9"
Mar 19 11:53:46.150349 master-0 kubenswrapper[4036]: I0319 11:53:46.150303 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"70ee74642d29e1b42d7e6fc003b0e3b2856a367fa6f34938ca3ae34c1becb9ec"}
Mar 19 11:53:47.312340 master-0 kubenswrapper[4036]: I0319 11:53:47.312268 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:47.313178 master-0 kubenswrapper[4036]: I0319 11:53:47.312368 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:53:47.313178 master-0 kubenswrapper[4036]: E0319 11:53:47.312483 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9"
Mar 19 11:53:47.313178 master-0 kubenswrapper[4036]: E0319 11:53:47.312599 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989"
Mar 19 11:53:48.162237 master-0 kubenswrapper[4036]: I0319 11:53:48.160643 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"563b304ab3f329515fc5902dab431b332849f6c45ef58e2a8305deced768a9eb"}
Mar 19 11:53:48.162237 master-0 kubenswrapper[4036]: I0319 11:53:48.161208 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:48.162237 master-0 kubenswrapper[4036]: I0319 11:53:48.161229 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:48.190673 master-0 kubenswrapper[4036]: I0319 11:53:48.186973 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:48.380452 master-0 kubenswrapper[4036]: I0319 11:53:48.378752 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" podStartSLOduration=6.378726217 podStartE2EDuration="6.378726217s" podCreationTimestamp="2026-03-19 11:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:48.356565178 +0000 UTC m=+113.908985773" watchObservedRunningTime="2026-03-19 11:53:48.378726217 +0000 UTC m=+113.931146792"
Mar 19 11:53:49.164052 master-0 kubenswrapper[4036]: I0319 11:53:49.163956 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:49.192613 master-0 kubenswrapper[4036]: I0319 11:53:49.192536 4036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:53:49.312102 master-0 kubenswrapper[4036]: I0319 11:53:49.312051 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:53:49.312208 master-0 kubenswrapper[4036]: I0319 11:53:49.312072 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:49.312310 master-0 kubenswrapper[4036]: E0319 11:53:49.312248 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989"
Mar 19 11:53:49.312346 master-0 kubenswrapper[4036]: E0319 11:53:49.312315 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9"
Mar 19 11:53:51.312913 master-0 kubenswrapper[4036]: I0319 11:53:51.312773 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:51.312913 master-0 kubenswrapper[4036]: I0319 11:53:51.312846 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:53:51.313913 master-0 kubenswrapper[4036]: E0319 11:53:51.312977 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tz4qg" podUID="65b629a3-ea10-4517-b762-b18b22fab2a9"
Mar 19 11:53:51.313913 master-0 kubenswrapper[4036]: E0319 11:53:51.313135 4036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nkd77" podUID="3e1b71a7-c9fd-4f03-8d0c-f092dac80989"
Mar 19 11:53:52.618775 master-0 kubenswrapper[4036]: I0319 11:53:52.618625 4036 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady"
Mar 19 11:53:52.619559 master-0 kubenswrapper[4036]: I0319 11:53:52.619539 4036 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Mar 19 11:53:52.661775 master-0 kubenswrapper[4036]: I0319 11:53:52.660448 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"]
Mar 19 11:53:52.661775 master-0 kubenswrapper[4036]: I0319 11:53:52.661206 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:53:52.667307 master-0 kubenswrapper[4036]: I0319 11:53:52.667183 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 11:53:52.667783 master-0 kubenswrapper[4036]: I0319 11:53:52.667719 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"]
Mar 19 11:53:52.668120 master-0 kubenswrapper[4036]: I0319 11:53:52.668070 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 11:53:52.668120 master-0 kubenswrapper[4036]: I0319 11:53:52.668092 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"]
Mar 19 11:53:52.668337 master-0 kubenswrapper[4036]: I0319 11:53:52.668190 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:53:52.668566 master-0 kubenswrapper[4036]: I0319 11:53:52.668478 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:53:52.669046 master-0 kubenswrapper[4036]: I0319 11:53:52.669007 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 11:53:52.669046 master-0 kubenswrapper[4036]: I0319 11:53:52.669029 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 11:53:52.669395 master-0 kubenswrapper[4036]: I0319 11:53:52.669358 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 11:53:52.672984 master-0 kubenswrapper[4036]: I0319 11:53:52.672743 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"]
Mar 19 11:53:52.673249 master-0 kubenswrapper[4036]: I0319 11:53:52.673187 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:53:52.678783 master-0 kubenswrapper[4036]: I0319 11:53:52.678258 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"]
Mar 19 11:53:52.679683 master-0 kubenswrapper[4036]: I0319 11:53:52.679601 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"]
Mar 19 11:53:52.680417 master-0 kubenswrapper[4036]: I0319 11:53:52.680375 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"]
Mar 19 11:53:52.681203 master-0 kubenswrapper[4036]: I0319 11:53:52.681160 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:53:52.682250 master-0 kubenswrapper[4036]: I0319 11:53:52.682214 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:53:52.686694 master-0 kubenswrapper[4036]: I0319 11:53:52.683331 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.692334 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.692815 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.693265 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.693520 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.693818 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.694146 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.694558 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.694857 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.695085 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.695321 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.696413 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.696608 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.697213 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.697424 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.697693 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.697912 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.698123 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.698391 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.693823 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.699461 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"]
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.700081 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"]
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.700113 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.700595 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"]
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.701133 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.701981 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.703358 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.703827 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"]
Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.704199 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.704314 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 19 11:53:52.718600 master-0 kubenswrapper[4036]: I0319 11:53:52.706234 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"] Mar 19 11:53:52.721324 master-0 kubenswrapper[4036]: I0319 11:53:52.708113 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 11:53:52.728841 master-0 kubenswrapper[4036]: I0319 11:53:52.726736 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"] Mar 19 11:53:52.734035 master-0 kubenswrapper[4036]: I0319 11:53:52.733949 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 11:53:52.734358 master-0 kubenswrapper[4036]: I0319 11:53:52.734285 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 11:53:52.735191 master-0 kubenswrapper[4036]: I0319 11:53:52.735153 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 11:53:52.735702 master-0 kubenswrapper[4036]: I0319 11:53:52.735608 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 19 11:53:52.736090 master-0 kubenswrapper[4036]: I0319 11:53:52.735279 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:53:52.736467 master-0 kubenswrapper[4036]: I0319 11:53:52.735172 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv"] Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.736682 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.736811 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"] Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.736855 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.737135 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.737185 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.737250 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.737359 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.737489 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.737499 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.738601 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.738617 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.738784 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.738802 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.739087 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.739353 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.739459 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.740502 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"] Mar 19 
11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.740827 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.741271 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.741345 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"] Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.742093 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:53:52.743023 master-0 kubenswrapper[4036]: I0319 11:53:52.742899 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 11:53:52.745318 master-0 kubenswrapper[4036]: I0319 11:53:52.743354 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 11:53:52.745318 master-0 kubenswrapper[4036]: I0319 11:53:52.743548 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"] Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.747605 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.747942 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.748115 
4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.748254 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.748728 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.748854 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.748947 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.749047 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.749129 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.749222 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.749324 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.749433 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.749578 
4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.749728 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 11:53:52.750838 master-0 kubenswrapper[4036]: I0319 11:53:52.750355 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 11:53:52.752103 master-0 kubenswrapper[4036]: I0319 11:53:52.751282 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 11:53:52.752320 master-0 kubenswrapper[4036]: I0319 11:53:52.752270 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 11:53:52.761688 master-0 kubenswrapper[4036]: I0319 11:53:52.755044 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 11:53:52.761688 master-0 kubenswrapper[4036]: I0319 11:53:52.755535 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj"] Mar 19 11:53:52.761688 master-0 kubenswrapper[4036]: I0319 11:53:52.755989 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh"] Mar 19 11:53:52.761688 master-0 kubenswrapper[4036]: I0319 11:53:52.756338 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"] Mar 19 11:53:52.761688 master-0 kubenswrapper[4036]: I0319 11:53:52.756975 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:53:52.761688 master-0 kubenswrapper[4036]: I0319 11:53:52.757015 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:53:52.761688 master-0 kubenswrapper[4036]: I0319 11:53:52.757200 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:53:52.761688 master-0 kubenswrapper[4036]: I0319 11:53:52.757592 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:53:52.766723 master-0 kubenswrapper[4036]: I0319 11:53:52.763351 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 11:53:52.766723 master-0 kubenswrapper[4036]: I0319 11:53:52.764240 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 11:53:52.766723 master-0 kubenswrapper[4036]: I0319 11:53:52.764300 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 11:53:52.766723 master-0 kubenswrapper[4036]: I0319 11:53:52.764253 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 11:53:52.766723 master-0 kubenswrapper[4036]: I0319 11:53:52.764582 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 11:53:52.766723 master-0 kubenswrapper[4036]: I0319 11:53:52.764881 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 11:53:52.766723 master-0 
kubenswrapper[4036]: I0319 11:53:52.765318 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw"] Mar 19 11:53:52.766723 master-0 kubenswrapper[4036]: I0319 11:53:52.765945 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:53:52.766723 master-0 kubenswrapper[4036]: I0319 11:53:52.766259 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp"] Mar 19 11:53:52.770748 master-0 kubenswrapper[4036]: I0319 11:53:52.770701 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 11:53:52.775731 master-0 kubenswrapper[4036]: I0319 11:53:52.771721 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 11:53:52.775731 master-0 kubenswrapper[4036]: I0319 11:53:52.772294 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 11:53:52.775731 master-0 kubenswrapper[4036]: I0319 11:53:52.773402 4036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp" Mar 19 11:53:52.775731 master-0 kubenswrapper[4036]: I0319 11:53:52.774026 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 11:53:52.777253 master-0 kubenswrapper[4036]: I0319 11:53:52.777203 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 11:53:52.793228 master-0 kubenswrapper[4036]: I0319 11:53:52.792359 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 11:53:52.793228 master-0 kubenswrapper[4036]: I0319 11:53:52.792742 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 11:53:52.793228 master-0 kubenswrapper[4036]: I0319 11:53:52.792853 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 11:53:52.793228 master-0 kubenswrapper[4036]: I0319 11:53:52.792958 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 11:53:52.793535 master-0 kubenswrapper[4036]: I0319 11:53:52.793288 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 11:53:52.797709 master-0 kubenswrapper[4036]: I0319 11:53:52.794900 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 19 11:53:52.797709 master-0 kubenswrapper[4036]: I0319 11:53:52.795137 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"] Mar 19 11:53:52.797709 master-0 kubenswrapper[4036]: I0319 
11:53:52.795180 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"] Mar 19 11:53:52.797709 master-0 kubenswrapper[4036]: I0319 11:53:52.795192 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"] Mar 19 11:53:52.797709 master-0 kubenswrapper[4036]: I0319 11:53:52.795868 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 19 11:53:52.798070 master-0 kubenswrapper[4036]: I0319 11:53:52.797748 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"] Mar 19 11:53:52.798070 master-0 kubenswrapper[4036]: I0319 11:53:52.797784 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"] Mar 19 11:53:52.798070 master-0 kubenswrapper[4036]: I0319 11:53:52.797795 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"] Mar 19 11:53:52.799077 master-0 kubenswrapper[4036]: I0319 11:53:52.798987 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh"] Mar 19 11:53:52.799946 master-0 kubenswrapper[4036]: I0319 11:53:52.799887 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"] Mar 19 11:53:52.800401 master-0 kubenswrapper[4036]: I0319 11:53:52.800363 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"] Mar 19 11:53:52.801102 master-0 kubenswrapper[4036]: I0319 11:53:52.801060 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp"] Mar 19 11:53:52.803706 master-0 kubenswrapper[4036]: I0319 11:53:52.802513 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"] Mar 19 11:53:52.803706 master-0 kubenswrapper[4036]: I0319 11:53:52.803581 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"] Mar 19 11:53:52.804756 master-0 kubenswrapper[4036]: I0319 11:53:52.804717 4036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-j6n2j"] Mar 19 11:53:52.805290 master-0 kubenswrapper[4036]: I0319 11:53:52.805257 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:53:52.805795 master-0 kubenswrapper[4036]: I0319 11:53:52.805745 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"] Mar 19 11:53:52.807110 master-0 kubenswrapper[4036]: I0319 11:53:52.806984 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"] Mar 19 11:53:52.807568 master-0 kubenswrapper[4036]: I0319 11:53:52.807548 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 11:53:52.807625 master-0 kubenswrapper[4036]: I0319 11:53:52.807564 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj"] Mar 19 11:53:52.808820 master-0 kubenswrapper[4036]: I0319 11:53:52.808783 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"] Mar 19 11:53:52.809901 master-0 kubenswrapper[4036]: 
I0319 11:53:52.809863 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"] Mar 19 11:53:52.811802 master-0 kubenswrapper[4036]: I0319 11:53:52.811755 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"] Mar 19 11:53:52.813244 master-0 kubenswrapper[4036]: I0319 11:53:52.813193 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw"] Mar 19 11:53:52.814338 master-0 kubenswrapper[4036]: I0319 11:53:52.814260 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"] Mar 19 11:53:52.815419 master-0 kubenswrapper[4036]: I0319 11:53:52.815372 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"] Mar 19 11:53:52.816127 master-0 kubenswrapper[4036]: I0319 11:53:52.816070 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv"] Mar 19 11:53:52.817533 master-0 kubenswrapper[4036]: I0319 11:53:52.817483 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"] Mar 19 11:53:52.857684 master-0 kubenswrapper[4036]: I0319 11:53:52.857616 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-config\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.857860 master-0 kubenswrapper[4036]: I0319 11:53:52.857833 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:53:52.858086 master-0 kubenswrapper[4036]: I0319 11:53:52.858067 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwp24\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-kube-api-access-bwp24\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:53:52.858187 master-0 kubenswrapper[4036]: I0319 11:53:52.858173 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdmj8\" (UniqueName: \"kubernetes.io/projected/6e8c4676-05af-4478-a44b-1bdcbefdea0f-kube-api-access-pdmj8\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 11:53:52.858382 master-0 kubenswrapper[4036]: I0319 11:53:52.858365 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-images\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:52.858480 master-0 kubenswrapper[4036]: I0319 11:53:52.858466 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6d017409-a362-4e58-87af-d02b3834fdd4-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv"
Mar 19 11:53:52.858566 master-0 kubenswrapper[4036]: I0319 11:53:52.858545 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:53:52.858656 master-0 kubenswrapper[4036]: I0319 11:53:52.858625 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-serving-cert\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:53:52.859028 master-0 kubenswrapper[4036]: I0319 11:53:52.858741 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:53:52.859202 master-0 kubenswrapper[4036]: I0319 11:53:52.859185 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac946049-4b93-4d8c-bfa6-eacf83160ed1-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"
Mar 19 11:53:52.859379 master-0 kubenswrapper[4036]: I0319 11:53:52.859363 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:53:52.859463 master-0 kubenswrapper[4036]: I0319 11:53:52.859449 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4kn\" (UniqueName: \"kubernetes.io/projected/78d3b2df-3d81-413e-b039-80dc20111eae-kube-api-access-mm4kn\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:53:52.859563 master-0 kubenswrapper[4036]: I0319 11:53:52.859545 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9vzl\" (UniqueName: \"kubernetes.io/projected/84e5577e-df8b-46a2-aff7-0bf90b5010d9-kube-api-access-k9vzl\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:53:52.859691 master-0 kubenswrapper[4036]: I0319 11:53:52.859670 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705c8f3-89d4-42f2-ad14-e591c5380b9e-serving-cert\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:53:52.859786 master-0 kubenswrapper[4036]: I0319 11:53:52.859773 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:53:52.859879 master-0 kubenswrapper[4036]: I0319 11:53:52.859863 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e5577e-df8b-46a2-aff7-0bf90b5010d9-config\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:53:52.860060 master-0 kubenswrapper[4036]: I0319 11:53:52.860040 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38df702-a256-4f83-90bb-0c0e477a2ce6-trusted-ca\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:53:52.860150 master-0 kubenswrapper[4036]: I0319 11:53:52.860136 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7dp9\" (UniqueName: \"kubernetes.io/projected/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-kube-api-access-f7dp9\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:53:52.860350 master-0 kubenswrapper[4036]: I0319 11:53:52.860287 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbwcn\" (UniqueName: \"kubernetes.io/projected/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-kube-api-access-rbwcn\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:53:52.860551 master-0 kubenswrapper[4036]: I0319 11:53:52.860536 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07804be1-d37c-474e-a179-01ac541d9705-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:53:52.860634 master-0 kubenswrapper[4036]: I0319 11:53:52.860617 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:53:52.860801 master-0 kubenswrapper[4036]: I0319 11:53:52.860787 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2knz\" (UniqueName: \"kubernetes.io/projected/e90c10f6-36d9-44bd-a4c6-174f12135434-kube-api-access-r2knz\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:53:52.860879 master-0 kubenswrapper[4036]: I0319 11:53:52.860866 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:53:52.860959 master-0 kubenswrapper[4036]: I0319 11:53:52.860947 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac946049-4b93-4d8c-bfa6-eacf83160ed1-config\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"
Mar 19 11:53:52.861030 master-0 kubenswrapper[4036]: I0319 11:53:52.861018 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bssz9\" (UniqueName: \"kubernetes.io/projected/4705c8f3-89d4-42f2-ad14-e591c5380b9e-kube-api-access-bssz9\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:53:52.861113 master-0 kubenswrapper[4036]: I0319 11:53:52.861099 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:53:52.861198 master-0 kubenswrapper[4036]: I0319 11:53:52.861184 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:53:52.861270 master-0 kubenswrapper[4036]: I0319 11:53:52.861257 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/162b513f-2f67-4946-816b-48f6232dfca0-serving-cert\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:53:52.861480 master-0 kubenswrapper[4036]: I0319 11:53:52.861467 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/131e2573-ef74-4963-bdcf-901310e7a486-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:53:52.861565 master-0 kubenswrapper[4036]: I0319 11:53:52.861550 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-client\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:53:52.861642 master-0 kubenswrapper[4036]: I0319 11:53:52.861627 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:53:52.861755 master-0 kubenswrapper[4036]: I0319 11:53:52.861732 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h95pm\" (UniqueName: \"kubernetes.io/projected/162b513f-2f67-4946-816b-48f6232dfca0-kube-api-access-h95pm\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:53:52.861891 master-0 kubenswrapper[4036]: I0319 11:53:52.861873 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98qv\" (UniqueName: \"kubernetes.io/projected/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-kube-api-access-s98qv\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:53:52.861999 master-0 kubenswrapper[4036]: I0319 11:53:52.861983 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:53:52.862102 master-0 kubenswrapper[4036]: I0319 11:53:52.862086 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:53:52.862199 master-0 kubenswrapper[4036]: I0319 11:53:52.862183 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-bound-sa-token\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:53:52.862301 master-0 kubenswrapper[4036]: I0319 11:53:52.862284 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:53:52.862428 master-0 kubenswrapper[4036]: I0319 11:53:52.862409 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:53:52.862545 master-0 kubenswrapper[4036]: I0319 11:53:52.862528 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f99wt\" (UniqueName: \"kubernetes.io/projected/6d017409-a362-4e58-87af-d02b3834fdd4-kube-api-access-f99wt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv"
Mar 19 11:53:52.862663 master-0 kubenswrapper[4036]: I0319 11:53:52.862628 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:53:52.862785 master-0 kubenswrapper[4036]: I0319 11:53:52.862767 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d3b2df-3d81-413e-b039-80dc20111eae-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:53:52.862900 master-0 kubenswrapper[4036]: I0319 11:53:52.862864 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scp9p\" (UniqueName: \"kubernetes.io/projected/aab47a39-a18b-4350-b141-4d5de2d8e96f-kube-api-access-scp9p\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:53:52.863005 master-0 kubenswrapper[4036]: I0319 11:53:52.862988 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d017409-a362-4e58-87af-d02b3834fdd4-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv"
Mar 19 11:53:52.863098 master-0 kubenswrapper[4036]: I0319 11:53:52.863083 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:53:52.863675 master-0 kubenswrapper[4036]: I0319 11:53:52.863175 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4qb9\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-kube-api-access-g4qb9\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:53:52.864471 master-0 kubenswrapper[4036]: I0319 11:53:52.864453 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-config\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:53:52.864630 master-0 kubenswrapper[4036]: I0319 11:53:52.864610 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-config\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:53:52.865010 master-0 kubenswrapper[4036]: I0319 11:53:52.864775 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac946049-4b93-4d8c-bfa6-eacf83160ed1-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"
Mar 19 11:53:52.865258 master-0 kubenswrapper[4036]: I0319 11:53:52.865242 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 11:53:52.865376 master-0 kubenswrapper[4036]: I0319 11:53:52.865327 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6rhj\" (UniqueName: \"kubernetes.io/projected/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-kube-api-access-m6rhj\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 11:53:52.865500 master-0 kubenswrapper[4036]: I0319 11:53:52.865486 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sprd\" (UniqueName: \"kubernetes.io/projected/131e2573-ef74-4963-bdcf-901310e7a486-kube-api-access-4sprd\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:53:52.865584 master-0 kubenswrapper[4036]: I0319 11:53:52.865572 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e5577e-df8b-46a2-aff7-0bf90b5010d9-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:53:52.865676 master-0 kubenswrapper[4036]: I0319 11:53:52.865663 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fmb\" (UniqueName: \"kubernetes.io/projected/39f479b3-45ff-43f9-ae85-083554a54918-kube-api-access-79fmb\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:53:52.865934 master-0 kubenswrapper[4036]: I0319 11:53:52.865922 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:53:52.866009 master-0 kubenswrapper[4036]: I0319 11:53:52.865997 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7qsg\" (UniqueName: \"kubernetes.io/projected/33f6e54f-a750-4c91-b4f5-049275d33cfb-kube-api-access-f7qsg\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:53:52.866072 master-0 kubenswrapper[4036]: I0319 11:53:52.866061 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:53:52.866154 master-0 kubenswrapper[4036]: I0319 11:53:52.866142 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-config\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:53:52.866276 master-0 kubenswrapper[4036]: I0319 11:53:52.866228 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4705c8f3-89d4-42f2-ad14-e591c5380b9e-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:53:52.866380 master-0 kubenswrapper[4036]: I0319 11:53:52.866364 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:53:52.866461 master-0 kubenswrapper[4036]: I0319 11:53:52.866449 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-operand-assets\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:53:52.967312 master-0 kubenswrapper[4036]: I0319 11:53:52.967249 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:53:52.967312 master-0 kubenswrapper[4036]: I0319 11:53:52.967303 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4kn\" (UniqueName: \"kubernetes.io/projected/78d3b2df-3d81-413e-b039-80dc20111eae-kube-api-access-mm4kn\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:53:52.967493 master-0 kubenswrapper[4036]: I0319 11:53:52.967332 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2v9q\" (UniqueName: \"kubernetes.io/projected/5ae2d717-506b-4d8e-803b-689c1cc3557c-kube-api-access-s2v9q\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:53:52.967703 master-0 kubenswrapper[4036]: I0319 11:53:52.967572 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9vzl\" (UniqueName: \"kubernetes.io/projected/84e5577e-df8b-46a2-aff7-0bf90b5010d9-kube-api-access-k9vzl\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:53:52.967703 master-0 kubenswrapper[4036]: I0319 11:53:52.967627 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705c8f3-89d4-42f2-ad14-e591c5380b9e-serving-cert\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:53:52.967703 master-0 kubenswrapper[4036]: I0319 11:53:52.967673 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e5577e-df8b-46a2-aff7-0bf90b5010d9-config\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:53:52.967703 master-0 kubenswrapper[4036]: I0319 11:53:52.967702 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:53:52.967972 master-0 kubenswrapper[4036]: I0319 11:53:52.967906 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cvpt\" (UniqueName: \"kubernetes.io/projected/3f3c0234-248a-4d6f-9aca-6582a481a911-kube-api-access-8cvpt\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw"
Mar 19 11:53:52.967972 master-0 kubenswrapper[4036]: I0319 11:53:52.967962 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38df702-a256-4f83-90bb-0c0e477a2ce6-trusted-ca\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:53:52.968066 master-0 kubenswrapper[4036]: I0319 11:53:52.967998 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dp9\" (UniqueName: \"kubernetes.io/projected/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-kube-api-access-f7dp9\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:53:52.968066 master-0 kubenswrapper[4036]: I0319 11:53:52.968034 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwcn\" (UniqueName: \"kubernetes.io/projected/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-kube-api-access-rbwcn\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:53:52.968228 master-0 kubenswrapper[4036]: I0319 11:53:52.968113 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f3c0234-248a-4d6f-9aca-6582a481a911-serving-cert\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw"
Mar 19 11:53:52.968301 master-0 kubenswrapper[4036]: I0319 11:53:52.968239 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07804be1-d37c-474e-a179-01ac541d9705-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:53:52.968301 master-0 kubenswrapper[4036]: I0319 11:53:52.968271 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:53:52.968301 master-0 kubenswrapper[4036]: I0319 11:53:52.968275 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:53:52.968475 master-0 kubenswrapper[4036]: E0319 11:53:52.968408 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 11:53:52.969342 master-0 kubenswrapper[4036]: I0319 11:53:52.968445 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2knz\" (UniqueName: \"kubernetes.io/projected/e90c10f6-36d9-44bd-a4c6-174f12135434-kube-api-access-r2knz\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:53:52.969342 master-0 kubenswrapper[4036]: I0319 11:53:52.969174 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38df702-a256-4f83-90bb-0c0e477a2ce6-trusted-ca\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:53:52.969478 master-0 kubenswrapper[4036]: E0319 11:53:52.968488 4036 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 19 11:53:52.969478 master-0 kubenswrapper[4036]: E0319 11:53:52.969345 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.469323076 +0000 UTC m=+119.021743651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found
Mar 19 11:53:52.969478 master-0 kubenswrapper[4036]: I0319 11:53:52.969410 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:53:52.969478 master-0 kubenswrapper[4036]: E0319 11:53:52.969444 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.469427699 +0000 UTC m=+119.021848284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found
Mar 19 11:53:52.969690 master-0 kubenswrapper[4036]: I0319 11:53:52.969500 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac946049-4b93-4d8c-bfa6-eacf83160ed1-config\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"
Mar 19 11:53:52.969690 master-0 kubenswrapper[4036]: I0319 11:53:52.969542 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bssz9\" (UniqueName: \"kubernetes.io/projected/4705c8f3-89d4-42f2-ad14-e591c5380b9e-kube-api-access-bssz9\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:53:52.969690 master-0 kubenswrapper[4036]: I0319 11:53:52.969569 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd875\" (UniqueName: \"kubernetes.io/projected/99ffcc9a-4b01-439a-93e0-0674bdfd32ec-kube-api-access-jd875\") pod \"csi-snapshot-controller-operator-5f5d689c6b-8nptp\" (UID: \"99ffcc9a-4b01-439a-93e0-0674bdfd32ec\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp"
Mar 19 11:53:52.969811 master-0 kubenswrapper[4036]: I0319 11:53:52.969753 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:53:52.969811 master-0 kubenswrapper[4036]: I0319 11:53:52.969778 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07804be1-d37c-474e-a179-01ac541d9705-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:53:52.969811 master-0 kubenswrapper[4036]: I0319 11:53:52.969794 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:53:52.969941 master-0 kubenswrapper[4036]: I0319 11:53:52.969836 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/162b513f-2f67-4946-816b-48f6232dfca0-serving-cert\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:53:52.969941 master-0 kubenswrapper[4036]: E0319 11:53:52.969856 4036 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 11:53:52.969941 master-0 kubenswrapper[4036]: I0319 11:53:52.969872 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64136215-8539-4349-977f-6d59deb132ea-iptables-alerter-script\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j"
Mar 19 11:53:52.969941 master-0 kubenswrapper[4036]: E0319 11:53:52.969894 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.469881722 +0000 UTC m=+119.022302307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found
Mar 19 11:53:52.969941 master-0 kubenswrapper[4036]: I0319 11:53:52.969917 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/131e2573-ef74-4963-bdcf-901310e7a486-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:53:52.969941 master-0 kubenswrapper[4036]: E0319 11:53:52.969929 4036 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 11:53:52.970227 master-0 kubenswrapper[4036]: E0319 11:53:52.970131 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls
podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.470086397 +0000 UTC m=+119.022507202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:53:52.970227 master-0 kubenswrapper[4036]: I0319 11:53:52.970197 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-client\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.970320 master-0 kubenswrapper[4036]: I0319 11:53:52.970267 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:53:52.970359 master-0 kubenswrapper[4036]: I0319 11:53:52.970317 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98qv\" (UniqueName: \"kubernetes.io/projected/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-kube-api-access-s98qv\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 11:53:52.970487 master-0 kubenswrapper[4036]: I0319 11:53:52.970454 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:53:52.970548 master-0 kubenswrapper[4036]: I0319 11:53:52.970491 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h95pm\" (UniqueName: \"kubernetes.io/projected/162b513f-2f67-4946-816b-48f6232dfca0-kube-api-access-h95pm\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 11:53:52.970548 master-0 kubenswrapper[4036]: I0319 11:53:52.970520 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.970685 master-0 kubenswrapper[4036]: I0319 11:53:52.970554 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64136215-8539-4349-977f-6d59deb132ea-host-slash\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:53:52.970734 master-0 kubenswrapper[4036]: E0319 11:53:52.970676 4036 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:53:52.970788 master-0 kubenswrapper[4036]: E0319 11:53:52.970767 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. 
No retries permitted until 2026-03-19 11:53:53.470739565 +0000 UTC m=+119.023160370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found Mar 19 11:53:52.970788 master-0 kubenswrapper[4036]: I0319 11:53:52.970768 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e5577e-df8b-46a2-aff7-0bf90b5010d9-config\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" Mar 19 11:53:52.970886 master-0 kubenswrapper[4036]: I0319 11:53:52.970801 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-bound-sa-token\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:53:52.970886 master-0 kubenswrapper[4036]: I0319 11:53:52.970850 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:52.970973 master-0 kubenswrapper[4036]: I0319 11:53:52.970876 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac946049-4b93-4d8c-bfa6-eacf83160ed1-config\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:53:52.970973 master-0 kubenswrapper[4036]: I0319 11:53:52.970888 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:53:52.970973 master-0 kubenswrapper[4036]: I0319 11:53:52.970965 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bea42722-d7af-4938-9f3a-da57bd056f96-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:53:52.971088 master-0 kubenswrapper[4036]: E0319 11:53:52.970987 4036 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:53:52.971088 master-0 kubenswrapper[4036]: E0319 11:53:52.971031 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.471018643 +0000 UTC m=+119.023439438 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:53:52.971088 master-0 kubenswrapper[4036]: I0319 11:53:52.971010 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 11:53:52.971088 master-0 kubenswrapper[4036]: I0319 11:53:52.971077 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f99wt\" (UniqueName: \"kubernetes.io/projected/6d017409-a362-4e58-87af-d02b3834fdd4-kube-api-access-f99wt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:53:52.971250 master-0 kubenswrapper[4036]: I0319 11:53:52.971106 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 11:53:52.971250 master-0 kubenswrapper[4036]: I0319 11:53:52.971136 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d3b2df-3d81-413e-b039-80dc20111eae-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: 
\"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:53:52.971250 master-0 kubenswrapper[4036]: I0319 11:53:52.971149 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/131e2573-ef74-4963-bdcf-901310e7a486-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:53:52.971250 master-0 kubenswrapper[4036]: I0319 11:53:52.971169 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d017409-a362-4e58-87af-d02b3834fdd4-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:53:52.971250 master-0 kubenswrapper[4036]: I0319 11:53:52.971197 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scp9p\" (UniqueName: \"kubernetes.io/projected/aab47a39-a18b-4350-b141-4d5de2d8e96f-kube-api-access-scp9p\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.971250 master-0 kubenswrapper[4036]: I0319 11:53:52.971226 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c0234-248a-4d6f-9aca-6582a481a911-config\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:53:52.971250 master-0 kubenswrapper[4036]: I0319 11:53:52.971256 
4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea42722-d7af-4938-9f3a-da57bd056f96-config\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:53:52.971508 master-0 kubenswrapper[4036]: I0319 11:53:52.971283 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.971508 master-0 kubenswrapper[4036]: I0319 11:53:52.971291 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:53:52.971508 master-0 kubenswrapper[4036]: I0319 11:53:52.971355 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4qb9\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-kube-api-access-g4qb9\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:53:52.971508 master-0 kubenswrapper[4036]: I0319 11:53:52.971379 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-config\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: 
\"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:52.971508 master-0 kubenswrapper[4036]: E0319 11:53:52.971383 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:53:52.971508 master-0 kubenswrapper[4036]: I0319 11:53:52.971406 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-config\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 11:53:52.971508 master-0 kubenswrapper[4036]: E0319 11:53:52.971436 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.471418494 +0000 UTC m=+119.023839299 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found Mar 19 11:53:52.971508 master-0 kubenswrapper[4036]: I0319 11:53:52.971475 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac946049-4b93-4d8c-bfa6-eacf83160ed1-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:53:52.971508 master-0 kubenswrapper[4036]: I0319 11:53:52.971515 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:53:52.971855 master-0 kubenswrapper[4036]: I0319 11:53:52.971549 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgtrh\" (UniqueName: \"kubernetes.io/projected/64136215-8539-4349-977f-6d59deb132ea-kube-api-access-cgtrh\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:53:52.971855 master-0 kubenswrapper[4036]: I0319 11:53:52.971578 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rhj\" (UniqueName: \"kubernetes.io/projected/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-kube-api-access-m6rhj\") pod \"package-server-manager-7b95f86987-ftz5b\" 
(UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:53:52.971855 master-0 kubenswrapper[4036]: I0319 11:53:52.971608 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sprd\" (UniqueName: \"kubernetes.io/projected/131e2573-ef74-4963-bdcf-901310e7a486-kube-api-access-4sprd\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:53:52.971855 master-0 kubenswrapper[4036]: I0319 11:53:52.971638 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e5577e-df8b-46a2-aff7-0bf90b5010d9-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" Mar 19 11:53:52.971855 master-0 kubenswrapper[4036]: I0319 11:53:52.971686 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fmb\" (UniqueName: \"kubernetes.io/projected/39f479b3-45ff-43f9-ae85-083554a54918-kube-api-access-79fmb\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:53:52.971855 master-0 kubenswrapper[4036]: I0319 11:53:52.971742 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:53:52.971855 master-0 kubenswrapper[4036]: I0319 11:53:52.971767 4036 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f7qsg\" (UniqueName: \"kubernetes.io/projected/33f6e54f-a750-4c91-b4f5-049275d33cfb-kube-api-access-f7qsg\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:53:52.971855 master-0 kubenswrapper[4036]: I0319 11:53:52.971792 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:53:52.971855 master-0 kubenswrapper[4036]: I0319 11:53:52.971822 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 11:53:52.971855 master-0 kubenswrapper[4036]: I0319 11:53:52.971847 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-operand-assets\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 11:53:52.972120 master-0 kubenswrapper[4036]: I0319 11:53:52.971874 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-config\") pod 
\"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" Mar 19 11:53:52.972120 master-0 kubenswrapper[4036]: I0319 11:53:52.971900 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4705c8f3-89d4-42f2-ad14-e591c5380b9e-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 11:53:52.972120 master-0 kubenswrapper[4036]: I0319 11:53:52.971935 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-config\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.972343 master-0 kubenswrapper[4036]: I0319 11:53:52.972303 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-config\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 11:53:52.972402 master-0 kubenswrapper[4036]: I0319 11:53:52.972361 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d017409-a362-4e58-87af-d02b3834fdd4-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:53:52.972445 master-0 
kubenswrapper[4036]: I0319 11:53:52.972412 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d3b2df-3d81-413e-b039-80dc20111eae-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:53:52.972508 master-0 kubenswrapper[4036]: I0319 11:53:52.972479 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:53:52.973132 master-0 kubenswrapper[4036]: I0319 11:53:52.973099 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:53:52.975287 master-0 kubenswrapper[4036]: I0319 11:53:52.973127 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 11:53:52.975287 master-0 kubenswrapper[4036]: I0319 11:53:52.974071 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" Mar 19 11:53:52.975287 master-0 kubenswrapper[4036]: I0319 11:53:52.975222 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705c8f3-89d4-42f2-ad14-e591c5380b9e-serving-cert\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 11:53:52.975917 master-0 kubenswrapper[4036]: E0319 11:53:52.975788 4036 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:53:52.975917 master-0 kubenswrapper[4036]: I0319 11:53:52.975847 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/367ea4e8-8172-4730-8f26-499ed7c167bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:53:52.975978 master-0 kubenswrapper[4036]: E0319 11:53:52.975917 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.475875196 +0000 UTC m=+119.028295771 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found Mar 19 11:53:52.975978 master-0 kubenswrapper[4036]: I0319 11:53:52.975936 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-config\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.976044 master-0 kubenswrapper[4036]: I0319 11:53:52.975977 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwp24\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-kube-api-access-bwp24\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:53:52.976044 master-0 kubenswrapper[4036]: E0319 11:53:52.976018 4036 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:53:52.976113 master-0 kubenswrapper[4036]: I0319 11:53:52.976058 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/367ea4e8-8172-4730-8f26-499ed7c167bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:53:52.976308 master-0 kubenswrapper[4036]: E0319 11:53:52.976274 4036 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.476259147 +0000 UTC m=+119.028679732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found Mar 19 11:53:52.976354 master-0 kubenswrapper[4036]: I0319 11:53:52.976318 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdmj8\" (UniqueName: \"kubernetes.io/projected/6e8c4676-05af-4478-a44b-1bdcbefdea0f-kube-api-access-pdmj8\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 11:53:52.976354 master-0 kubenswrapper[4036]: I0319 11:53:52.976346 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea42722-d7af-4938-9f3a-da57bd056f96-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:53:52.976522 master-0 kubenswrapper[4036]: I0319 11:53:52.976430 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-images\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:52.976522 master-0 kubenswrapper[4036]: E0319 11:53:52.976465 4036 secret.go:189] Couldn't get secret 
openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:53:52.976590 master-0 kubenswrapper[4036]: E0319 11:53:52.976528 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.476509574 +0000 UTC m=+119.028930149 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found Mar 19 11:53:52.976590 master-0 kubenswrapper[4036]: I0319 11:53:52.976497 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-config\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:52.976735 master-0 kubenswrapper[4036]: I0319 11:53:52.976697 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/162b513f-2f67-4946-816b-48f6232dfca0-serving-cert\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 11:53:52.976794 master-0 kubenswrapper[4036]: I0319 11:53:52.976705 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.976839 master-0 kubenswrapper[4036]: I0319 11:53:52.976815 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d017409-a362-4e58-87af-d02b3834fdd4-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:53:52.976875 master-0 kubenswrapper[4036]: I0319 11:53:52.976847 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:53:52.976997 master-0 kubenswrapper[4036]: E0319 11:53:52.976951 4036 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:53:52.977046 master-0 kubenswrapper[4036]: E0319 11:53:52.977020 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.477009358 +0000 UTC m=+119.029429933 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found Mar 19 11:53:52.977046 master-0 kubenswrapper[4036]: I0319 11:53:52.976986 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-serving-cert\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.977120 master-0 kubenswrapper[4036]: I0319 11:53:52.976965 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-images\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:52.977157 master-0 kubenswrapper[4036]: I0319 11:53:52.977129 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac946049-4b93-4d8c-bfa6-eacf83160ed1-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:53:52.977202 master-0 kubenswrapper[4036]: I0319 11:53:52.977179 4036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/367ea4e8-8172-4730-8f26-499ed7c167bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:53:52.977238 master-0 kubenswrapper[4036]: I0319 11:53:52.977211 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.977344 master-0 kubenswrapper[4036]: E0319 11:53:52.977141 4036 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:53:52.977403 master-0 kubenswrapper[4036]: E0319 11:53:52.977373 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.477360087 +0000 UTC m=+119.029780672 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:53:52.977678 master-0 kubenswrapper[4036]: I0319 11:53:52.977596 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4705c8f3-89d4-42f2-ad14-e591c5380b9e-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 11:53:52.978455 master-0 kubenswrapper[4036]: I0319 11:53:52.978416 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-config\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" Mar 19 11:53:52.978634 master-0 kubenswrapper[4036]: I0319 11:53:52.978595 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-operand-assets\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 11:53:52.979668 master-0 kubenswrapper[4036]: I0319 11:53:52.979604 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: 
\"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 11:53:52.980621 master-0 kubenswrapper[4036]: I0319 11:53:52.980582 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e5577e-df8b-46a2-aff7-0bf90b5010d9-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" Mar 19 11:53:52.980710 master-0 kubenswrapper[4036]: I0319 11:53:52.980621 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-client\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.980801 master-0 kubenswrapper[4036]: I0319 11:53:52.980769 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-serving-cert\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:52.980999 master-0 kubenswrapper[4036]: I0319 11:53:52.980962 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d017409-a362-4e58-87af-d02b3834fdd4-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:53:52.981452 master-0 kubenswrapper[4036]: I0319 11:53:52.981412 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac946049-4b93-4d8c-bfa6-eacf83160ed1-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:53:53.061576 master-0 kubenswrapper[4036]: I0319 11:53:53.061376 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f99wt\" (UniqueName: \"kubernetes.io/projected/6d017409-a362-4e58-87af-d02b3834fdd4-kube-api-access-f99wt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:53:53.062092 master-0 kubenswrapper[4036]: I0319 11:53:53.062037 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:53:53.062773 master-0 kubenswrapper[4036]: I0319 11:53:53.062718 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h95pm\" (UniqueName: \"kubernetes.io/projected/162b513f-2f67-4946-816b-48f6232dfca0-kube-api-access-h95pm\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 11:53:53.063591 master-0 kubenswrapper[4036]: I0319 11:53:53.063549 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98qv\" (UniqueName: \"kubernetes.io/projected/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-kube-api-access-s98qv\") pod 
\"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 11:53:53.064010 master-0 kubenswrapper[4036]: I0319 11:53:53.063963 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7dp9\" (UniqueName: \"kubernetes.io/projected/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-kube-api-access-f7dp9\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 11:53:53.066538 master-0 kubenswrapper[4036]: I0319 11:53:53.064692 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-bound-sa-token\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:53:53.066538 master-0 kubenswrapper[4036]: I0319 11:53:53.064836 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssz9\" (UniqueName: \"kubernetes.io/projected/4705c8f3-89d4-42f2-ad14-e591c5380b9e-kube-api-access-bssz9\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 11:53:53.069793 master-0 kubenswrapper[4036]: I0319 11:53:53.069516 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwcn\" (UniqueName: \"kubernetes.io/projected/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-kube-api-access-rbwcn\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" Mar 19 11:53:53.070788 master-0 
kubenswrapper[4036]: I0319 11:53:53.070167 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2knz\" (UniqueName: \"kubernetes.io/projected/e90c10f6-36d9-44bd-a4c6-174f12135434-kube-api-access-r2knz\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:53.071250 master-0 kubenswrapper[4036]: I0319 11:53:53.071196 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scp9p\" (UniqueName: \"kubernetes.io/projected/aab47a39-a18b-4350-b141-4d5de2d8e96f-kube-api-access-scp9p\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 11:53:53.074097 master-0 kubenswrapper[4036]: I0319 11:53:53.074037 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4kn\" (UniqueName: \"kubernetes.io/projected/78d3b2df-3d81-413e-b039-80dc20111eae-kube-api-access-mm4kn\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:53:53.074553 master-0 kubenswrapper[4036]: I0319 11:53:53.074169 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4qb9\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-kube-api-access-g4qb9\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:53:53.074553 master-0 kubenswrapper[4036]: I0319 11:53:53.074516 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9vzl\" (UniqueName: 
\"kubernetes.io/projected/84e5577e-df8b-46a2-aff7-0bf90b5010d9-kube-api-access-k9vzl\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" Mar 19 11:53:53.075508 master-0 kubenswrapper[4036]: I0319 11:53:53.075469 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fmb\" (UniqueName: \"kubernetes.io/projected/39f479b3-45ff-43f9-ae85-083554a54918-kube-api-access-79fmb\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:53:53.077896 master-0 kubenswrapper[4036]: I0319 11:53:53.077851 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/367ea4e8-8172-4730-8f26-499ed7c167bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:53:53.078251 master-0 kubenswrapper[4036]: I0319 11:53:53.078130 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/367ea4e8-8172-4730-8f26-499ed7c167bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:53:53.078251 master-0 kubenswrapper[4036]: I0319 11:53:53.078196 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea42722-d7af-4938-9f3a-da57bd056f96-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:53:53.078468 master-0 kubenswrapper[4036]: I0319 11:53:53.078434 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/367ea4e8-8172-4730-8f26-499ed7c167bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:53:53.078528 master-0 kubenswrapper[4036]: I0319 11:53:53.078498 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2v9q\" (UniqueName: \"kubernetes.io/projected/5ae2d717-506b-4d8e-803b-689c1cc3557c-kube-api-access-s2v9q\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:53:53.078665 master-0 kubenswrapper[4036]: I0319 11:53:53.078571 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cvpt\" (UniqueName: \"kubernetes.io/projected/3f3c0234-248a-4d6f-9aca-6582a481a911-kube-api-access-8cvpt\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:53:53.079382 master-0 kubenswrapper[4036]: I0319 11:53:53.078937 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f3c0234-248a-4d6f-9aca-6582a481a911-serving-cert\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:53:53.079382 master-0 kubenswrapper[4036]: I0319 11:53:53.079190 4036 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jd875\" (UniqueName: \"kubernetes.io/projected/99ffcc9a-4b01-439a-93e0-0674bdfd32ec-kube-api-access-jd875\") pod \"csi-snapshot-controller-operator-5f5d689c6b-8nptp\" (UID: \"99ffcc9a-4b01-439a-93e0-0674bdfd32ec\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp" Mar 19 11:53:53.079382 master-0 kubenswrapper[4036]: I0319 11:53:53.079224 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64136215-8539-4349-977f-6d59deb132ea-iptables-alerter-script\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:53:53.079382 master-0 kubenswrapper[4036]: I0319 11:53:53.079375 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:53:53.079722 master-0 kubenswrapper[4036]: I0319 11:53:53.079416 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64136215-8539-4349-977f-6d59deb132ea-host-slash\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:53:53.079722 master-0 kubenswrapper[4036]: I0319 11:53:53.079455 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bea42722-d7af-4938-9f3a-da57bd056f96-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:53:53.079722 master-0 kubenswrapper[4036]: I0319 11:53:53.079487 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c0234-248a-4d6f-9aca-6582a481a911-config\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:53:53.079722 master-0 kubenswrapper[4036]: I0319 11:53:53.079509 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea42722-d7af-4938-9f3a-da57bd056f96-config\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:53:53.079722 master-0 kubenswrapper[4036]: I0319 11:53:53.079547 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgtrh\" (UniqueName: \"kubernetes.io/projected/64136215-8539-4349-977f-6d59deb132ea-kube-api-access-cgtrh\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:53:53.080317 master-0 kubenswrapper[4036]: E0319 11:53:53.079737 4036 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:53:53.080317 master-0 kubenswrapper[4036]: E0319 11:53:53.079784 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:53:53.579768641 +0000 UTC m=+119.132189226 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found Mar 19 11:53:53.080317 master-0 kubenswrapper[4036]: I0319 11:53:53.079818 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64136215-8539-4349-977f-6d59deb132ea-host-slash\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:53:53.080770 master-0 kubenswrapper[4036]: I0319 11:53:53.080425 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/367ea4e8-8172-4730-8f26-499ed7c167bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:53:53.080770 master-0 kubenswrapper[4036]: I0319 11:53:53.080539 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c0234-248a-4d6f-9aca-6582a481a911-config\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:53:53.081298 master-0 kubenswrapper[4036]: I0319 11:53:53.081229 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea42722-d7af-4938-9f3a-da57bd056f96-config\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:53:53.081676 master-0 
kubenswrapper[4036]: I0319 11:53:53.081560 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64136215-8539-4349-977f-6d59deb132ea-iptables-alerter-script\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j"
Mar 19 11:53:53.081951 master-0 kubenswrapper[4036]: I0319 11:53:53.081877 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea42722-d7af-4938-9f3a-da57bd056f96-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj"
Mar 19 11:53:53.082293 master-0 kubenswrapper[4036]: I0319 11:53:53.082261 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac946049-4b93-4d8c-bfa6-eacf83160ed1-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"
Mar 19 11:53:53.084028 master-0 kubenswrapper[4036]: I0319 11:53:53.083376 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f3c0234-248a-4d6f-9aca-6582a481a911-serving-cert\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw"
Mar 19 11:53:53.084372 master-0 kubenswrapper[4036]: I0319 11:53:53.084303 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/367ea4e8-8172-4730-8f26-499ed7c167bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh"
Mar 19 11:53:53.101059 master-0 kubenswrapper[4036]: I0319 11:53:53.100989 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwp24\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-kube-api-access-bwp24\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:53:53.101338 master-0 kubenswrapper[4036]: I0319 11:53:53.101239 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:53:53.120797 master-0 kubenswrapper[4036]: I0319 11:53:53.120607 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7qsg\" (UniqueName: \"kubernetes.io/projected/33f6e54f-a750-4c91-b4f5-049275d33cfb-kube-api-access-f7qsg\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:53:53.122196 master-0 kubenswrapper[4036]: I0319 11:53:53.122152 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:53:53.142732 master-0 kubenswrapper[4036]: I0319 11:53:53.141073 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdmj8\" (UniqueName: \"kubernetes.io/projected/6e8c4676-05af-4478-a44b-1bdcbefdea0f-kube-api-access-pdmj8\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:53:53.142732 master-0 kubenswrapper[4036]: I0319 11:53:53.141408 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:53:53.158209 master-0 kubenswrapper[4036]: I0319 11:53:53.158149 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sprd\" (UniqueName: \"kubernetes.io/projected/131e2573-ef74-4963-bdcf-901310e7a486-kube-api-access-4sprd\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:53:53.178622 master-0 kubenswrapper[4036]: I0319 11:53:53.178543 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6rhj\" (UniqueName: \"kubernetes.io/projected/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-kube-api-access-m6rhj\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 11:53:53.186005 master-0 kubenswrapper[4036]: I0319 11:53:53.185950 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:53:53.194531 master-0 kubenswrapper[4036]: I0319 11:53:53.194448 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"
Mar 19 11:53:53.230674 master-0 kubenswrapper[4036]: I0319 11:53:53.227986 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:53:53.230674 master-0 kubenswrapper[4036]: I0319 11:53:53.228655 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv"
Mar 19 11:53:53.238675 master-0 kubenswrapper[4036]: I0319 11:53:53.233229 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/367ea4e8-8172-4730-8f26-499ed7c167bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh"
Mar 19 11:53:53.276758 master-0 kubenswrapper[4036]: I0319 11:53:53.274260 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh"
Mar 19 11:53:53.276758 master-0 kubenswrapper[4036]: I0319 11:53:53.274787 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cvpt\" (UniqueName: \"kubernetes.io/projected/3f3c0234-248a-4d6f-9aca-6582a481a911-kube-api-access-8cvpt\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw"
Mar 19 11:53:53.306213 master-0 kubenswrapper[4036]: I0319 11:53:53.302503 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2v9q\" (UniqueName: \"kubernetes.io/projected/5ae2d717-506b-4d8e-803b-689c1cc3557c-kube-api-access-s2v9q\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:53:53.306213 master-0 kubenswrapper[4036]: I0319 11:53:53.303398 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw"
Mar 19 11:53:53.308555 master-0 kubenswrapper[4036]: I0319 11:53:53.308504 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgtrh\" (UniqueName: \"kubernetes.io/projected/64136215-8539-4349-977f-6d59deb132ea-kube-api-access-cgtrh\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j"
Mar 19 11:53:53.311819 master-0 kubenswrapper[4036]: I0319 11:53:53.311754 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:53:53.312044 master-0 kubenswrapper[4036]: I0319 11:53:53.311997 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:53:53.318807 master-0 kubenswrapper[4036]: I0319 11:53:53.316835 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j6n2j"
Mar 19 11:53:53.322811 master-0 kubenswrapper[4036]: I0319 11:53:53.321695 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd875\" (UniqueName: \"kubernetes.io/projected/99ffcc9a-4b01-439a-93e0-0674bdfd32ec-kube-api-access-jd875\") pod \"csi-snapshot-controller-operator-5f5d689c6b-8nptp\" (UID: \"99ffcc9a-4b01-439a-93e0-0674bdfd32ec\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp"
Mar 19 11:53:53.327099 master-0 kubenswrapper[4036]: I0319 11:53:53.327044 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:53:53.338757 master-0 kubenswrapper[4036]: I0319 11:53:53.338714 4036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bea42722-d7af-4938-9f3a-da57bd056f96-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj"
Mar 19 11:53:53.348327 master-0 kubenswrapper[4036]: I0319 11:53:53.348284 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 11:53:53.365352 master-0 kubenswrapper[4036]: I0319 11:53:53.365308 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 11:53:53.404086 master-0 kubenswrapper[4036]: I0319 11:53:53.400413 4036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 11:53:53.411841 master-0 kubenswrapper[4036]: I0319 11:53:53.411787 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"]
Mar 19 11:53:53.445599 master-0 kubenswrapper[4036]: I0319 11:53:53.445210 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"]
Mar 19 11:53:53.489003 master-0 kubenswrapper[4036]: I0319 11:53:53.488924 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:53:53.489003 master-0 kubenswrapper[4036]: I0319 11:53:53.488985 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:53:53.489198 master-0 kubenswrapper[4036]: E0319 11:53:53.489101 4036 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 19 11:53:53.489198 master-0 kubenswrapper[4036]: I0319 11:53:53.489148 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:53:53.489198 master-0 kubenswrapper[4036]: E0319 11:53:53.489160 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.489142351 +0000 UTC m=+120.041562936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 19 11:53:53.489357 master-0 kubenswrapper[4036]: I0319 11:53:53.489202 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 11:53:53.489357 master-0 kubenswrapper[4036]: E0319 11:53:53.489225 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 11:53:53.489357 master-0 kubenswrapper[4036]: I0319 11:53:53.489239 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:53:53.489357 master-0 kubenswrapper[4036]: E0319 11:53:53.489254 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.489244474 +0000 UTC m=+120.041665059 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found
Mar 19 11:53:53.489357 master-0 kubenswrapper[4036]: E0319 11:53:53.489102 4036 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:53:53.489357 master-0 kubenswrapper[4036]: I0319 11:53:53.489284 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:53:53.489357 master-0 kubenswrapper[4036]: E0319 11:53:53.489322 4036 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 11:53:53.489357 master-0 kubenswrapper[4036]: E0319 11:53:53.489349 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.489339217 +0000 UTC m=+120.041759802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: I0319 11:53:53.489319 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: E0319 11:53:53.489398 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: E0319 11:53:53.489427 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.489418509 +0000 UTC m=+120.041839094 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: I0319 11:53:53.489424 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: E0319 11:53:53.489482 4036 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: E0319 11:53:53.489508 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.489499971 +0000 UTC m=+120.041920556 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: E0319 11:53:53.489526 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.489519092 +0000 UTC m=+120.041939677 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: I0319 11:53:53.489553 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: E0319 11:53:53.489578 4036 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: E0319 11:53:53.489607 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.489597824 +0000 UTC m=+120.042018409 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: I0319 11:53:53.489609 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: I0319 11:53:53.489672 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: I0319 11:53:53.489703 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:53:53.492468 master-0 kubenswrapper[4036]: E0319 11:53:53.489939 4036 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 11:53:53.492974 master-0 kubenswrapper[4036]: E0319 11:53:53.489976 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.489965134 +0000 UTC m=+120.042385719 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found
Mar 19 11:53:53.492974 master-0 kubenswrapper[4036]: E0319 11:53:53.490058 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 11:53:53.492974 master-0 kubenswrapper[4036]: E0319 11:53:53.490112 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.490101848 +0000 UTC m=+120.042522433 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found
Mar 19 11:53:53.492974 master-0 kubenswrapper[4036]: E0319 11:53:53.490189 4036 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 19 11:53:53.492974 master-0 kubenswrapper[4036]: E0319 11:53:53.490218 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.490209481 +0000 UTC m=+120.042630066 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found
Mar 19 11:53:53.492974 master-0 kubenswrapper[4036]: E0319 11:53:53.490764 4036 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 11:53:53.492974 master-0 kubenswrapper[4036]: E0319 11:53:53.490805 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.490794067 +0000 UTC m=+120.043214652 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found
Mar 19 11:53:53.492974 master-0 kubenswrapper[4036]: E0319 11:53:53.490961 4036 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:53:53.492974 master-0 kubenswrapper[4036]: E0319 11:53:53.491016 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.491004822 +0000 UTC m=+120.043425407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:53:53.557609 master-0 kubenswrapper[4036]: I0319 11:53:53.556738 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"]
Mar 19 11:53:53.572689 master-0 kubenswrapper[4036]: I0319 11:53:53.571162 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj"
Mar 19 11:53:53.583736 master-0 kubenswrapper[4036]: W0319 11:53:53.579205 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4705c8f3_89d4_42f2_ad14_e591c5380b9e.slice/crio-153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e WatchSource:0}: Error finding container 153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e: Status 404 returned error can't find the container with id 153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e
Mar 19 11:53:53.591467 master-0 kubenswrapper[4036]: I0319 11:53:53.590932 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:53:53.591467 master-0 kubenswrapper[4036]: E0319 11:53:53.591202 4036 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 11:53:53.591467 master-0 kubenswrapper[4036]: E0319 11:53:53.591260 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:53:54.591240407 +0000 UTC m=+120.143660992 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found
Mar 19 11:53:53.610073 master-0 kubenswrapper[4036]: I0319 11:53:53.610018 4036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp"
Mar 19 11:53:53.687090 master-0 kubenswrapper[4036]: W0319 11:53:53.687030 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50adfdd3_1d4d_47d2_a35b_cd028d66b4c6.slice/crio-781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43 WatchSource:0}: Error finding container 781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43: Status 404 returned error can't find the container with id 781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43
Mar 19 11:53:53.694126 master-0 kubenswrapper[4036]: I0319 11:53:53.692362 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"]
Mar 19 11:53:53.694126 master-0 kubenswrapper[4036]: I0319 11:53:53.693761 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"]
Mar 19 11:53:53.695408 master-0 kubenswrapper[4036]: I0319 11:53:53.695377 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv"]
Mar 19 11:53:53.710500 master-0 kubenswrapper[4036]: I0319 11:53:53.710286 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh"]
Mar 19 11:53:53.717543 master-0 kubenswrapper[4036]: W0319 11:53:53.717482 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod367ea4e8_8172_4730_8f26_499ed7c167bb.slice/crio-f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35 WatchSource:0}: Error finding container f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35: Status 404 returned error can't find the container with id f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35
Mar 19 11:53:53.731196 master-0 kubenswrapper[4036]: I0319 11:53:53.731065 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"]
Mar 19 11:53:53.744368 master-0 kubenswrapper[4036]: I0319 11:53:53.744328 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"]
Mar 19 11:53:53.778443 master-0 kubenswrapper[4036]: I0319 11:53:53.778331 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw"]
Mar 19 11:53:53.791999 master-0 kubenswrapper[4036]: W0319 11:53:53.791391 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f3c0234_248a_4d6f_9aca_6582a481a911.slice/crio-f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc WatchSource:0}: Error finding container f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc: Status 404 returned error can't find the container with id f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc
Mar 19 11:53:53.822570 master-0 kubenswrapper[4036]: I0319 11:53:53.822430 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj"]
Mar 19 11:53:53.862017 master-0 kubenswrapper[4036]: I0319 11:53:53.861932 4036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp"]
Mar 19 11:53:53.868388 master-0 kubenswrapper[4036]: W0319 11:53:53.868342 4036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99ffcc9a_4b01_439a_93e0_0674bdfd32ec.slice/crio-fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768 WatchSource:0}: Error finding container fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768: Status 404 returned error can't find the container with id fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768
Mar 19 11:53:54.182499 master-0 kubenswrapper[4036]: I0319 11:53:54.182318 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" event={"ID":"6d017409-a362-4e58-87af-d02b3834fdd4","Type":"ContainerStarted","Data":"d0c18b672eeccf56227de74bfddbdb7bc277e8432c83dff62ae2a878becfbd41"}
Mar 19 11:53:54.183380 master-0 kubenswrapper[4036]: I0319 11:53:54.183318 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerStarted","Data":"9246f0557504641b26756d5dcc3facbda0dfa6d1148c91b577840d8631f86b2e"}
Mar 19 11:53:54.184508 master-0 kubenswrapper[4036]: I0319 11:53:54.184474 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j6n2j" event={"ID":"64136215-8539-4349-977f-6d59deb132ea","Type":"ContainerStarted","Data":"46de87bd542fc48b4c2304420a056b910916afb6d58cd22606b6fcceab380367"}
Mar 19 11:53:54.185728 master-0 kubenswrapper[4036]: I0319 11:53:54.185677 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp" event={"ID":"99ffcc9a-4b01-439a-93e0-0674bdfd32ec","Type":"ContainerStarted","Data":"fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768"}
Mar 19 11:53:54.186893 master-0 kubenswrapper[4036]: I0319 11:53:54.186854 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerStarted","Data":"d8f0e6a1ec121bbc7ed73597f1167b65da5617d6a824495c565577e16dc06d7d"}
Mar 19 11:53:54.187959 master-0 kubenswrapper[4036]: I0319 11:53:54.187911 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" event={"ID":"4705c8f3-89d4-42f2-ad14-e591c5380b9e","Type":"ContainerStarted","Data":"153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e"}
Mar 19 11:53:54.188843 master-0 kubenswrapper[4036]: I0319 11:53:54.188803 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" event={"ID":"d263ddac-1ede-474f-b3bb-1f1f4f7846a3","Type":"ContainerStarted","Data":"7478a36d7c10b347071336898ec1420b6e593642e73df5491b35d12f5dad7fb8"}
Mar 19 11:53:54.190061 master-0 kubenswrapper[4036]: I0319 11:53:54.190020 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" event={"ID":"bea42722-d7af-4938-9f3a-da57bd056f96","Type":"ContainerStarted","Data":"4cea9c5260823fbd81d53a50c86a7838135f8c5ff405a05f4d12cdbb4fd9069e"}
Mar 19 11:53:54.191292 master-0 kubenswrapper[4036]: I0319 11:53:54.191262 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" event={"ID":"162b513f-2f67-4946-816b-48f6232dfca0","Type":"ContainerStarted","Data":"28c9e269179e8f2b2c8470e0dc5902c0174b162f0c6e40253ff15a0c8e960364"}
Mar 19 11:53:54.192432 master-0 kubenswrapper[4036]: I0319 11:53:54.192386 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerStarted","Data":"f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc"}
Mar 19 11:53:54.193361 master-0 kubenswrapper[4036]: I0319 11:53:54.193333 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" event={"ID":"aab47a39-a18b-4350-b141-4d5de2d8e96f","Type":"ContainerStarted","Data":"b179219ef24884bfcb110bf155456a2ca0b90c733780b9ebce585f74be18971b"}
Mar 19 11:53:54.195148 master-0 kubenswrapper[4036]: I0319 11:53:54.195085 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" event={"ID":"367ea4e8-8172-4730-8f26-499ed7c167bb","Type":"ContainerStarted","Data":"ec872b5060e715523dcf7612ce03c05df430a828612dcdbf8f83f43994a2ae84"}
Mar 19 11:53:54.195148 master-0 kubenswrapper[4036]: I0319 11:53:54.195127 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" event={"ID":"367ea4e8-8172-4730-8f26-499ed7c167bb","Type":"ContainerStarted","Data":"f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35"}
Mar 19 11:53:54.196256 master-0 kubenswrapper[4036]: I0319 11:53:54.196220 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerStarted","Data":"781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43"}
Mar 19 11:53:54.231205 master-0 kubenswrapper[4036]: I0319 11:53:54.231080 4036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" podStartSLOduration=84.231050559 podStartE2EDuration="1m24.231050559s" podCreationTimestamp="2026-03-19 11:52:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:53:54.224193911 +0000 UTC m=+119.776614506" watchObservedRunningTime="2026-03-19 11:53:54.231050559 +0000 UTC m=+119.783471144"
Mar 19 11:53:54.503825 master-0 kubenswrapper[4036]: I0319 11:53:54.503587 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: I0319 11:53:54.504106 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: E0319 11:53:54.503961 4036 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: E0319 11:53:54.504180 4036 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: E0319 11:53:54.504186 4036 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.504168774 +0000 UTC m=+122.056589359 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: E0319 11:53:54.504251 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.504235716 +0000 UTC m=+122.056656311 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: I0319 11:53:54.504276 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: I0319 11:53:54.504334 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod 
\"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: I0319 11:53:54.504397 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: E0319 11:53:54.504530 4036 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: I0319 11:53:54.504544 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: E0319 11:53:54.504595 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.504576906 +0000 UTC m=+122.056997501 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found Mar 19 11:53:54.504750 master-0 kubenswrapper[4036]: I0319 11:53:54.504628 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:53:54.505122 master-0 kubenswrapper[4036]: E0319 11:53:54.504687 4036 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:53:54.505122 master-0 kubenswrapper[4036]: E0319 11:53:54.504976 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.504943436 +0000 UTC m=+122.057364101 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:53:54.505122 master-0 kubenswrapper[4036]: E0319 11:53:54.504790 4036 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:53:54.505122 master-0 kubenswrapper[4036]: E0319 11:53:54.505121 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.50509443 +0000 UTC m=+122.057515015 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:53:54.505246 master-0 kubenswrapper[4036]: E0319 11:53:54.504819 4036 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 19 11:53:54.505246 master-0 kubenswrapper[4036]: E0319 11:53:54.505178 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.505170812 +0000 UTC m=+122.057591397 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found Mar 19 11:53:54.505246 master-0 kubenswrapper[4036]: E0319 11:53:54.504841 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:53:54.505246 master-0 kubenswrapper[4036]: E0319 11:53:54.505212 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.505202993 +0000 UTC m=+122.057623578 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found Mar 19 11:53:54.505914 master-0 kubenswrapper[4036]: I0319 11:53:54.505710 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:53:54.505914 master-0 kubenswrapper[4036]: I0319 11:53:54.505767 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: 
\"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:53:54.505914 master-0 kubenswrapper[4036]: E0319 11:53:54.505782 4036 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:53:54.505914 master-0 kubenswrapper[4036]: I0319 11:53:54.505789 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:54.505914 master-0 kubenswrapper[4036]: E0319 11:53:54.505812 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.505805089 +0000 UTC m=+122.058225674 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found Mar 19 11:53:54.505914 master-0 kubenswrapper[4036]: I0319 11:53:54.505848 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 11:53:54.505914 master-0 kubenswrapper[4036]: E0319 11:53:54.505856 4036 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:53:54.505914 master-0 kubenswrapper[4036]: E0319 11:53:54.505887 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.505877861 +0000 UTC m=+122.058298446 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:53:54.506349 master-0 kubenswrapper[4036]: I0319 11:53:54.505915 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:53:54.506349 master-0 kubenswrapper[4036]: E0319 11:53:54.505937 4036 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:53:54.506349 master-0 kubenswrapper[4036]: E0319 11:53:54.505996 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.505980414 +0000 UTC m=+122.058401099 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found Mar 19 11:53:54.506349 master-0 kubenswrapper[4036]: E0319 11:53:54.506039 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:53:54.506349 master-0 kubenswrapper[4036]: E0319 11:53:54.506062 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:53:54.506349 master-0 kubenswrapper[4036]: E0319 11:53:54.506092 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.506084447 +0000 UTC m=+122.058505032 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found Mar 19 11:53:54.506349 master-0 kubenswrapper[4036]: E0319 11:53:54.506111 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.506103378 +0000 UTC m=+122.058523963 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found Mar 19 11:53:54.607569 master-0 kubenswrapper[4036]: I0319 11:53:54.607484 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:53:54.607897 master-0 kubenswrapper[4036]: E0319 11:53:54.607767 4036 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:53:54.608032 master-0 kubenswrapper[4036]: E0319 11:53:54.607908 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:53:56.607877014 +0000 UTC m=+122.160297599 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found Mar 19 11:53:56.536512 master-0 kubenswrapper[4036]: I0319 11:53:56.536066 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 11:53:56.536512 master-0 kubenswrapper[4036]: I0319 11:53:56.536508 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:56.536512 master-0 kubenswrapper[4036]: I0319 11:53:56.536534 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: I0319 11:53:56.536558 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: I0319 11:53:56.536587 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: I0319 11:53:56.536606 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: I0319 11:53:56.536653 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: I0319 11:53:56.536678 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: I0319 11:53:56.536709 4036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: I0319 11:53:56.536729 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: E0319 11:53:56.536882 4036 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: E0319 11:53:56.536919 4036 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: E0319 11:53:56.537027 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: E0319 11:53:56.536942 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.536919964 +0000 UTC m=+126.089340549 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: E0319 11:53:56.537071 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.537048748 +0000 UTC m=+126.089469333 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: E0319 11:53:56.537083 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.537077979 +0000 UTC m=+126.089498564 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: E0319 11:53:56.537120 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: E0319 11:53:56.537138 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.53713089 +0000 UTC m=+126.089551475 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found Mar 19 11:53:56.537736 master-0 kubenswrapper[4036]: E0319 11:53:56.537181 4036 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537217 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.537203722 +0000 UTC m=+126.089624477 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537261 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537286 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.537279084 +0000 UTC m=+126.089699879 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537324 4036 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537341 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.537335786 +0000 UTC m=+126.089756371 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537369 4036 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537385 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.537380087 +0000 UTC m=+126.089800672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537426 4036 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537457 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.537447139 +0000 UTC m=+126.089867714 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: I0319 11:53:56.537500 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: I0319 11:53:56.537530 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537580 4036 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537596 4036 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 11:53:56.538238 master-0 kubenswrapper[4036]: E0319 11:53:56.537606 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.537599723 +0000 UTC m=+126.090020308 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found
Mar 19 11:53:56.538580 master-0 kubenswrapper[4036]: E0319 11:53:56.537621 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.537611713 +0000 UTC m=+126.090032298 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found
Mar 19 11:53:56.538580 master-0 kubenswrapper[4036]: E0319 11:53:56.537663 4036 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:53:56.538580 master-0 kubenswrapper[4036]: E0319 11:53:56.537684 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.537677495 +0000 UTC m=+126.090098280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:53:56.638338 master-0 kubenswrapper[4036]: I0319 11:53:56.638288 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:53:56.638734 master-0 kubenswrapper[4036]: E0319 11:53:56.638708 4036 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 11:53:56.638817 master-0 kubenswrapper[4036]: E0319 11:53:56.638796 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:00.638771943 +0000 UTC m=+126.191192538 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found
Mar 19 11:53:57.215122 master-0 kubenswrapper[4036]: I0319 11:53:57.214840 4036 generic.go:334] "Generic (PLEG): container finished" podID="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" containerID="76315438a4a762dabbb5d7e77ae1a8b1b6d345329e45b5b2c2feae26b7df9765" exitCode=0
Mar 19 11:53:57.215444 master-0 kubenswrapper[4036]: I0319 11:53:57.215161 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerDied","Data":"76315438a4a762dabbb5d7e77ae1a8b1b6d345329e45b5b2c2feae26b7df9765"}
Mar 19 11:53:57.219648 master-0 kubenswrapper[4036]: I0319 11:53:57.219577 4036 generic.go:334] "Generic (PLEG): container finished" podID="4705c8f3-89d4-42f2-ad14-e591c5380b9e" containerID="b8135311023b73f1dff2f33a6eb221f0fcb0bdc9e1cd016fb390a5b4192441f6" exitCode=0
Mar 19 11:53:57.219709 master-0 kubenswrapper[4036]: I0319 11:53:57.219677 4036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" event={"ID":"4705c8f3-89d4-42f2-ad14-e591c5380b9e","Type":"ContainerDied","Data":"b8135311023b73f1dff2f33a6eb221f0fcb0bdc9e1cd016fb390a5b4192441f6"}
Mar 19 11:53:57.854026 master-0 kubenswrapper[4036]: I0319 11:53:57.853955 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:53:57.855883 master-0 kubenswrapper[4036]: I0319 11:53:57.855841 4036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 11:53:57.865030 master-0 kubenswrapper[4036]: E0319 11:53:57.864980 4036 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 19 11:53:57.865138 master-0 kubenswrapper[4036]: E0319 11:53:57.865112 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:55:01.865082763 +0000 UTC m=+187.417503358 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : secret "metrics-daemon-secret" not found
Mar 19 11:54:00.588496 master-0 kubenswrapper[4036]: I0319 11:54:00.588424 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:54:00.588496 master-0 kubenswrapper[4036]: I0319 11:54:00.588494 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:54:00.589221 master-0 kubenswrapper[4036]: I0319 11:54:00.588553 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:54:00.589221 master-0 kubenswrapper[4036]: I0319 11:54:00.588583 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:54:00.589221 master-0 kubenswrapper[4036]: E0319 11:54:00.588604 4036 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 11:54:00.589221 master-0 kubenswrapper[4036]: I0319 11:54:00.588628 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:00.589338 master-0 kubenswrapper[4036]: E0319 11:54:00.588698 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.588680367 +0000 UTC m=+134.141100952 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found
Mar 19 11:54:00.589338 master-0 kubenswrapper[4036]: E0319 11:54:00.588742 4036 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 11:54:00.589338 master-0 kubenswrapper[4036]: I0319 11:54:00.589300 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 11:54:00.589428 master-0 kubenswrapper[4036]: E0319 11:54:00.589343 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.589327755 +0000 UTC m=+134.141748340 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found
Mar 19 11:54:00.589428 master-0 kubenswrapper[4036]: E0319 11:54:00.589374 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 11:54:00.589428 master-0 kubenswrapper[4036]: I0319 11:54:00.589378 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:54:00.589428 master-0 kubenswrapper[4036]: E0319 11:54:00.588764 4036 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 19 11:54:00.589428 master-0 kubenswrapper[4036]: E0319 11:54:00.589408 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.589398197 +0000 UTC m=+134.141818782 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found
Mar 19 11:54:00.589587 master-0 kubenswrapper[4036]: E0319 11:54:00.589464 4036 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:54:00.589587 master-0 kubenswrapper[4036]: E0319 11:54:00.589468 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.589415448 +0000 UTC m=+134.141836033 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 19 11:54:00.589587 master-0 kubenswrapper[4036]: E0319 11:54:00.588870 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 11:54:00.589587 master-0 kubenswrapper[4036]: E0319 11:54:00.589540 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.589530291 +0000 UTC m=+134.141950876 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found
Mar 19 11:54:00.589587 master-0 kubenswrapper[4036]: I0319 11:54:00.589521 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:54:00.589587 master-0 kubenswrapper[4036]: I0319 11:54:00.589584 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:00.589775 master-0 kubenswrapper[4036]: E0319 11:54:00.589628 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.589592223 +0000 UTC m=+134.142012998 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found
Mar 19 11:54:00.589775 master-0 kubenswrapper[4036]: E0319 11:54:00.589652 4036 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:54:00.589775 master-0 kubenswrapper[4036]: E0319 11:54:00.589668 4036 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 11:54:00.589775 master-0 kubenswrapper[4036]: I0319 11:54:00.589696 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:00.589775 master-0 kubenswrapper[4036]: E0319 11:54:00.588791 4036 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:54:00.589775 master-0 kubenswrapper[4036]: E0319 11:54:00.589680 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.589673185 +0000 UTC m=+134.142093770 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:54:00.589775 master-0 kubenswrapper[4036]: E0319 11:54:00.589771 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.589764077 +0000 UTC m=+134.142184662 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found
Mar 19 11:54:00.589968 master-0 kubenswrapper[4036]: E0319 11:54:00.589805 4036 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 11:54:00.589968 master-0 kubenswrapper[4036]: I0319 11:54:00.589815 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:00.589968 master-0 kubenswrapper[4036]: E0319 11:54:00.589842 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.589833259 +0000 UTC m=+134.142254034 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found
Mar 19 11:54:00.589968 master-0 kubenswrapper[4036]: I0319 11:54:00.589871 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:54:00.589968 master-0 kubenswrapper[4036]: E0319 11:54:00.589883 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.58986257 +0000 UTC m=+134.142283335 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found
Mar 19 11:54:00.589968 master-0 kubenswrapper[4036]: E0319 11:54:00.589885 4036 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 11:54:00.589968 master-0 kubenswrapper[4036]: E0319 11:54:00.589943 4036 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 19 11:54:00.589968 master-0 kubenswrapper[4036]: E0319 11:54:00.589963 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.589948732 +0000 UTC m=+134.142369337 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found
Mar 19 11:54:00.590173 master-0 kubenswrapper[4036]: E0319 11:54:00.589987 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.589974213 +0000 UTC m=+134.142394988 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found
Mar 19 11:54:00.691413 master-0 kubenswrapper[4036]: I0319 11:54:00.691332 4036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:54:00.691751 master-0 kubenswrapper[4036]: E0319 11:54:00.691704 4036 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 11:54:00.691864 master-0 kubenswrapper[4036]: E0319 11:54:00.691845 4036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:08.691784661 +0000 UTC m=+134.244205246 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found
Mar 19 11:54:01.734011 master-0 systemd[1]: Stopping Kubernetes Kubelet...
Mar 19 11:54:01.793734 master-0 systemd[1]: kubelet.service: Deactivated successfully.
Mar 19 11:54:01.794068 master-0 systemd[1]: Stopped Kubernetes Kubelet.
Mar 19 11:54:01.794542 master-0 systemd[1]: kubelet.service: Consumed 9.947s CPU time.
Mar 19 11:54:01.819668 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 19 11:54:01.944078 master-0 kubenswrapper[7642]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:54:01.944078 master-0 kubenswrapper[7642]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 19 11:54:01.944078 master-0 kubenswrapper[7642]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:54:01.944078 master-0 kubenswrapper[7642]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:54:01.944078 master-0 kubenswrapper[7642]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 19 11:54:01.944078 master-0 kubenswrapper[7642]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 19 11:54:01.945678 master-0 kubenswrapper[7642]: I0319 11:54:01.944202 7642 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 19 11:54:01.947170 master-0 kubenswrapper[7642]: W0319 11:54:01.947142 7642 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 11:54:01.947170 master-0 kubenswrapper[7642]: W0319 11:54:01.947161 7642 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 11:54:01.947170 master-0 kubenswrapper[7642]: W0319 11:54:01.947167 7642 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 11:54:01.947170 master-0 kubenswrapper[7642]: W0319 11:54:01.947173 7642 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947179 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947185 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947190 7642 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947200 7642 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947204 7642 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947208 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947214 7642 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947219 7642 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947224 7642 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947229 7642 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947233 7642 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947237 7642 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947241 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947246 7642 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947250 7642 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947255 7642 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947259 7642 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947263 7642 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 11:54:01.947337 master-0 kubenswrapper[7642]: W0319 11:54:01.947269 7642 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947273 7642 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947276 7642 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947280 7642 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947285 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947289 7642 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947293 7642 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947297 7642 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947301 7642 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947306 7642 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947311 7642 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947316 7642 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947320 7642 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947324 7642 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947329 7642 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947335 7642 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947341 7642 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947346 7642 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947351 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947355 7642 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 11:54:01.947791 master-0 kubenswrapper[7642]: W0319 11:54:01.947360 7642 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947364 7642 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947367 7642 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947371 7642 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947377 7642 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947380 7642 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947384 7642 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947388 7642 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947391 7642 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947395 7642 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947413 7642 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947418 7642 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947422 7642 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947427 7642 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947431 7642 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947435 7642 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947439 7642 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947442 7642 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947446 7642 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947450 7642 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 11:54:01.948242 master-0 kubenswrapper[7642]: W0319 11:54:01.947453 7642 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: W0319 11:54:01.947457
7642 feature_gate.go:330] unrecognized feature gate: Example Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: W0319 11:54:01.947462 7642 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: W0319 11:54:01.947466 7642 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: W0319 11:54:01.947470 7642 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: W0319 11:54:01.947473 7642 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: W0319 11:54:01.947477 7642 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: W0319 11:54:01.947482 7642 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: W0319 11:54:01.947486 7642 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: W0319 11:54:01.947490 7642 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947571 7642 flags.go:64] FLAG: --address="0.0.0.0" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947579 7642 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947592 7642 flags.go:64] FLAG: --anonymous-auth="true" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947597 7642 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947603 7642 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947607 7642 flags.go:64] FLAG: 
--authentication-token-webhook-cache-ttl="2m0s" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947614 7642 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947622 7642 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947628 7642 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947650 7642 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947657 7642 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947663 7642 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 19 11:54:01.948790 master-0 kubenswrapper[7642]: I0319 11:54:01.947668 7642 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947674 7642 flags.go:64] FLAG: --cgroup-root="" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947679 7642 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947683 7642 flags.go:64] FLAG: --client-ca-file="" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947688 7642 flags.go:64] FLAG: --cloud-config="" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947692 7642 flags.go:64] FLAG: --cloud-provider="" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947698 7642 flags.go:64] FLAG: --cluster-dns="[]" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947704 7642 flags.go:64] FLAG: --cluster-domain="" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947709 7642 flags.go:64] FLAG: 
--config="/etc/kubernetes/kubelet.conf" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947715 7642 flags.go:64] FLAG: --config-dir="" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947720 7642 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947725 7642 flags.go:64] FLAG: --container-log-max-files="5" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947744 7642 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947750 7642 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947756 7642 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947762 7642 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947768 7642 flags.go:64] FLAG: --contention-profiling="false" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947774 7642 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947779 7642 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947784 7642 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947789 7642 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947796 7642 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947801 7642 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947806 7642 
flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947811 7642 flags.go:64] FLAG: --enable-load-reader="false" Mar 19 11:54:01.949280 master-0 kubenswrapper[7642]: I0319 11:54:01.947816 7642 flags.go:64] FLAG: --enable-server="true" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947821 7642 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947828 7642 flags.go:64] FLAG: --event-burst="100" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947832 7642 flags.go:64] FLAG: --event-qps="50" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947837 7642 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947842 7642 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947846 7642 flags.go:64] FLAG: --eviction-hard="" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947852 7642 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947856 7642 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947860 7642 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947865 7642 flags.go:64] FLAG: --eviction-soft="" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947870 7642 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947875 7642 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947879 7642 flags.go:64] FLAG: 
--experimental-allocatable-ignore-eviction="false" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947884 7642 flags.go:64] FLAG: --experimental-mounter-path="" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947888 7642 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947892 7642 flags.go:64] FLAG: --fail-swap-on="true" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947898 7642 flags.go:64] FLAG: --feature-gates="" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947904 7642 flags.go:64] FLAG: --file-check-frequency="20s" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947910 7642 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947915 7642 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947920 7642 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947924 7642 flags.go:64] FLAG: --healthz-port="10248" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947929 7642 flags.go:64] FLAG: --help="false" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947934 7642 flags.go:64] FLAG: --hostname-override="" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947938 7642 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 19 11:54:01.949943 master-0 kubenswrapper[7642]: I0319 11:54:01.947943 7642 flags.go:64] FLAG: --http-check-frequency="20s" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.947948 7642 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.947953 7642 flags.go:64] FLAG: --image-credential-provider-config="" Mar 19 11:54:01.950523 master-0 
kubenswrapper[7642]: I0319 11:54:01.947957 7642 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.947962 7642 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.947966 7642 flags.go:64] FLAG: --image-service-endpoint="" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.947971 7642 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.947976 7642 flags.go:64] FLAG: --kube-api-burst="100" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.947980 7642 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.947986 7642 flags.go:64] FLAG: --kube-api-qps="50" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.947991 7642 flags.go:64] FLAG: --kube-reserved="" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.947996 7642 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948000 7642 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948005 7642 flags.go:64] FLAG: --kubelet-cgroups="" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948010 7642 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948014 7642 flags.go:64] FLAG: --lock-file="" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948019 7642 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948024 7642 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948028 7642 flags.go:64] 
FLAG: --log-json-info-buffer-size="0" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948036 7642 flags.go:64] FLAG: --log-json-split-stream="false" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948042 7642 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948047 7642 flags.go:64] FLAG: --log-text-split-stream="false" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948053 7642 flags.go:64] FLAG: --logging-format="text" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948059 7642 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948065 7642 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 19 11:54:01.950523 master-0 kubenswrapper[7642]: I0319 11:54:01.948070 7642 flags.go:64] FLAG: --manifest-url="" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948075 7642 flags.go:64] FLAG: --manifest-url-header="" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948084 7642 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948089 7642 flags.go:64] FLAG: --max-open-files="1000000" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948095 7642 flags.go:64] FLAG: --max-pods="110" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948100 7642 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948104 7642 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948109 7642 flags.go:64] FLAG: --memory-manager-policy="None" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948113 7642 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948118 7642 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948122 7642 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948127 7642 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948138 7642 flags.go:64] FLAG: --node-status-max-images="50" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948142 7642 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948147 7642 flags.go:64] FLAG: --oom-score-adj="-999" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948151 7642 flags.go:64] FLAG: --pod-cidr="" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948156 7642 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948164 7642 flags.go:64] FLAG: --pod-manifest-path="" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948168 7642 flags.go:64] FLAG: --pod-max-pids="-1" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948173 7642 flags.go:64] FLAG: --pods-per-core="0" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948178 7642 flags.go:64] FLAG: --port="10250" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948183 7642 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948187 7642 flags.go:64] FLAG: --provider-id="" Mar 19 
11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948191 7642 flags.go:64] FLAG: --qos-reserved="" Mar 19 11:54:01.951139 master-0 kubenswrapper[7642]: I0319 11:54:01.948196 7642 flags.go:64] FLAG: --read-only-port="10255" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948200 7642 flags.go:64] FLAG: --register-node="true" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948204 7642 flags.go:64] FLAG: --register-schedulable="true" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948209 7642 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948218 7642 flags.go:64] FLAG: --registry-burst="10" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948222 7642 flags.go:64] FLAG: --registry-qps="5" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948226 7642 flags.go:64] FLAG: --reserved-cpus="" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948231 7642 flags.go:64] FLAG: --reserved-memory="" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948236 7642 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948241 7642 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948246 7642 flags.go:64] FLAG: --rotate-certificates="false" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948251 7642 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948256 7642 flags.go:64] FLAG: --runonce="false" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948261 7642 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948266 7642 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948271 7642 flags.go:64] FLAG: --seccomp-default="false" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948275 7642 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948280 7642 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948285 7642 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948289 7642 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948294 7642 flags.go:64] FLAG: --storage-driver-password="root" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948299 7642 flags.go:64] FLAG: --storage-driver-secure="false" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948303 7642 flags.go:64] FLAG: --storage-driver-table="stats" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948308 7642 flags.go:64] FLAG: --storage-driver-user="root" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948312 7642 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 19 11:54:01.951771 master-0 kubenswrapper[7642]: I0319 11:54:01.948317 7642 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948321 7642 flags.go:64] FLAG: --system-cgroups="" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948326 7642 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948333 7642 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948337 7642 
flags.go:64] FLAG: --tls-cert-file="" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948341 7642 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948348 7642 flags.go:64] FLAG: --tls-min-version="" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948352 7642 flags.go:64] FLAG: --tls-private-key-file="" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948356 7642 flags.go:64] FLAG: --topology-manager-policy="none" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948361 7642 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948365 7642 flags.go:64] FLAG: --topology-manager-scope="container" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948370 7642 flags.go:64] FLAG: --v="2" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948376 7642 flags.go:64] FLAG: --version="false" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948395 7642 flags.go:64] FLAG: --vmodule="" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948400 7642 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: I0319 11:54:01.948405 7642 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: W0319 11:54:01.948509 7642 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: W0319 11:54:01.948514 7642 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: W0319 11:54:01.948519 7642 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: W0319 11:54:01.948524 7642 feature_gate.go:330] 
unrecognized feature gate: MachineConfigNodes Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: W0319 11:54:01.948528 7642 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: W0319 11:54:01.948533 7642 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: W0319 11:54:01.948536 7642 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 11:54:01.952607 master-0 kubenswrapper[7642]: W0319 11:54:01.948541 7642 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948545 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948549 7642 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948553 7642 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948557 7642 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948560 7642 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948564 7642 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948568 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948571 7642 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948576 7642 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948581 7642 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948585 7642 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948590 7642 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948594 7642 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948598 7642 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948602 7642 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948605 7642 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948609 7642 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948612 7642 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 11:54:01.953156 master-0 kubenswrapper[7642]: W0319 11:54:01.948616 7642 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948622 7642 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948626 7642 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948629 7642 feature_gate.go:330] 
unrecognized feature gate: VSphereMultiNetworks Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948648 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948653 7642 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948658 7642 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948662 7642 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948666 7642 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948669 7642 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948673 7642 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948677 7642 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948681 7642 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948684 7642 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948688 7642 feature_gate.go:330] unrecognized feature gate: Example Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948692 7642 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948697 7642 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948702 7642 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948706 7642 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948711 7642 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 11:54:01.953709 master-0 kubenswrapper[7642]: W0319 11:54:01.948715 7642 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948719 7642 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948722 7642 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948726 7642 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948730 7642 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948734 7642 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948737 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948743 7642 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948747 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 
11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948751 7642 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948755 7642 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948758 7642 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948762 7642 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948767 7642 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948771 7642 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948775 7642 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948778 7642 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948782 7642 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948786 7642 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948790 7642 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 11:54:01.954323 master-0 kubenswrapper[7642]: W0319 11:54:01.948794 7642 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 11:54:01.954867 master-0 kubenswrapper[7642]: W0319 11:54:01.948798 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController 
Mar 19 11:54:01.954867 master-0 kubenswrapper[7642]: W0319 11:54:01.948803 7642 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 11:54:01.954867 master-0 kubenswrapper[7642]: W0319 11:54:01.948807 7642 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 11:54:01.954867 master-0 kubenswrapper[7642]: W0319 11:54:01.948811 7642 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 11:54:01.954867 master-0 kubenswrapper[7642]: W0319 11:54:01.948815 7642 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 11:54:01.954867 master-0 kubenswrapper[7642]: I0319 11:54:01.948828 7642 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 11:54:01.957994 master-0 kubenswrapper[7642]: I0319 11:54:01.957922 7642 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 19 11:54:01.957994 master-0 kubenswrapper[7642]: I0319 11:54:01.957984 7642 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 19 11:54:01.958102 master-0 kubenswrapper[7642]: W0319 11:54:01.958085 7642 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 11:54:01.958102 master-0 kubenswrapper[7642]: W0319 11:54:01.958095 7642 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 11:54:01.958102 master-0 kubenswrapper[7642]: W0319 11:54:01.958100 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 
11:54:01.958102 master-0 kubenswrapper[7642]: W0319 11:54:01.958106 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958110 7642 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958115 7642 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958119 7642 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958138 7642 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958143 7642 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958147 7642 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958151 7642 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958155 7642 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958159 7642 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958163 7642 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958167 7642 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958172 7642 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 
11:54:01.958176 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958180 7642 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958184 7642 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958187 7642 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958191 7642 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958195 7642 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958199 7642 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 11:54:01.958202 master-0 kubenswrapper[7642]: W0319 11:54:01.958202 7642 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958206 7642 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958211 7642 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958216 7642 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958220 7642 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958225 7642 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958230 7642 feature_gate.go:330] unrecognized feature gate: 
VolumeGroupSnapshot Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958235 7642 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958240 7642 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958250 7642 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958254 7642 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958260 7642 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958266 7642 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958271 7642 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958276 7642 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958281 7642 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958286 7642 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958291 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958296 7642 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 11:54:01.958783 master-0 kubenswrapper[7642]: W0319 11:54:01.958314 7642 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958319 7642 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958323 7642 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958327 7642 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958331 7642 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958335 7642 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958339 7642 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958343 7642 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958347 7642 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 11:54:01.959415 master-0 
kubenswrapper[7642]: W0319 11:54:01.958350 7642 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958354 7642 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958358 7642 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958362 7642 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958368 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958372 7642 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958391 7642 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958395 7642 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958399 7642 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958403 7642 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 11:54:01.959415 master-0 kubenswrapper[7642]: W0319 11:54:01.958406 7642 feature_gate.go:330] unrecognized feature gate: Example Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958411 7642 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958416 7642 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958420 7642 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958424 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958427 7642 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958431 7642 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958435 7642 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958439 7642 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958443 7642 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958448 7642 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: I0319 11:54:01.958456 7642 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 
11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958621 7642 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958643 7642 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958648 7642 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 11:54:01.960021 master-0 kubenswrapper[7642]: W0319 11:54:01.958652 7642 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958656 7642 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958660 7642 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958665 7642 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958669 7642 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958672 7642 feature_gate.go:330] unrecognized feature gate: Example Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958676 7642 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958680 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958684 7642 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958687 7642 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958691 
7642 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958696 7642 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958703 7642 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958725 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958731 7642 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958735 7642 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958740 7642 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958744 7642 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958749 7642 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958753 7642 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 11:54:01.960480 master-0 kubenswrapper[7642]: W0319 11:54:01.958757 7642 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958761 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958765 7642 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958769 
7642 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958773 7642 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958778 7642 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958801 7642 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958807 7642 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958811 7642 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958815 7642 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958821 7642 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958826 7642 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958830 7642 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958834 7642 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958838 7642 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958842 7642 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958847 7642 
feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958850 7642 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958854 7642 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958858 7642 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 11:54:01.961106 master-0 kubenswrapper[7642]: W0319 11:54:01.958879 7642 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958883 7642 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958887 7642 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958891 7642 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958896 7642 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958901 7642 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958906 7642 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958909 7642 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958913 7642 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958918 7642 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958924 7642 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958929 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958933 7642 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958937 7642 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958942 7642 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958945 7642 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958965 7642 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958970 7642 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958974 7642 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 11:54:01.961729 master-0 kubenswrapper[7642]: W0319 11:54:01.958977 7642 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: W0319 11:54:01.958981 7642 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: W0319 11:54:01.958985 7642 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: W0319 11:54:01.958989 7642 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: W0319 11:54:01.958993 7642 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: W0319 11:54:01.958996 7642 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: W0319 11:54:01.959000 7642 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: W0319 11:54:01.959013 7642 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: W0319 
11:54:01.959017 7642 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: W0319 11:54:01.959021 7642 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: I0319 11:54:01.959044 7642 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 11:54:01.962344 master-0 kubenswrapper[7642]: I0319 11:54:01.959515 7642 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 11:54:01.963352 master-0 kubenswrapper[7642]: I0319 11:54:01.963315 7642 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Mar 19 11:54:01.963828 master-0 kubenswrapper[7642]: I0319 11:54:01.963709 7642 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 19 11:54:01.964146 master-0 kubenswrapper[7642]: I0319 11:54:01.964081 7642 server.go:997] "Starting client certificate rotation" Mar 19 11:54:01.964146 master-0 kubenswrapper[7642]: I0319 11:54:01.964107 7642 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 19 11:54:01.964672 master-0 kubenswrapper[7642]: I0319 11:54:01.964382 7642 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 11:43:57 +0000 UTC, rotation deadline is 2026-03-20 06:08:14.089252123 +0000 UTC Mar 19 11:54:01.964672 master-0 kubenswrapper[7642]: I0319 11:54:01.964490 7642 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 18h14m12.124765188s for next certificate rotation Mar 19 11:54:01.965364 master-0 kubenswrapper[7642]: I0319 11:54:01.965334 7642 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 11:54:01.969290 master-0 kubenswrapper[7642]: I0319 11:54:01.969220 7642 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 11:54:01.975905 master-0 kubenswrapper[7642]: I0319 11:54:01.975849 7642 log.go:25] "Validated CRI v1 runtime API" Mar 19 11:54:01.979068 master-0 kubenswrapper[7642]: I0319 11:54:01.979035 7642 log.go:25] "Validated CRI v1 image API" Mar 19 11:54:01.980184 master-0 kubenswrapper[7642]: I0319 11:54:01.980148 7642 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 19 11:54:01.983747 master-0 kubenswrapper[7642]: I0319 11:54:01.983664 7642 fs.go:135] Filesystem UUIDs: map[2be6375c-780c-4b64-a018-3b2c170f9318:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Mar 19 11:54:01.984167 master-0 kubenswrapper[7642]: I0319 11:54:01.983704 7642 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 
minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/05ec088c819ae1eeb5148aff6d70b93f3b441541316091762e2b98d617be204f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/05ec088c819ae1eeb5148aff6d70b93f3b441541316091762e2b98d617be204f/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e/userdata/shm major:0 minor:246 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/166b56b9c6c1434919d567d8df963a4c549223c203f79070beac456f5b57a166/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/166b56b9c6c1434919d567d8df963a4c549223c203f79070beac456f5b57a166/userdata/shm major:0 minor:100 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/28c9e269179e8f2b2c8470e0dc5902c0174b162f0c6e40253ff15a0c8e960364/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/28c9e269179e8f2b2c8470e0dc5902c0174b162f0c6e40253ff15a0c8e960364/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae/userdata/shm major:0 
minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/46de87bd542fc48b4c2304420a056b910916afb6d58cd22606b6fcceab380367/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/46de87bd542fc48b4c2304420a056b910916afb6d58cd22606b6fcceab380367/userdata/shm major:0 minor:267 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b/userdata/shm major:0 minor:104 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/488f50284501a5ea1b55784b6ea985665fdedb86fee068aba655477316aea17a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/488f50284501a5ea1b55784b6ea985665fdedb86fee068aba655477316aea17a/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4cea9c5260823fbd81d53a50c86a7838135f8c5ff405a05f4d12cdbb4fd9069e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4cea9c5260823fbd81d53a50c86a7838135f8c5ff405a05f4d12cdbb4fd9069e/userdata/shm major:0 minor:280 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6e2ac089de599dfd6512e3d47cbeeed8ac50e52fa04720c3256b653ca97af81a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6e2ac089de599dfd6512e3d47cbeeed8ac50e52fa04720c3256b653ca97af81a/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7478a36d7c10b347071336898ec1420b6e593642e73df5491b35d12f5dad7fb8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7478a36d7c10b347071336898ec1420b6e593642e73df5491b35d12f5dad7fb8/userdata/shm major:0 minor:240 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43/userdata/shm major:0 minor:250 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9246f0557504641b26756d5dcc3facbda0dfa6d1148c91b577840d8631f86b2e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9246f0557504641b26756d5dcc3facbda0dfa6d1148c91b577840d8631f86b2e/userdata/shm major:0 minor:243 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/97aaa7810039a211ed9eb18ec086e48818b5f53c2190c323cb9dff4622d57e09/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/97aaa7810039a211ed9eb18ec086e48818b5f53c2190c323cb9dff4622d57e09/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b179219ef24884bfcb110bf155456a2ca0b90c733780b9ebce585f74be18971b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b179219ef24884bfcb110bf155456a2ca0b90c733780b9ebce585f74be18971b/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/d0c18b672eeccf56227de74bfddbdb7bc277e8432c83dff62ae2a878becfbd41/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d0c18b672eeccf56227de74bfddbdb7bc277e8432c83dff62ae2a878becfbd41/userdata/shm major:0 minor:255 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d8f0e6a1ec121bbc7ed73597f1167b65da5617d6a824495c565577e16dc06d7d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d8f0e6a1ec121bbc7ed73597f1167b65da5617d6a824495c565577e16dc06d7d/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/eb1ab82ad9ee6b742e302a5269a9952a1cd4daf4667abd8007cfbd42d95d64c7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/eb1ab82ad9ee6b742e302a5269a9952a1cd4daf4667abd8007cfbd42d95d64c7/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc/userdata/shm major:0 minor:264 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768/userdata/shm major:0 minor:282 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/kube-api-access-bwp24:{mountpoint:/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/kube-api-access-bwp24 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc/volumes/kubernetes.io~projected/kube-api-access-9kf8c:{mountpoint:/var/lib/kubelet/pods/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc/volumes/kubernetes.io~projected/kube-api-access-9kf8c major:0 minor:95 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/131e2573-ef74-4963-bdcf-901310e7a486/volumes/kubernetes.io~projected/kube-api-access-4sprd:{mountpoint:/var/lib/kubelet/pods/131e2573-ef74-4963-bdcf-901310e7a486/volumes/kubernetes.io~projected/kube-api-access-4sprd major:0 minor:248 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~projected/kube-api-access-h95pm:{mountpoint:/var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~projected/kube-api-access-h95pm major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33f6e54f-a750-4c91-b4f5-049275d33cfb/volumes/kubernetes.io~projected/kube-api-access-f7qsg:{mountpoint:/var/lib/kubelet/pods/33f6e54f-a750-4c91-b4f5-049275d33cfb/volumes/kubernetes.io~projected/kube-api-access-f7qsg major:0 minor:242 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~projected/kube-api-access major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~secret/serving-cert major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39f479b3-45ff-43f9-ae85-083554a54918/volumes/kubernetes.io~projected/kube-api-access-79fmb:{mountpoint:/var/lib/kubelet/pods/39f479b3-45ff-43f9-ae85-083554a54918/volumes/kubernetes.io~projected/kube-api-access-79fmb major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3e1b71a7-c9fd-4f03-8d0c-f092dac80989/volumes/kubernetes.io~projected/kube-api-access-mgr4k:{mountpoint:/var/lib/kubelet/pods/3e1b71a7-c9fd-4f03-8d0c-f092dac80989/volumes/kubernetes.io~projected/kube-api-access-mgr4k major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~projected/kube-api-access-8cvpt:{mountpoint:/var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~projected/kube-api-access-8cvpt major:0 minor:257 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~secret/serving-cert major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~projected/kube-api-access-bssz9:{mountpoint:/var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~projected/kube-api-access-bssz9 major:0 minor:227 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~projected/kube-api-access-f7dp9:{mountpoint:/var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~projected/kube-api-access-f7dp9 major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~projected/kube-api-access-4khlg:{mountpoint:/var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~projected/kube-api-access-4khlg major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~projected/kube-api-access-j8mds:{mountpoint:/var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~projected/kube-api-access-j8mds major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~secret/webhook-cert major:0 minor:138 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5ae2d717-506b-4d8e-803b-689c1cc3557c/volumes/kubernetes.io~projected/kube-api-access-s2v9q:{mountpoint:/var/lib/kubelet/pods/5ae2d717-506b-4d8e-803b-689c1cc3557c/volumes/kubernetes.io~projected/kube-api-access-s2v9q major:0 minor:258 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b46c6b4-f701-4a7f-bf74-90afe14ad89a/volumes/kubernetes.io~projected/kube-api-access-s98qv:{mountpoint:/var/lib/kubelet/pods/5b46c6b4-f701-4a7f-bf74-90afe14ad89a/volumes/kubernetes.io~projected/kube-api-access-s98qv major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/60f846d9-1b3d-4b9d-b3e0-90a238e02640/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/60f846d9-1b3d-4b9d-b3e0-90a238e02640/volumes/kubernetes.io~projected/kube-api-access major:0 minor:91 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/64136215-8539-4349-977f-6d59deb132ea/volumes/kubernetes.io~projected/kube-api-access-cgtrh:{mountpoint:/var/lib/kubelet/pods/64136215-8539-4349-977f-6d59deb132ea/volumes/kubernetes.io~projected/kube-api-access-cgtrh major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~projected/kube-api-access-f99wt:{mountpoint:/var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~projected/kube-api-access-f99wt major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e8c4676-05af-4478-a44b-1bdcbefdea0f/volumes/kubernetes.io~projected/kube-api-access-pdmj8:{mountpoint:/var/lib/kubelet/pods/6e8c4676-05af-4478-a44b-1bdcbefdea0f/volumes/kubernetes.io~projected/kube-api-access-pdmj8 major:0 minor:245 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~projected/kube-api-access-mm4kn:{mountpoint:/var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~projected/kube-api-access-mm4kn major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~projected/kube-api-access-k9vzl:{mountpoint:/var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~projected/kube-api-access-k9vzl major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/94fa5656-0561-45ab-9ab1-a6b96b8a47d4/volumes/kubernetes.io~projected/kube-api-access-m6rhj:{mountpoint:/var/lib/kubelet/pods/94fa5656-0561-45ab-9ab1-a6b96b8a47d4/volumes/kubernetes.io~projected/kube-api-access-m6rhj major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/99ffcc9a-4b01-439a-93e0-0674bdfd32ec/volumes/kubernetes.io~projected/kube-api-access-jd875:{mountpoint:/var/lib/kubelet/pods/99ffcc9a-4b01-439a-93e0-0674bdfd32ec/volumes/kubernetes.io~projected/kube-api-access-jd875 major:0 minor:266 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~projected/kube-api-access-scp9p:{mountpoint:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~projected/kube-api-access-scp9p major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/etcd-client major:0 minor:216 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~projected/kube-api-access major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~projected/kube-api-access-w4ms4:{mountpoint:/var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~projected/kube-api-access-w4ms4 major:0 minor:99 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~projected/kube-api-access major:0 minor:272 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~secret/serving-cert major:0 minor:236 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c479beae-6d34-4d49-81c8-39f6a53ff6ca/volumes/kubernetes.io~projected/kube-api-access-xxm2c:{mountpoint:/var/lib/kubelet/pods/c479beae-6d34-4d49-81c8-39f6a53ff6ca/volumes/kubernetes.io~projected/kube-api-access-xxm2c major:0 minor:112 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~projected/kube-api-access-rbwcn:{mountpoint:/var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~projected/kube-api-access-rbwcn major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~projected/kube-api-access-bptbc:{mountpoint:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~projected/kube-api-access-bptbc major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~projected/kube-api-access-r2knz:{mountpoint:/var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~projected/kube-api-access-r2knz major:0 minor:230 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/kube-api-access-g4qb9:{mountpoint:/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/kube-api-access-g4qb9 major:0 minor:228 fsType:tmpfs blockSize:0} overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/d917e377248c30f62f7da9ba62c5fc95eb75fd02efc0e9ced1feff4e6935c296/merged major:0 minor:102 fsType:overlay blockSize:0} overlay_0-106:{mountpoint:/var/lib/containers/storage/overlay/58c9a980766cae75ec98614bc5da33db1e02e496a0a4c2a5900718b4b9cf5fa0/merged major:0 minor:106 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/6168d5eb391ce981df99b4e66aeb71ec733d83f3b77499c1938e001a319d7d02/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-110:{mountpoint:/var/lib/containers/storage/overlay/d7bf2c6bdb9caf036748211c2602e34322b65d26334b1622420475fe24e7ce04/merged major:0 minor:110 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/ae842a80668b607b2b1d8a6d646dba1b2ff61472b4fbceec573c1789ae7746c7/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/c364db9b5ec2e23a7ca51fd1a45ca28b58092c3f75089af91fc2ea75e25c6865/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/6a08985844c171dd83a137ccdcd2445ade44cb0466c3350af2729cbb0d36e5fe/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/4506c3c2463571055478e81b7e23c733eddc2a262491d1b8139b95082423d7cb/merged major:0 minor:136 fsType:overlay blockSize:0} 
overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/51bfdd0036c8f3b785db4a475c8b291d36eb4da16166e28fd94f78e03de156e4/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/0af64f21e44fbc8bc6cd6d0ffadeb1f71f78f1d33abfd3bb076167a88aa7c362/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/6b522640855d5949c032946e3f9e84fb20bfdf5e730db90d04aa19a775de4c1c/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/91bf1ca92dbe7a115314c0d3277cf2f6addda4cf36ef6d925a583d5abd8f25c1/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/9c9660e4788d8dbd8141c36ead481b824e93a8a30d42d08de9d647a8159f36c4/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-164:{mountpoint:/var/lib/containers/storage/overlay/b4d22d66b3f688b30bd49b5431350e3275a0b6d3396643a17769fb69563ce364/merged major:0 minor:164 fsType:overlay blockSize:0} overlay_0-169:{mountpoint:/var/lib/containers/storage/overlay/a7b889d8fcfac359660511099579b38b486f85c74a56c0a490ffd3ddd1b941ca/merged major:0 minor:169 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/86bcfb54d0ebebfcbea84507b3fd849be7f29404f9f3153891ae362329302178/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/62b57305bec267610d814b22089e9af9cebdf78a3be111b2e8c890d3f0fb0a10/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/580bbdbac1fa2341244890abe2ed7ea25ec11cb5b4885d5d984433a580a72c5f/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/6c2fffb27650c94cc1e40da7add4512c301aadfe8e4d39e6d83637f470fe7841/merged major:0 minor:189 fsType:overlay blockSize:0} 
overlay_0-190:{mountpoint:/var/lib/containers/storage/overlay/7c31b1e3c0b48b9fc53c43991516321945e9c687c989c580d4a23528a43d8f84/merged major:0 minor:190 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/dd6ad5a77649dd26869152c630b639d6de573f816ec5222888ae35635d58f6ed/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-196:{mountpoint:/var/lib/containers/storage/overlay/46ac5b0c70623c03cbf3cb39afec281d52a9c378cd6f12129adaba4521bcfd1f/merged major:0 minor:196 fsType:overlay blockSize:0} overlay_0-269:{mountpoint:/var/lib/containers/storage/overlay/dd65a49e2ad06abfedb89a9e28c0ae78ca7093ca2804e5d616f7643467b7cce7/merged major:0 minor:269 fsType:overlay blockSize:0} overlay_0-274:{mountpoint:/var/lib/containers/storage/overlay/239d5dcdfcb236056a8fe4c5e78c36e670676fd531f3d21c298046bb8716a606/merged major:0 minor:274 fsType:overlay blockSize:0} overlay_0-276:{mountpoint:/var/lib/containers/storage/overlay/56ab318dc593586f9bacd502746bb9a308126da7b6ec35fa1a977fbd0e17c0a6/merged major:0 minor:276 fsType:overlay blockSize:0} overlay_0-278:{mountpoint:/var/lib/containers/storage/overlay/6c4f67cfa152e398959eacda2c282a56756c8041135a5eac37531bd5a941b905/merged major:0 minor:278 fsType:overlay blockSize:0} overlay_0-284:{mountpoint:/var/lib/containers/storage/overlay/bc67427313d433eba83faf5562a070bb0a493518e9f8af00c6dcc17a19b4021b/merged major:0 minor:284 fsType:overlay blockSize:0} overlay_0-286:{mountpoint:/var/lib/containers/storage/overlay/e509617f123eaa94cb4b2d2431aeba2971cb9d9d05f755477395c05625ef1ece/merged major:0 minor:286 fsType:overlay blockSize:0} overlay_0-288:{mountpoint:/var/lib/containers/storage/overlay/a5f6933afcb7994ae9a7c2fb192cad0cf94747d1914805fb0030e87d52323f93/merged major:0 minor:288 fsType:overlay blockSize:0} overlay_0-290:{mountpoint:/var/lib/containers/storage/overlay/bb5c22f5faee6990a15350904223425aa1afffe899a815721902bdb9ff602bd9/merged major:0 minor:290 fsType:overlay blockSize:0} 
overlay_0-292:{mountpoint:/var/lib/containers/storage/overlay/3dfbc0973edfc5f12eaa4cd97919b66e3ce90e9da0147a0254d94971ec3359a8/merged major:0 minor:292 fsType:overlay blockSize:0} overlay_0-294:{mountpoint:/var/lib/containers/storage/overlay/edab676349d6c24c4d041e7f04d1efbf7e5f70dd1a338d0934556525b98e4122/merged major:0 minor:294 fsType:overlay blockSize:0} overlay_0-296:{mountpoint:/var/lib/containers/storage/overlay/8621576be6c6580fd7e9ca3bd96e067439c1161f070ffb692604c4ef0df9752f/merged major:0 minor:296 fsType:overlay blockSize:0} overlay_0-298:{mountpoint:/var/lib/containers/storage/overlay/3ffc052528896d9a3fb508202bb5311b057562a94985655e80d69eaaf20e3588/merged major:0 minor:298 fsType:overlay blockSize:0} overlay_0-300:{mountpoint:/var/lib/containers/storage/overlay/45b5535f80449197995fc6f7b130eeaedbb7e13b67f5baa50c1b36da6c088a25/merged major:0 minor:300 fsType:overlay blockSize:0} overlay_0-302:{mountpoint:/var/lib/containers/storage/overlay/d08e47a6c14577bb9e091d973518c756075e89d1ea4a66f0d01502b1420dc84b/merged major:0 minor:302 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/469c21b8cc736098b55ce3dfde2e99af4cc8d296e302803a2616707025d1c1b0/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/62e89d3eee24785e2ab102a777487f62cce232b845747f0fea0eec51737313ba/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/fdb0146dc3d40db67beaf73c7cf14c36c3912aa50521ca38ceefd4305daef3c7/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/a6ffa69cf00d8de322471725025c67c07412734fec01971b150e44b7ed01a323/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/88609a6bfd352d523f192544eacab2c48f18fb47cc9d98a9542afeaa5a644542/merged major:0 minor:60 fsType:overlay blockSize:0} 
overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/1d80273b80b7e69497e83871910d4fead5ff1c2badf74614d9df5457664db622/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/edc28b5e74b8f5c8c9e83667dd51b57f7bca129e50b49c9762034696444646cf/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/b93f730db47cd4bebbb0801a736c7e7c9ba119a791bbaa7bad7a736cbf12936c/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/d1d4621c98a8612318c5ee01da90e28f1a29792cb26c213d0d07484e3784a72e/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/fed88be55291171a909364b4f2d78d04e51ea1aa59fe96d2cef4cf67e895b438/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/a421fe3af3bdce536a71f2892873df61c9569673540d2435702e9f7c9cd452b4/merged major:0 minor:84 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/7da364149ad4b09a783a85512bfa3018961553909f6d439099e6b2bc42d85ccf/merged major:0 minor:89 fsType:overlay blockSize:0}] Mar 19 11:54:02.001138 master-0 kubenswrapper[7642]: I0319 11:54:02.000626 7642 manager.go:217] Machine: {Timestamp:2026-03-19 11:54:01.999893568 +0000 UTC m=+0.117334385 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:d9abaffc4eed451b8d7e70f7431f4a5e SystemUUID:d9abaffc-4eed-451b-8d7e-70f7431f4a5e BootID:35523968-247e-452c-b4d2-5f84e7547d23 Filesystems:[{Device:/var/lib/kubelet/pods/99ffcc9a-4b01-439a-93e0-0674bdfd32ec/volumes/kubernetes.io~projected/kube-api-access-jd875 DeviceMajor:0 
DeviceMinor:266 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/60f846d9-1b3d-4b9d-b3e0-90a238e02640/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:91 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/kube-api-access-g4qb9 DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/39f479b3-45ff-43f9-ae85-083554a54918/volumes/kubernetes.io~projected/kube-api-access-79fmb DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/33f6e54f-a750-4c91-b4f5-049275d33cfb/volumes/kubernetes.io~projected/kube-api-access-f7qsg DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:272 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-302 DeviceMajor:0 DeviceMinor:302 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~projected/kube-api-access-mm4kn DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-300 DeviceMajor:0 DeviceMinor:300 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b179219ef24884bfcb110bf155456a2ca0b90c733780b9ebce585f74be18971b/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~projected/kube-api-access-h95pm DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43/userdata/shm DeviceMajor:0 DeviceMinor:250 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/05ec088c819ae1eeb5148aff6d70b93f3b441541316091762e2b98d617be204f/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:253 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4cea9c5260823fbd81d53a50c86a7838135f8c5ff405a05f4d12cdbb4fd9069e/userdata/shm DeviceMajor:0 DeviceMinor:280 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-296 DeviceMajor:0 DeviceMinor:296 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~projected/kube-api-access-bptbc DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/488f50284501a5ea1b55784b6ea985665fdedb86fee068aba655477316aea17a/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~projected/kube-api-access-scp9p DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/166b56b9c6c1434919d567d8df963a4c549223c203f79070beac456f5b57a166/userdata/shm DeviceMajor:0 DeviceMinor:100 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-169 DeviceMajor:0 DeviceMinor:169 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5b46c6b4-f701-4a7f-bf74-90afe14ad89a/volumes/kubernetes.io~projected/kube-api-access-s98qv DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-269 DeviceMajor:0 DeviceMinor:269 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/28c9e269179e8f2b2c8470e0dc5902c0174b162f0c6e40253ff15a0c8e960364/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~projected/kube-api-access-j8mds DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-276 DeviceMajor:0 DeviceMinor:276 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-290 DeviceMajor:0 DeviceMinor:290 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b/userdata/shm DeviceMajor:0 DeviceMinor:104 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5ae2d717-506b-4d8e-803b-689c1cc3557c/volumes/kubernetes.io~projected/kube-api-access-s2v9q DeviceMajor:0 DeviceMinor:258 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/7478a36d7c10b347071336898ec1420b6e593642e73df5491b35d12f5dad7fb8/userdata/shm DeviceMajor:0 DeviceMinor:240 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768/userdata/shm DeviceMajor:0 DeviceMinor:282 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~projected/kube-api-access-8cvpt DeviceMajor:0 DeviceMinor:257 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/97aaa7810039a211ed9eb18ec086e48818b5f53c2190c323cb9dff4622d57e09/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~projected/kube-api-access-4khlg DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-274 DeviceMajor:0 DeviceMinor:274 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/eb1ab82ad9ee6b742e302a5269a9952a1cd4daf4667abd8007cfbd42d95d64c7/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~projected/kube-api-access-r2knz DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc/userdata/shm DeviceMajor:0 DeviceMinor:264 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6e2ac089de599dfd6512e3d47cbeeed8ac50e52fa04720c3256b653ca97af81a/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c479beae-6d34-4d49-81c8-39f6a53ff6ca/volumes/kubernetes.io~projected/kube-api-access-xxm2c DeviceMajor:0 DeviceMinor:112 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/64136215-8539-4349-977f-6d59deb132ea/volumes/kubernetes.io~projected/kube-api-access-cgtrh DeviceMajor:0 DeviceMinor:260 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-106 DeviceMajor:0 DeviceMinor:106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-110 DeviceMajor:0 DeviceMinor:110 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc/volumes/kubernetes.io~projected/kube-api-access-9kf8c DeviceMajor:0 DeviceMinor:95 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/46de87bd542fc48b4c2304420a056b910916afb6d58cd22606b6fcceab380367/userdata/shm DeviceMajor:0 DeviceMinor:267 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~projected/kube-api-access-bssz9 DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d8f0e6a1ec121bbc7ed73597f1167b65da5617d6a824495c565577e16dc06d7d/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~projected/kube-api-access-f99wt DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/kube-api-access-bwp24 DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-292 DeviceMajor:0 DeviceMinor:292 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-190 DeviceMajor:0 DeviceMinor:190 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-288 DeviceMajor:0 DeviceMinor:288 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3e1b71a7-c9fd-4f03-8d0c-f092dac80989/volumes/kubernetes.io~projected/kube-api-access-mgr4k DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~projected/kube-api-access-k9vzl DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/131e2573-ef74-4963-bdcf-901310e7a486/volumes/kubernetes.io~projected/kube-api-access-4sprd DeviceMajor:0 DeviceMinor:248 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-298 DeviceMajor:0 DeviceMinor:298 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d0c18b672eeccf56227de74bfddbdb7bc277e8432c83dff62ae2a878becfbd41/userdata/shm DeviceMajor:0 DeviceMinor:255 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6e8c4676-05af-4478-a44b-1bdcbefdea0f/volumes/kubernetes.io~projected/kube-api-access-pdmj8 DeviceMajor:0 DeviceMinor:245 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-164 DeviceMajor:0 DeviceMinor:164 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~projected/kube-api-access-rbwcn DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-284 DeviceMajor:0 DeviceMinor:284 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-286 DeviceMajor:0 DeviceMinor:286 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-294 DeviceMajor:0 DeviceMinor:294 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-278 DeviceMajor:0 DeviceMinor:278 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~projected/kube-api-access-f7dp9 DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e/userdata/shm DeviceMajor:0 DeviceMinor:246 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9246f0557504641b26756d5dcc3facbda0dfa6d1148c91b577840d8631f86b2e/userdata/shm DeviceMajor:0 DeviceMinor:243 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/94fa5656-0561-45ab-9ab1-a6b96b8a47d4/volumes/kubernetes.io~projected/kube-api-access-m6rhj DeviceMajor:0 DeviceMinor:249 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~projected/kube-api-access-w4ms4 DeviceMajor:0 DeviceMinor:99 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-196 DeviceMajor:0 DeviceMinor:196 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 
Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:153273eefe31c48 MacAddress:e6:e4:95:17:0b:db Speed:10000 Mtu:8900} {Name:28c9e269179e8f2 MacAddress:be:f0:95:e6:83:08 Speed:10000 Mtu:8900} {Name:4cea9c5260823fb MacAddress:fa:f0:04:57:72:5f Speed:10000 Mtu:8900} {Name:7478a36d7c10b34 MacAddress:0e:20:8f:2a:0c:c8 Speed:10000 Mtu:8900} {Name:781654899d52ad0 MacAddress:b6:ed:81:c6:6f:6b Speed:10000 Mtu:8900} {Name:9246f0557504641 MacAddress:2a:bd:55:61:2b:62 Speed:10000 Mtu:8900} {Name:b179219ef24884b MacAddress:da:62:64:19:6d:13 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:f2:de:5d:0a:11:40 Speed:0 Mtu:8900} {Name:d0c18b672eeccf5 MacAddress:56:70:41:75:96:db Speed:10000 Mtu:8900} {Name:d8f0e6a1ec121bb MacAddress:86:3c:51:f2:fc:6c Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:a8:4b:cf Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:ea:ac:db Speed:-1 Mtu:9000} {Name:f66b3d91451352d MacAddress:8e:db:73:e8:d5:76 Speed:10000 Mtu:8900} {Name:f69cbe55fe655a8 MacAddress:d2:b6:07:b1:6f:3d Speed:10000 Mtu:8900} {Name:fc7608745c820a5 MacAddress:16:27:30:38:ba:65 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:4a:5a:35:ca:89:5d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] 
Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified 
Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 11:54:02.001138 master-0 kubenswrapper[7642]: I0319 11:54:02.001101 7642 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 19 11:54:02.001687 master-0 kubenswrapper[7642]: I0319 11:54:02.001285 7642 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 19 11:54:02.001729 master-0 kubenswrapper[7642]: I0319 11:54:02.001702 7642 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 19 11:54:02.001961 master-0 kubenswrapper[7642]: I0319 11:54:02.001891 7642 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 19 11:54:02.002270 master-0 kubenswrapper[7642]: I0319 11:54:02.001953 7642 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 11:54:02.002332 master-0 kubenswrapper[7642]: I0319 11:54:02.002295 7642 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 11:54:02.002332 master-0 kubenswrapper[7642]: I0319 11:54:02.002306 7642 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 11:54:02.002332 master-0 kubenswrapper[7642]: I0319 11:54:02.002316 7642 manager.go:142] 
"Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 11:54:02.002537 master-0 kubenswrapper[7642]: I0319 11:54:02.002340 7642 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 11:54:02.002813 master-0 kubenswrapper[7642]: I0319 11:54:02.002786 7642 state_mem.go:36] "Initialized new in-memory state store" Mar 19 11:54:02.002954 master-0 kubenswrapper[7642]: I0319 11:54:02.002930 7642 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 11:54:02.003061 master-0 kubenswrapper[7642]: I0319 11:54:02.003011 7642 kubelet.go:418] "Attempting to sync node with API server" Mar 19 11:54:02.003061 master-0 kubenswrapper[7642]: I0319 11:54:02.003032 7642 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 11:54:02.003061 master-0 kubenswrapper[7642]: I0319 11:54:02.003049 7642 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 11:54:02.003184 master-0 kubenswrapper[7642]: I0319 11:54:02.003069 7642 kubelet.go:324] "Adding apiserver pod source" Mar 19 11:54:02.003272 master-0 kubenswrapper[7642]: I0319 11:54:02.003258 7642 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 19 11:54:02.008062 master-0 kubenswrapper[7642]: I0319 11:54:02.008022 7642 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 19 11:54:02.008363 master-0 kubenswrapper[7642]: I0319 11:54:02.008344 7642 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 19 11:54:02.008761 master-0 kubenswrapper[7642]: I0319 11:54:02.008730 7642 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008884 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008907 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008916 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008924 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008932 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008939 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008947 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008953 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008961 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008968 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008978 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 11:54:02.008975 master-0 kubenswrapper[7642]: I0319 11:54:02.008991 7642 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/local-volume" Mar 19 11:54:02.009367 master-0 kubenswrapper[7642]: I0319 11:54:02.009021 7642 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 19 11:54:02.009446 master-0 kubenswrapper[7642]: I0319 11:54:02.009424 7642 server.go:1280] "Started kubelet" Mar 19 11:54:02.011377 master-0 kubenswrapper[7642]: I0319 11:54:02.009541 7642 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 11:54:02.011377 master-0 kubenswrapper[7642]: I0319 11:54:02.009621 7642 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 11:54:02.011377 master-0 kubenswrapper[7642]: I0319 11:54:02.009839 7642 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 19 11:54:02.011377 master-0 kubenswrapper[7642]: I0319 11:54:02.010333 7642 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 11:54:02.011377 master-0 kubenswrapper[7642]: I0319 11:54:02.011201 7642 server.go:449] "Adding debug handlers to kubelet server" Mar 19 11:54:02.011021 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 19 11:54:02.016061 master-0 kubenswrapper[7642]: I0319 11:54:02.013284 7642 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 11:54:02.019684 master-0 kubenswrapper[7642]: I0319 11:54:02.019525 7642 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 11:54:02.022817 master-0 kubenswrapper[7642]: I0319 11:54:02.022770 7642 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 19 11:54:02.022817 master-0 kubenswrapper[7642]: I0319 11:54:02.022816 7642 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 19 11:54:02.023123 master-0 kubenswrapper[7642]: I0319 11:54:02.023024 7642 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 11:43:57 +0000 UTC, rotation deadline is 2026-03-20 05:29:07.555527751 +0000 UTC
Mar 19 11:54:02.023123 master-0 kubenswrapper[7642]: I0319 11:54:02.023072 7642 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h35m5.532459586s for next certificate rotation
Mar 19 11:54:02.024202 master-0 kubenswrapper[7642]: I0319 11:54:02.023248 7642 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 19 11:54:02.024202 master-0 kubenswrapper[7642]: I0319 11:54:02.023263 7642 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 19 11:54:02.024202 master-0 kubenswrapper[7642]: I0319 11:54:02.023402 7642 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 19 11:54:02.025879 master-0 kubenswrapper[7642]: I0319 11:54:02.025843 7642 factory.go:153] Registering CRI-O factory
Mar 19 11:54:02.025879 master-0 kubenswrapper[7642]: I0319 11:54:02.025878 7642 factory.go:221] Registration of the crio container factory successfully
Mar 19 11:54:02.026066 master-0 kubenswrapper[7642]: I0319 11:54:02.026047 7642 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 19 11:54:02.026103 master-0 kubenswrapper[7642]: I0319 11:54:02.026066 7642 factory.go:55] Registering systemd factory
Mar 19 11:54:02.026103 master-0 kubenswrapper[7642]: I0319 11:54:02.026080 7642 factory.go:221] Registration of the systemd container factory successfully
Mar 19 11:54:02.026362 master-0 kubenswrapper[7642]: I0319 11:54:02.026217 7642 factory.go:103] Registering Raw factory
Mar 19 11:54:02.026362 master-0 kubenswrapper[7642]: I0319 11:54:02.026241 7642 manager.go:1196] Started watching for new ooms in manager
Mar 19 11:54:02.027951 master-0 kubenswrapper[7642]: I0319 11:54:02.026917 7642 manager.go:319] Starting recovery of all containers
Mar 19 11:54:02.027951 master-0 kubenswrapper[7642]: I0319 11:54:02.027127 7642 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 11:54:02.029907 master-0 kubenswrapper[7642]: I0319 11:54:02.029782 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51e2c1d1-a50f-4a0d-8273-440e699b3907" volumeName="kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-ovnkube-identity-cm" seLinuxMountContext=""
Mar 19 11:54:02.029907 master-0 kubenswrapper[7642]: I0319 11:54:02.029840 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51e2c1d1-a50f-4a0d-8273-440e699b3907" volumeName="kubernetes.io/projected/51e2c1d1-a50f-4a0d-8273-440e699b3907-kube-api-access-j8mds" seLinuxMountContext=""
Mar 19 11:54:02.029907 master-0 kubenswrapper[7642]: I0319 11:54:02.029858 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="99ffcc9a-4b01-439a-93e0-0674bdfd32ec" volumeName="kubernetes.io/projected/99ffcc9a-4b01-439a-93e0-0674bdfd32ec-kube-api-access-jd875" seLinuxMountContext=""
Mar 19 11:54:02.029907 master-0 kubenswrapper[7642]: I0319 11:54:02.029872 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac946049-4b93-4d8c-bfa6-eacf83160ed1" volumeName="kubernetes.io/configmap/ac946049-4b93-4d8c-bfa6-eacf83160ed1-config" seLinuxMountContext=""
Mar 19 11:54:02.029907 master-0 kubenswrapper[7642]: I0319 11:54:02.029887 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="07804be1-d37c-474e-a179-01ac541d9705" volumeName="kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-bound-sa-token" seLinuxMountContext=""
Mar 19 11:54:02.029907 master-0 kubenswrapper[7642]: I0319 11:54:02.029900 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="162b513f-2f67-4946-816b-48f6232dfca0" volumeName="kubernetes.io/projected/162b513f-2f67-4946-816b-48f6232dfca0-kube-api-access-h95pm" seLinuxMountContext=""
Mar 19 11:54:02.029907 master-0 kubenswrapper[7642]: I0319 11:54:02.029917 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f3c0234-248a-4d6f-9aca-6582a481a911" volumeName="kubernetes.io/secret/3f3c0234-248a-4d6f-9aca-6582a481a911-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.029932 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51e2c1d1-a50f-4a0d-8273-440e699b3907" volumeName="kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-env-overrides" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.029950 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d3b2df-3d81-413e-b039-80dc20111eae" volumeName="kubernetes.io/projected/78d3b2df-3d81-413e-b039-80dc20111eae-kube-api-access-mm4kn" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.029967 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-service-ca" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.029983 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac946049-4b93-4d8c-bfa6-eacf83160ed1" volumeName="kubernetes.io/secret/ac946049-4b93-4d8c-bfa6-eacf83160ed1-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.029998 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d263ddac-1ede-474f-b3bb-1f1f4f7846a3" volumeName="kubernetes.io/configmap/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-config" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.030011 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d263ddac-1ede-474f-b3bb-1f1f4f7846a3" volumeName="kubernetes.io/secret/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.030047 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="162b513f-2f67-4946-816b-48f6232dfca0" volumeName="kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.030173 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="162b513f-2f67-4946-816b-48f6232dfca0" volumeName="kubernetes.io/secret/162b513f-2f67-4946-816b-48f6232dfca0-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.030190 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f3c0234-248a-4d6f-9aca-6582a481a911" volumeName="kubernetes.io/configmap/3f3c0234-248a-4d6f-9aca-6582a481a911-config" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.030204 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5ae2d717-506b-4d8e-803b-689c1cc3557c" volumeName="kubernetes.io/projected/5ae2d717-506b-4d8e-803b-689c1cc3557c-kube-api-access-s2v9q" seLinuxMountContext=""
Mar 19 11:54:02.030235 master-0 kubenswrapper[7642]: I0319 11:54:02.030242 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d017409-a362-4e58-87af-d02b3834fdd4" volumeName="kubernetes.io/secret/6d017409-a362-4e58-87af-d02b3834fdd4-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030258 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e617e515-a899-41e8-9961-df68e49da4d3" volumeName="kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-env-overrides" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030274 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0fed1dcc-6fc0-41ef-8236-e095dca4ddcc" volumeName="kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-daemon-config" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030288 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f479b3-45ff-43f9-ae85-083554a54918" volumeName="kubernetes.io/configmap/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-trusted-ca" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030301 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="518e1c6f-7f46-4dda-84a3-49fec9fa1abe" volumeName="kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-env-overrides" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030314 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51e2c1d1-a50f-4a0d-8273-440e699b3907" volumeName="kubernetes.io/secret/51e2c1d1-a50f-4a0d-8273-440e699b3907-webhook-cert" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030327 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="162b513f-2f67-4946-816b-48f6232dfca0" volumeName="kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-config" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030341 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33f6e54f-a750-4c91-b4f5-049275d33cfb" volumeName="kubernetes.io/projected/33f6e54f-a750-4c91-b4f5-049275d33cfb-kube-api-access-f7qsg" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030358 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="60f846d9-1b3d-4b9d-b3e0-90a238e02640" volumeName="kubernetes.io/projected/60f846d9-1b3d-4b9d-b3e0-90a238e02640-kube-api-access" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030374 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e8c4676-05af-4478-a44b-1bdcbefdea0f" volumeName="kubernetes.io/projected/6e8c4676-05af-4478-a44b-1bdcbefdea0f-kube-api-access-pdmj8" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030394 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f38df702-a256-4f83-90bb-0c0e477a2ce6" volumeName="kubernetes.io/configmap/f38df702-a256-4f83-90bb-0c0e477a2ce6-trusted-ca" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030488 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="131e2573-ef74-4963-bdcf-901310e7a486" volumeName="kubernetes.io/configmap/131e2573-ef74-4963-bdcf-901310e7a486-telemetry-config" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030505 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" volumeName="kubernetes.io/empty-dir/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-operand-assets" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030518 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d017409-a362-4e58-87af-d02b3834fdd4" volumeName="kubernetes.io/configmap/6d017409-a362-4e58-87af-d02b3834fdd4-config" seLinuxMountContext=""
Mar 19 11:54:02.030529 master-0 kubenswrapper[7642]: I0319 11:54:02.030529 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bea42722-d7af-4938-9f3a-da57bd056f96" volumeName="kubernetes.io/secret/bea42722-d7af-4938-9f3a-da57bd056f96-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030542 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90c10f6-36d9-44bd-a4c6-174f12135434" volumeName="kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-config" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030722 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" volumeName="kubernetes.io/projected/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-kube-api-access-f7dp9" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030742 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64136215-8539-4349-977f-6d59deb132ea" volumeName="kubernetes.io/configmap/64136215-8539-4349-977f-6d59deb132ea-iptables-alerter-script" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030753 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b0133cb5-e425-40af-a051-71b2362a89c3" volumeName="kubernetes.io/secret/b0133cb5-e425-40af-a051-71b2362a89c3-metrics-tls" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030764 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e617e515-a899-41e8-9961-df68e49da4d3" volumeName="kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-script-lib" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030776 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="367ea4e8-8172-4730-8f26-499ed7c167bb" volumeName="kubernetes.io/secret/367ea4e8-8172-4730-8f26-499ed7c167bb-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030788 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4705c8f3-89d4-42f2-ad14-e591c5380b9e" volumeName="kubernetes.io/secret/4705c8f3-89d4-42f2-ad14-e591c5380b9e-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030799 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b0133cb5-e425-40af-a051-71b2362a89c3" volumeName="kubernetes.io/projected/b0133cb5-e425-40af-a051-71b2362a89c3-kube-api-access-w4ms4" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030812 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c479beae-6d34-4d49-81c8-39f6a53ff6ca" volumeName="kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-whereabouts-flatfile-configmap" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030847 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f479b3-45ff-43f9-ae85-083554a54918" volumeName="kubernetes.io/projected/39f479b3-45ff-43f9-ae85-083554a54918-kube-api-access-79fmb" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030860 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d3b2df-3d81-413e-b039-80dc20111eae" volumeName="kubernetes.io/configmap/78d3b2df-3d81-413e-b039-80dc20111eae-trusted-ca" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030873 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84e5577e-df8b-46a2-aff7-0bf90b5010d9" volumeName="kubernetes.io/projected/84e5577e-df8b-46a2-aff7-0bf90b5010d9-kube-api-access-k9vzl" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030940 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94fa5656-0561-45ab-9ab1-a6b96b8a47d4" volumeName="kubernetes.io/projected/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-kube-api-access-m6rhj" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030957 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/projected/aab47a39-a18b-4350-b141-4d5de2d8e96f-kube-api-access-scp9p" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030972 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bea42722-d7af-4938-9f3a-da57bd056f96" volumeName="kubernetes.io/projected/bea42722-d7af-4938-9f3a-da57bd056f96-kube-api-access" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.030988 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e617e515-a899-41e8-9961-df68e49da4d3" volumeName="kubernetes.io/secret/e617e515-a899-41e8-9961-df68e49da4d3-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 19 11:54:02.031002 master-0 kubenswrapper[7642]: I0319 11:54:02.031003 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="07804be1-d37c-474e-a179-01ac541d9705" volumeName="kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-kube-api-access-bwp24" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031071 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" volumeName="kubernetes.io/projected/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-kube-api-access-mgr4k" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031088 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="518e1c6f-7f46-4dda-84a3-49fec9fa1abe" volumeName="kubernetes.io/secret/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031101 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-config" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031119 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90c10f6-36d9-44bd-a4c6-174f12135434" volumeName="kubernetes.io/projected/e90c10f6-36d9-44bd-a4c6-174f12135434-kube-api-access-r2knz" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031132 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f3c0234-248a-4d6f-9aca-6582a481a911" volumeName="kubernetes.io/projected/3f3c0234-248a-4d6f-9aca-6582a481a911-kube-api-access-8cvpt" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031144 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="518e1c6f-7f46-4dda-84a3-49fec9fa1abe" volumeName="kubernetes.io/projected/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-kube-api-access-4khlg" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031161 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d017409-a362-4e58-87af-d02b3834fdd4" volumeName="kubernetes.io/projected/6d017409-a362-4e58-87af-d02b3834fdd4-kube-api-access-f99wt" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031176 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c479beae-6d34-4d49-81c8-39f6a53ff6ca" volumeName="kubernetes.io/projected/c479beae-6d34-4d49-81c8-39f6a53ff6ca-kube-api-access-xxm2c" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031188 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b46c6b4-f701-4a7f-bf74-90afe14ad89a" volumeName="kubernetes.io/projected/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-kube-api-access-s98qv" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031219 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64136215-8539-4349-977f-6d59deb132ea" volumeName="kubernetes.io/projected/64136215-8539-4349-977f-6d59deb132ea-kube-api-access-cgtrh" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031231 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84e5577e-df8b-46a2-aff7-0bf90b5010d9" volumeName="kubernetes.io/configmap/84e5577e-df8b-46a2-aff7-0bf90b5010d9-config" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031245 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-ca" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031256 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac946049-4b93-4d8c-bfa6-eacf83160ed1" volumeName="kubernetes.io/projected/ac946049-4b93-4d8c-bfa6-eacf83160ed1-kube-api-access" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031268 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e617e515-a899-41e8-9961-df68e49da4d3" volumeName="kubernetes.io/projected/e617e515-a899-41e8-9961-df68e49da4d3-kube-api-access-bptbc" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031347 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f38df702-a256-4f83-90bb-0c0e477a2ce6" volumeName="kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-bound-sa-token" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031363 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="07804be1-d37c-474e-a179-01ac541d9705" volumeName="kubernetes.io/configmap/07804be1-d37c-474e-a179-01ac541d9705-trusted-ca" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031378 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0fed1dcc-6fc0-41ef-8236-e095dca4ddcc" volumeName="kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cni-binary-copy" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031390 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0fed1dcc-6fc0-41ef-8236-e095dca4ddcc" volumeName="kubernetes.io/projected/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-kube-api-access-9kf8c" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031401 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4705c8f3-89d4-42f2-ad14-e591c5380b9e" volumeName="kubernetes.io/empty-dir/4705c8f3-89d4-42f2-ad14-e591c5380b9e-available-featuregates" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031413 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" volumeName="kubernetes.io/secret/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-cluster-olm-operator-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031427 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="518e1c6f-7f46-4dda-84a3-49fec9fa1abe" volumeName="kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovnkube-config" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031438 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="162b513f-2f67-4946-816b-48f6232dfca0" volumeName="kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-service-ca-bundle" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031451 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="367ea4e8-8172-4730-8f26-499ed7c167bb" volumeName="kubernetes.io/configmap/367ea4e8-8172-4730-8f26-499ed7c167bb-config" seLinuxMountContext=""
Mar 19 11:54:02.031457 master-0 kubenswrapper[7642]: I0319 11:54:02.031467 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="60f846d9-1b3d-4b9d-b3e0-90a238e02640" volumeName="kubernetes.io/configmap/60f846d9-1b3d-4b9d-b3e0-90a238e02640-service-ca" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031511 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-client" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031526 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c479beae-6d34-4d49-81c8-39f6a53ff6ca" volumeName="kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031538 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4705c8f3-89d4-42f2-ad14-e591c5380b9e" volumeName="kubernetes.io/projected/4705c8f3-89d4-42f2-ad14-e591c5380b9e-kube-api-access-bssz9" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031549 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031561 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bea42722-d7af-4938-9f3a-da57bd056f96" volumeName="kubernetes.io/configmap/bea42722-d7af-4938-9f3a-da57bd056f96-config" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031575 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d263ddac-1ede-474f-b3bb-1f1f4f7846a3" volumeName="kubernetes.io/projected/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-kube-api-access-rbwcn" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031588 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90c10f6-36d9-44bd-a4c6-174f12135434" volumeName="kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-images" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031603 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f38df702-a256-4f83-90bb-0c0e477a2ce6" volumeName="kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-kube-api-access-g4qb9" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031621 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="131e2573-ef74-4963-bdcf-901310e7a486" volumeName="kubernetes.io/projected/131e2573-ef74-4963-bdcf-901310e7a486-kube-api-access-4sprd" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031667 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="367ea4e8-8172-4730-8f26-499ed7c167bb" volumeName="kubernetes.io/projected/367ea4e8-8172-4730-8f26-499ed7c167bb-kube-api-access" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031686 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84e5577e-df8b-46a2-aff7-0bf90b5010d9" volumeName="kubernetes.io/secret/84e5577e-df8b-46a2-aff7-0bf90b5010d9-serving-cert" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031701 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c479beae-6d34-4d49-81c8-39f6a53ff6ca" volumeName="kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-binary-copy" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031715 7642 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e617e515-a899-41e8-9961-df68e49da4d3" volumeName="kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-config" seLinuxMountContext=""
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031729 7642 reconstruct.go:97] "Volume reconstruction finished"
Mar 19 11:54:02.032165 master-0 kubenswrapper[7642]: I0319 11:54:02.031738 7642 reconciler.go:26] "Reconciler: start to sync state"
Mar 19 11:54:02.035224 master-0 kubenswrapper[7642]: I0319 11:54:02.035182 7642 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 19 11:54:02.064525 master-0 kubenswrapper[7642]: I0319 11:54:02.064420 7642 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 19 11:54:02.066496 master-0 kubenswrapper[7642]: I0319 11:54:02.066452 7642 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 19 11:54:02.066552 master-0 kubenswrapper[7642]: I0319 11:54:02.066499 7642 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 19 11:54:02.066552 master-0 kubenswrapper[7642]: I0319 11:54:02.066531 7642 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 19 11:54:02.067723 master-0 kubenswrapper[7642]: E0319 11:54:02.066676 7642 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 19 11:54:02.068550 master-0 kubenswrapper[7642]: I0319 11:54:02.068505 7642 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 11:54:02.094088 master-0 kubenswrapper[7642]: I0319 11:54:02.094017 7642 generic.go:334] "Generic (PLEG): container finished" podID="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" containerID="76315438a4a762dabbb5d7e77ae1a8b1b6d345329e45b5b2c2feae26b7df9765" exitCode=0
Mar 19 11:54:02.102530 master-0 kubenswrapper[7642]: I0319 11:54:02.102479 7642 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="61ceef49c4b3e3c92de2f92ba07df5cc9f3437ed0a986fcdd8cd9a85e9afbe78" exitCode=0
Mar 19 11:54:02.102530 master-0 kubenswrapper[7642]: I0319 11:54:02.102523 7642 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="c0d30e67fd289a28538ef92cb1f3d25d05768acb2306cf21bc6d90958cdbdf09" exitCode=0
Mar 19 11:54:02.102530 master-0 kubenswrapper[7642]: I0319 11:54:02.102533 7642 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="ff56157a2c5ab60201123fe865c3891754204ecbb4e8ba10184f66ac03510978" exitCode=0
Mar 19 11:54:02.102530 master-0 kubenswrapper[7642]: I0319 11:54:02.102541 7642 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="20f4956468238d356cba6c4e8281a37386fede658fd1e93306b89426a80c8130" exitCode=0
Mar 19 11:54:02.102758 master-0 kubenswrapper[7642]: I0319 11:54:02.102550 7642 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="57e2b9e5df1e4dbc4f600dfa7e9acded55e04fe34916a6250fcf8bdcd8821c9c" exitCode=0
Mar 19 11:54:02.102758 master-0 kubenswrapper[7642]: I0319 11:54:02.102557 7642 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="7e880bfa7477300228e9f011e131c05d9e3d415f1d3d007729e1d9e7b578b4cc" exitCode=0
Mar 19 11:54:02.112957 master-0 kubenswrapper[7642]: I0319 11:54:02.112915 7642 generic.go:334] "Generic (PLEG): container finished" podID="6568c544-a539-4d50-89f0-1705a5f63411" containerID="f2ca0a192f7dfe6f6eb1479534201a334d6844184e4ca66097e40894d0ccce20" exitCode=0
Mar 19 11:54:02.115052 master-0 kubenswrapper[7642]: I0319 11:54:02.115002 7642 generic.go:334] "Generic (PLEG): container finished" podID="2452a909-6aa1-4750-9399-5945d380427a" containerID="ec93bf01fde32bcc52dde2502ea534c23c521da7002fa0839646b010fe2c3b0c" exitCode=0
Mar 19 11:54:02.118838 master-0 kubenswrapper[7642]: I0319 11:54:02.118806 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log"
Mar 19 11:54:02.119325 master-0 kubenswrapper[7642]: I0319 11:54:02.119281 7642 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="1596d61ab954fe05c17045a5f0656bd06f879c1529f23b7eb004e3ce610f3ceb" exitCode=1
Mar 19 11:54:02.119325 master-0 kubenswrapper[7642]: I0319 11:54:02.119315 7642 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="763a20d7b98822675af380cf6397cfba29e067a2eddf07fc09dc4a05e4d6fa6d" exitCode=0
Mar 19 11:54:02.128659 master-0 kubenswrapper[7642]: I0319 11:54:02.128565 7642 generic.go:334] "Generic (PLEG): container finished" podID="e617e515-a899-41e8-9961-df68e49da4d3" containerID="ffac3bdf0b92a0122fbb8d6a02eb01304705a3c19a78b9c431bf0840fcd9389f" exitCode=0
Mar 19 11:54:02.140267 master-0 kubenswrapper[7642]: I0319 11:54:02.140202 7642 generic.go:334] "Generic (PLEG): container finished" podID="4705c8f3-89d4-42f2-ad14-e591c5380b9e" containerID="b8135311023b73f1dff2f33a6eb221f0fcb0bdc9e1cd016fb390a5b4192441f6" exitCode=0
Mar 19 11:54:02.147463 master-0 kubenswrapper[7642]: I0319 11:54:02.147423 7642 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4" exitCode=0
Mar 19 11:54:02.162875 master-0 kubenswrapper[7642]: I0319 11:54:02.162829 7642 manager.go:324] Recovery completed
Mar 19 11:54:02.166848 master-0 kubenswrapper[7642]: E0319 11:54:02.166821 7642 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 19 11:54:02.200201 master-0 kubenswrapper[7642]: I0319 11:54:02.200182 7642 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 19 11:54:02.200355 master-0 kubenswrapper[7642]: I0319 11:54:02.200340 7642 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 19 11:54:02.200473 master-0 kubenswrapper[7642]: I0319 11:54:02.200462 7642 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 11:54:02.200778 master-0 kubenswrapper[7642]: I0319 11:54:02.200761 7642 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 19 11:54:02.200876 master-0 kubenswrapper[7642]: I0319 11:54:02.200845 7642 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 19 11:54:02.200942 master-0 kubenswrapper[7642]: I0319 11:54:02.200932 7642 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 19 11:54:02.201020 master-0 kubenswrapper[7642]: I0319 11:54:02.201008 7642 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 19 11:54:02.201093 master-0 kubenswrapper[7642]: I0319 11:54:02.201081 7642 policy_none.go:49] "None policy: Start"
Mar 19 11:54:02.202504 master-0 kubenswrapper[7642]: I0319 11:54:02.202487 7642 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 19 11:54:02.202618 master-0 kubenswrapper[7642]: I0319 11:54:02.202606 7642 state_mem.go:35] "Initializing new in-memory state store"
Mar 19 11:54:02.202919 master-0 kubenswrapper[7642]: I0319 11:54:02.202906 7642 state_mem.go:75] "Updated machine memory state"
Mar 19 11:54:02.203003 master-0 kubenswrapper[7642]: I0319 11:54:02.202992 7642 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Mar 19 11:54:02.215241 master-0 kubenswrapper[7642]: I0319 11:54:02.215192 7642 manager.go:334] "Starting Device Plugin manager"
Mar 19 11:54:02.215390 master-0 kubenswrapper[7642]: I0319 11:54:02.215261 7642 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 19 11:54:02.215390 master-0 kubenswrapper[7642]: I0319 11:54:02.215279 7642 server.go:79] "Starting device plugin registration server"
Mar 19 11:54:02.215962 master-0 kubenswrapper[7642]: I0319 11:54:02.215937 7642 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 19 11:54:02.216027 master-0 kubenswrapper[7642]: I0319 11:54:02.215959 7642 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 19 11:54:02.216202 master-0 kubenswrapper[7642]: I0319 11:54:02.216157 7642 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 19 11:54:02.216293 master-0 kubenswrapper[7642]: I0319 11:54:02.216282 7642 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 19 11:54:02.216350 master-0 kubenswrapper[7642]: I0319 11:54:02.216292 7642 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 19 11:54:02.316569 master-0 kubenswrapper[7642]: I0319 11:54:02.316428 7642 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 11:54:02.318573 master-0 kubenswrapper[7642]: I0319 11:54:02.318537 7642 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 19 11:54:02.318649 master-0 kubenswrapper[7642]: I0319 11:54:02.318580 7642 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 19 11:54:02.318649 master-0 kubenswrapper[7642]: I0319 11:54:02.318591 7642 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 19 11:54:02.318714 master-0 kubenswrapper[7642]: I0319 11:54:02.318682 7642 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 19 11:54:02.367320 master-0 kubenswrapper[7642]: I0319 11:54:02.367213 7642 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"]
Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.367825 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"a751b53c76eb83162f3d88044f06fc209b871d360365f0332303ca5b041a66eb"}
Mar 19
11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.367936 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"6e2ac089de599dfd6512e3d47cbeeed8ac50e52fa04720c3256b653ca97af81a"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.367998 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368013 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368026 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"d664a6d0d2a24360dee10612610f1b59","Type":"ContainerStarted","Data":"97aaa7810039a211ed9eb18ec086e48818b5f53c2190c323cb9dff4622d57e09"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368055 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64eb4d527ff6202b4280c70b2f4cef7a3669bc686804b7cb047a8352c3664aff" Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368071 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cefa044e9278fa0f7e0364a8c119ab9567645a4d070ede851a0064c55299d2ff" Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368081 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"4d3040ecd2a7e633dfd078caf5a95302a7ae4f555f4e48340660df3b62be5004"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368097 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"1596d61ab954fe05c17045a5f0656bd06f879c1529f23b7eb004e3ce610f3ceb"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368114 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"763a20d7b98822675af380cf6397cfba29e067a2eddf07fc09dc4a05e4d6fa6d"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368128 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368168 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"79f33a58d544fe5d9dca133c52ef1b1ffeca07d3925205b6553e47537ece7b51"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368181 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"eb1ab82ad9ee6b742e302a5269a9952a1cd4daf4667abd8007cfbd42d95d64c7"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368194 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368209 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368221 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerDied","Data":"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4"} Mar 19 11:54:02.368401 master-0 kubenswrapper[7642]: I0319 11:54:02.368234 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"49fac1b46a11e49501805e891baae4a9","Type":"ContainerStarted","Data":"488f50284501a5ea1b55784b6ea985665fdedb86fee068aba655477316aea17a"} Mar 19 11:54:02.437118 master-0 kubenswrapper[7642]: I0319 11:54:02.436900 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:54:02.437118 master-0 kubenswrapper[7642]: I0319 11:54:02.436980 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.437118 master-0 kubenswrapper[7642]: I0319 11:54:02.437009 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.437118 master-0 kubenswrapper[7642]: I0319 11:54:02.437068 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:54:02.437118 master-0 kubenswrapper[7642]: I0319 11:54:02.437095 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.437118 master-0 kubenswrapper[7642]: I0319 11:54:02.437119 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.437118 master-0 kubenswrapper[7642]: I0319 11:54:02.437151 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.437717 master-0 kubenswrapper[7642]: I0319 11:54:02.437177 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.437717 master-0 kubenswrapper[7642]: I0319 11:54:02.437204 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.437717 master-0 kubenswrapper[7642]: I0319 11:54:02.437234 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.437717 master-0 kubenswrapper[7642]: I0319 11:54:02.437258 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.437717 master-0 
kubenswrapper[7642]: I0319 11:54:02.437290 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:54:02.437717 master-0 kubenswrapper[7642]: I0319 11:54:02.437315 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:54:02.437717 master-0 kubenswrapper[7642]: I0319 11:54:02.437340 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:54:02.437717 master-0 kubenswrapper[7642]: I0319 11:54:02.437364 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:54:02.437717 master-0 kubenswrapper[7642]: I0319 11:54:02.437389 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.437717 master-0 kubenswrapper[7642]: I0319 11:54:02.437416 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.538161 master-0 kubenswrapper[7642]: I0319 11:54:02.538094 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.538519 master-0 kubenswrapper[7642]: I0319 11:54:02.538501 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.538600 master-0 kubenswrapper[7642]: I0319 11:54:02.538313 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.538708 master-0 kubenswrapper[7642]: I0319 11:54:02.538583 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod 
\"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.538823 master-0 kubenswrapper[7642]: I0319 11:54:02.538754 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.538863 master-0 kubenswrapper[7642]: I0319 11:54:02.538778 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:54:02.538926 master-0 kubenswrapper[7642]: I0319 11:54:02.538864 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.538969 master-0 kubenswrapper[7642]: I0319 11:54:02.538930 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.538969 master-0 kubenswrapper[7642]: I0319 11:54:02.538899 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: 
\"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.539029 master-0 kubenswrapper[7642]: I0319 11:54:02.538990 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.539029 master-0 kubenswrapper[7642]: I0319 11:54:02.539016 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.539087 master-0 kubenswrapper[7642]: I0319 11:54:02.539063 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.539120 master-0 kubenswrapper[7642]: I0319 11:54:02.539090 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:54:02.539120 master-0 kubenswrapper[7642]: I0319 11:54:02.539096 7642 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.539120 master-0 kubenswrapper[7642]: I0319 11:54:02.539112 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:54:02.539195 master-0 kubenswrapper[7642]: I0319 11:54:02.539133 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:54:02.539195 master-0 kubenswrapper[7642]: I0319 11:54:02.539138 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:54:02.539195 master-0 kubenswrapper[7642]: I0319 11:54:02.539154 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:54:02.539195 master-0 kubenswrapper[7642]: I0319 11:54:02.539175 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.539195 master-0 kubenswrapper[7642]: I0319 11:54:02.539178 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"c83737980b9ee109184b1d78e942cf36\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:54:02.539322 master-0 kubenswrapper[7642]: I0319 11:54:02.539193 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.539322 master-0 kubenswrapper[7642]: I0319 11:54:02.539225 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.539322 master-0 kubenswrapper[7642]: I0319 11:54:02.539245 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.539322 master-0 kubenswrapper[7642]: I0319 11:54:02.539262 7642 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.539322 master-0 kubenswrapper[7642]: I0319 11:54:02.539281 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:54:02.539322 master-0 kubenswrapper[7642]: I0319 11:54:02.539311 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.539482 master-0 kubenswrapper[7642]: I0319 11:54:02.539329 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.539482 master-0 kubenswrapper[7642]: I0319 11:54:02.539355 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:54:02.539482 master-0 kubenswrapper[7642]: I0319 11:54:02.539365 
7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.539482 master-0 kubenswrapper[7642]: I0319 11:54:02.539391 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:54:02.539482 master-0 kubenswrapper[7642]: I0319 11:54:02.539420 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"46f265536aba6292ead501bc9b49f327\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:02.539482 master-0 kubenswrapper[7642]: I0319 11:54:02.539441 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:54:02.539482 master-0 kubenswrapper[7642]: I0319 11:54:02.539474 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:02.539837 master-0 kubenswrapper[7642]: I0319 11:54:02.539821 7642 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 11:54:03.004444 master-0 kubenswrapper[7642]: I0319 11:54:03.004312 7642 apiserver.go:52] "Watching apiserver"
Mar 19 11:54:03.019979 master-0 kubenswrapper[7642]: I0319 11:54:03.019864 7642 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 11:54:03.021722 master-0 kubenswrapper[7642]: I0319 11:54:03.021651 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-r6tsg","openshift-dns-operator/dns-operator-9c5679d8f-m5dt4","openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn","openshift-ovn-kubernetes/ovnkube-node-hm6q6","kube-system/bootstrap-kube-controller-manager-master-0","openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498","openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp","openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4","openshift-ingress-operator/ingress-operator-66b84d69b-59ptg","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-multus/network-metrics-daemon-nkd77","openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b","openshift-etcd/etcd-master-0-master-0","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv","openshift-network-node-identity/network-node-identity-47j68","openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj","openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw","openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc","openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr","openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg","openshift-network-operator/iptables-alerter-j6n2j","openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b","openshift-network-diagnostics/network-check-target-tz4qg","openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht","openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g","openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh","openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5","openshift-multus/multus-p5wll","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-marketplace/marketplace-operator-89ccd998f-4h6xn","openshift-multus/multus-additional-cni-plugins-n5wqf","kube-system/bootstrap-kube-scheduler-master-0","openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj","openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj","openshift-network-operator/network-operator-7bd846bfc4-5qtx7","openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"]
Mar 19 11:54:03.022024 master-0 kubenswrapper[7642]: I0319 11:54:03.021996 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6tsg"
Mar 19 11:54:03.022181 master-0 kubenswrapper[7642]: I0319 11:54:03.022145 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:03.023901 master-0 kubenswrapper[7642]: I0319 11:54:03.023862 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:54:03.023901 master-0 kubenswrapper[7642]: I0319 11:54:03.023906 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:54:03.024193 master-0 kubenswrapper[7642]: I0319 11:54:03.023938 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:03.024531 master-0 kubenswrapper[7642]: I0319 11:54:03.024514 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 11:54:03.024695 master-0 kubenswrapper[7642]: I0319 11:54:03.024677 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:03.025272 master-0 kubenswrapper[7642]: I0319 11:54:03.025241 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:54:03.025272 master-0 kubenswrapper[7642]: I0319 11:54:03.025245 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:54:03.025829 master-0 kubenswrapper[7642]: I0319 11:54:03.025803 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:54:03.027155 master-0 kubenswrapper[7642]: I0319 11:54:03.027113 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 11:54:03.027511 master-0 kubenswrapper[7642]: I0319 11:54:03.027440 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:54:03.027622 master-0 kubenswrapper[7642]: I0319 11:54:03.027591 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 11:54:03.027794 master-0 kubenswrapper[7642]: I0319 11:54:03.027752 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:54:03.027958 master-0 kubenswrapper[7642]: I0319 11:54:03.027928 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:54:03.032463 master-0 kubenswrapper[7642]: I0319 11:54:03.032388 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 11:54:03.032738 master-0 kubenswrapper[7642]: I0319 11:54:03.032672 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 11:54:03.032738 master-0 kubenswrapper[7642]: I0319 11:54:03.032692 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 11:54:03.032975 master-0 kubenswrapper[7642]: I0319 11:54:03.032841 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 11:54:03.033018 master-0 kubenswrapper[7642]: I0319 11:54:03.032976 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 19 11:54:03.033325 master-0 kubenswrapper[7642]: I0319 11:54:03.033286 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 11:54:03.033523 master-0 kubenswrapper[7642]: I0319 11:54:03.033492 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 11:54:03.033843 master-0 kubenswrapper[7642]: I0319 11:54:03.033680 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 11:54:03.039874 master-0 kubenswrapper[7642]: I0319 11:54:03.033960 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 11:54:03.039874 master-0 kubenswrapper[7642]: I0319 11:54:03.034015 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 11:54:03.039874 master-0 kubenswrapper[7642]: I0319 11:54:03.034056 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 11:54:03.039874 master-0 kubenswrapper[7642]: I0319 11:54:03.034233 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 11:54:03.039874 master-0 kubenswrapper[7642]: I0319 11:54:03.036305 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:54:03.043032 master-0 kubenswrapper[7642]: I0319 11:54:03.040140 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 11:54:03.043032 master-0 kubenswrapper[7642]: I0319 11:54:03.040950 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 11:54:03.043032 master-0 kubenswrapper[7642]: I0319 11:54:03.040952 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 11:54:03.043032 master-0 kubenswrapper[7642]: I0319 11:54:03.041988 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 11:54:03.044658 master-0 kubenswrapper[7642]: I0319 11:54:03.044617 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 11:54:03.044943 master-0 kubenswrapper[7642]: I0319 11:54:03.044872 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 11:54:03.045829 master-0 kubenswrapper[7642]: I0319 11:54:03.045804 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 11:54:03.045904 master-0 kubenswrapper[7642]: I0319 11:54:03.045833 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 11:54:03.045904 master-0 kubenswrapper[7642]: I0319 11:54:03.045890 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 11:54:03.045981 master-0 kubenswrapper[7642]: I0319 11:54:03.045925 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 11:54:03.046023 master-0 kubenswrapper[7642]: I0319 11:54:03.045996 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 11:54:03.046098 master-0 kubenswrapper[7642]: I0319 11:54:03.046068 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 11:54:03.046243 master-0 kubenswrapper[7642]: I0319 11:54:03.046218 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 11:54:03.046367 master-0 kubenswrapper[7642]: I0319 11:54:03.046339 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 11:54:03.046446 master-0 kubenswrapper[7642]: I0319 11:54:03.046424 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 19 11:54:03.046503 master-0 kubenswrapper[7642]: I0319 11:54:03.045950 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 11:54:03.046543 master-0 kubenswrapper[7642]: I0319 11:54:03.046505 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 11:54:03.046580 master-0 kubenswrapper[7642]: I0319 11:54:03.046553 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 11:54:03.046619 master-0 kubenswrapper[7642]: I0319 11:54:03.046576 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 19 11:54:03.046680 master-0 kubenswrapper[7642]: I0319 11:54:03.046619 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 11:54:03.046727 master-0 kubenswrapper[7642]: I0319 11:54:03.046676 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 11:54:03.046727 master-0 kubenswrapper[7642]: I0319 11:54:03.046708 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 11:54:03.046802 master-0 kubenswrapper[7642]: I0319 11:54:03.046354 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 19 11:54:03.046802 master-0 kubenswrapper[7642]: I0319 11:54:03.046769 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 11:54:03.046889 master-0 kubenswrapper[7642]: I0319 11:54:03.046851 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 11:54:03.046889 master-0 kubenswrapper[7642]: I0319 11:54:03.046437 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 11:54:03.046980 master-0 kubenswrapper[7642]: I0319 11:54:03.046912 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 11:54:03.046980 master-0 kubenswrapper[7642]: I0319 11:54:03.046905 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 11:54:03.046980 master-0 kubenswrapper[7642]: I0319 11:54:03.046286 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 11:54:03.047089 master-0 kubenswrapper[7642]: I0319 11:54:03.047006 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 11:54:03.047127 master-0 kubenswrapper[7642]: I0319 11:54:03.047018 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 11:54:03.047173 master-0 kubenswrapper[7642]: I0319 11:54:03.047129 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9vzl\" (UniqueName: \"kubernetes.io/projected/84e5577e-df8b-46a2-aff7-0bf90b5010d9-kube-api-access-k9vzl\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:54:03.047219 master-0 kubenswrapper[7642]: I0319 11:54:03.047175 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 11:54:03.047219 master-0 kubenswrapper[7642]: I0319 11:54:03.047187 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.047305 master-0 kubenswrapper[7642]: I0319 11:54:03.047217 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-config\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:54:03.047305 master-0 kubenswrapper[7642]: I0319 11:54:03.047250 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:03.047305 master-0 kubenswrapper[7642]: I0319 11:54:03.047179 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 11:54:03.047414 master-0 kubenswrapper[7642]: I0319 11:54:03.047273 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98qv\" (UniqueName: \"kubernetes.io/projected/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-kube-api-access-s98qv\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:03.047414 master-0 kubenswrapper[7642]: I0319 11:54:03.046086 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 11:54:03.047493 master-0 kubenswrapper[7642]: I0319 11:54:03.047422 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e5577e-df8b-46a2-aff7-0bf90b5010d9-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:54:03.047493 master-0 kubenswrapper[7642]: I0319 11:54:03.046952 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 11:54:03.047493 master-0 kubenswrapper[7642]: I0319 11:54:03.047469 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e5577e-df8b-46a2-aff7-0bf90b5010d9-config\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:54:03.047597 master-0 kubenswrapper[7642]: I0319 11:54:03.047504 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-config\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:54:03.047597 master-0 kubenswrapper[7642]: I0319 11:54:03.047075 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 11:54:03.047703 master-0 kubenswrapper[7642]: I0319 11:54:03.047574 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:03.047746 master-0 kubenswrapper[7642]: I0319 11:54:03.047703 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-serving-cert\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.047746 master-0 kubenswrapper[7642]: I0319 11:54:03.046267 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 11:54:03.047746 master-0 kubenswrapper[7642]: I0319 11:54:03.047737 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.047850 master-0 kubenswrapper[7642]: I0319 11:54:03.047780 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwcn\" (UniqueName: \"kubernetes.io/projected/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-kube-api-access-rbwcn\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:54:03.047850 master-0 kubenswrapper[7642]: I0319 11:54:03.047819 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:54:03.047937 master-0 kubenswrapper[7642]: I0319 11:54:03.047853 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4705c8f3-89d4-42f2-ad14-e591c5380b9e-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:54:03.047937 master-0 kubenswrapper[7642]: I0319 11:54:03.047871 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-config\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:54:03.047937 master-0 kubenswrapper[7642]: I0319 11:54:03.047893 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:54:03.048053 master-0 kubenswrapper[7642]: I0319 11:54:03.047940 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-operand-assets\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:54:03.048089 master-0 kubenswrapper[7642]: I0319 11:54:03.048060 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-operand-assets\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:54:03.048187 master-0 kubenswrapper[7642]: I0319 11:54:03.046102 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 11:54:03.048316 master-0 kubenswrapper[7642]: I0319 11:54:03.048283 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-config\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:54:03.048378 master-0 kubenswrapper[7642]: I0319 11:54:03.046590 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 11:54:03.048473 master-0 kubenswrapper[7642]: I0319 11:54:03.048177 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/162b513f-2f67-4946-816b-48f6232dfca0-serving-cert\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:54:03.048604 master-0 kubenswrapper[7642]: I0319 11:54:03.046507 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 11:54:03.048703 master-0 kubenswrapper[7642]: I0319 11:54:03.048680 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 11:54:03.048821 master-0 kubenswrapper[7642]: I0319 11:54:03.048606 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.048874 master-0 kubenswrapper[7642]: I0319 11:54:03.048826 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 11:54:03.048874 master-0 kubenswrapper[7642]: I0319 11:54:03.048849 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 11:54:03.048992 master-0 kubenswrapper[7642]: I0319 11:54:03.048906 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e5577e-df8b-46a2-aff7-0bf90b5010d9-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:54:03.049036 master-0 kubenswrapper[7642]: I0319 11:54:03.049007 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 11:54:03.049036 master-0 kubenswrapper[7642]: I0319 11:54:03.049015 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 11:54:03.049216 master-0 kubenswrapper[7642]: I0319 11:54:03.049164 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 11:54:03.049275 master-0 kubenswrapper[7642]: I0319 11:54:03.049223 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.049275 master-0 kubenswrapper[7642]: I0319 11:54:03.049182 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bssz9\" (UniqueName: \"kubernetes.io/projected/4705c8f3-89d4-42f2-ad14-e591c5380b9e-kube-api-access-bssz9\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:54:03.049275 master-0 kubenswrapper[7642]: I0319 11:54:03.049264 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e5577e-df8b-46a2-aff7-0bf90b5010d9-config\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:54:03.049390 master-0 kubenswrapper[7642]: I0319 11:54:03.049268 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:54:03.049546 master-0 kubenswrapper[7642]: I0319 11:54:03.049478 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 11:54:03.049546 master-0 kubenswrapper[7642]: I0319 11:54:03.049491 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 11:54:03.049742 master-0 kubenswrapper[7642]: I0319 11:54:03.049621 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4705c8f3-89d4-42f2-ad14-e591c5380b9e-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:54:03.049810 master-0 kubenswrapper[7642]: I0319 11:54:03.049776 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdmj8\" (UniqueName: \"kubernetes.io/projected/6e8c4676-05af-4478-a44b-1bdcbefdea0f-kube-api-access-pdmj8\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:03.049859 master-0 kubenswrapper[7642]: I0319 11:54:03.049817 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-config\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.049859 master-0 kubenswrapper[7642]: I0319 11:54:03.049842 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-client\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.049931 master-0 kubenswrapper[7642]: I0319 11:54:03.049879 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h95pm\" (UniqueName: \"kubernetes.io/projected/162b513f-2f67-4946-816b-48f6232dfca0-kube-api-access-h95pm\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:54:03.049931 master-0 kubenswrapper[7642]: I0319 11:54:03.049879 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-serving-cert\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.049931 master-0 kubenswrapper[7642]: I0319 11:54:03.049881 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 11:54:03.050046 master-0 kubenswrapper[7642]: I0319 11:54:03.049906 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705c8f3-89d4-42f2-ad14-e591c5380b9e-serving-cert\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:54:03.050121 master-0 kubenswrapper[7642]: I0319 11:54:03.050094 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:54:03.050183 master-0 kubenswrapper[7642]: I0319 11:54:03.050147 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scp9p\" (UniqueName: \"kubernetes.io/projected/aab47a39-a18b-4350-b141-4d5de2d8e96f-kube-api-access-scp9p\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.050230 master-0 kubenswrapper[7642]: I0319 11:54:03.050187 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:54:03.050272 master-0 kubenswrapper[7642]: I0319 11:54:03.050225 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd875\" (UniqueName: \"kubernetes.io/projected/99ffcc9a-4b01-439a-93e0-0674bdfd32ec-kube-api-access-jd875\") pod \"csi-snapshot-controller-operator-5f5d689c6b-8nptp\" (UID: \"99ffcc9a-4b01-439a-93e0-0674bdfd32ec\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp"
Mar 19 11:54:03.050272 master-0 kubenswrapper[7642]: I0319 11:54:03.050260 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dp9\" (UniqueName: \"kubernetes.io/projected/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-kube-api-access-f7dp9\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:54:03.050413 master-0 kubenswrapper[7642]: I0319 11:54:03.050106 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705c8f3-89d4-42f2-ad14-e591c5380b9e-serving-cert\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:54:03.050469 master-0 kubenswrapper[7642]: I0319 11:54:03.050375 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-client\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.050469 master-0 kubenswrapper[7642]: I0319 11:54:03.050229 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 11:54:03.050469 master-0 kubenswrapper[7642]: I0319 11:54:03.050294 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 11:54:03.050596 master-0 kubenswrapper[7642]: I0319 11:54:03.050506 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 11:54:03.050596 master-0 kubenswrapper[7642]: I0319 11:54:03.050334 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 11:54:03.050743 master-0 kubenswrapper[7642]: I0319 11:54:03.050368 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 19 11:54:03.050798 master-0 kubenswrapper[7642]: I0319 11:54:03.050013 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:54:03.050847 master-0 kubenswrapper[7642]: I0319 11:54:03.050829 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 11:54:03.051161 master-0 kubenswrapper[7642]: I0319 11:54:03.051125 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-config\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.051228 master-0 kubenswrapper[7642]: I0319 11:54:03.051163 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 11:54:03.051272 master-0 kubenswrapper[7642]: I0319 11:54:03.051233 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 11:54:03.051418 master-0 kubenswrapper[7642]: I0319 11:54:03.051366 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 11:54:03.051503 master-0 kubenswrapper[7642]: I0319 11:54:03.051482 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 11:54:03.051618 master-0 kubenswrapper[7642]: I0319 11:54:03.051600 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 11:54:03.052208 master-0 kubenswrapper[7642]: I0319 11:54:03.052148 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 11:54:03.052619 master-0 kubenswrapper[7642]: I0319 11:54:03.052487 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 19 11:54:03.053293 master-0 kubenswrapper[7642]: I0319 11:54:03.053232 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:54:03.053624 master-0 kubenswrapper[7642]: I0319 11:54:03.053599 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 11:54:03.053967 master-0
kubenswrapper[7642]: I0319 11:54:03.053932 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 11:54:03.054021 master-0 kubenswrapper[7642]: I0319 11:54:03.053993 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 11:54:03.055471 master-0 kubenswrapper[7642]: I0319 11:54:03.055197 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 11:54:03.055803 master-0 kubenswrapper[7642]: I0319 11:54:03.055666 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 11:54:03.055803 master-0 kubenswrapper[7642]: I0319 11:54:03.055772 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 11:54:03.056508 master-0 kubenswrapper[7642]: I0319 11:54:03.056478 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 11:54:03.057313 master-0 kubenswrapper[7642]: I0319 11:54:03.057280 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 11:54:03.058149 master-0 kubenswrapper[7642]: I0319 11:54:03.058134 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 11:54:03.059146 master-0 kubenswrapper[7642]: I0319 11:54:03.058884 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/162b513f-2f67-4946-816b-48f6232dfca0-serving-cert\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 11:54:03.059198 
master-0 kubenswrapper[7642]: I0319 11:54:03.058850 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 11:54:03.059297 master-0 kubenswrapper[7642]: I0319 11:54:03.058883 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 11:54:03.059390 master-0 kubenswrapper[7642]: I0319 11:54:03.058953 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 11:54:03.059428 master-0 kubenswrapper[7642]: I0319 11:54:03.059004 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 11:54:03.059460 master-0 kubenswrapper[7642]: I0319 11:54:03.059056 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 11:54:03.059529 master-0 kubenswrapper[7642]: I0319 11:54:03.059101 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 11:54:03.060182 master-0 kubenswrapper[7642]: I0319 11:54:03.059792 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 11:54:03.060182 master-0 kubenswrapper[7642]: I0319 11:54:03.060169 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 19 11:54:03.060707 master-0 kubenswrapper[7642]: I0319 11:54:03.060688 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 11:54:03.061738 master-0 kubenswrapper[7642]: I0319 11:54:03.061697 7642 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 11:54:03.063236 master-0 kubenswrapper[7642]: I0319 11:54:03.063072 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 11:54:03.064274 master-0 kubenswrapper[7642]: I0319 11:54:03.064252 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 11:54:03.067339 master-0 kubenswrapper[7642]: I0319 11:54:03.067307 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 11:54:03.072267 master-0 kubenswrapper[7642]: I0319 11:54:03.072205 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 11:54:03.075134 master-0 kubenswrapper[7642]: I0319 11:54:03.075061 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 11:54:03.076840 master-0 kubenswrapper[7642]: I0319 11:54:03.076354 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 11:54:03.078973 master-0 kubenswrapper[7642]: I0319 11:54:03.078825 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 19 11:54:03.080403 master-0 kubenswrapper[7642]: I0319 11:54:03.080365 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 11:54:03.086208 master-0 kubenswrapper[7642]: I0319 11:54:03.086162 7642 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 11:54:03.107987 master-0 kubenswrapper[7642]: I0319 11:54:03.107920 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 11:54:03.125570 master-0 kubenswrapper[7642]: I0319 11:54:03.125513 7642 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 19 11:54:03.126677 master-0 kubenswrapper[7642]: I0319 11:54:03.126619 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 11:54:03.147431 master-0 kubenswrapper[7642]: I0319 11:54:03.147371 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.150753 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-bin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.150784 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-multus-certs\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.150826 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwp24\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-kube-api-access-bwp24\") pod 
\"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.150947 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-config\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.150999 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/367ea4e8-8172-4730-8f26-499ed7c167bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.151031 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khlg\" (UniqueName: \"kubernetes.io/projected/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-kube-api-access-4khlg\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.151051 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60f846d9-1b3d-4b9d-b3e0-90a238e02640-service-ca\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 
11:54:03.151075 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.151099 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4kn\" (UniqueName: \"kubernetes.io/projected/78d3b2df-3d81-413e-b039-80dc20111eae-kube-api-access-mm4kn\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.151121 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-netns\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.151190 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.151244 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-config\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.151250 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/131e2573-ef74-4963-bdcf-901310e7a486-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:54:03.151274 master-0 kubenswrapper[7642]: I0319 11:54:03.151348 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-socket-dir-parent\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151396 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7qsg\" (UniqueName: \"kubernetes.io/projected/33f6e54f-a750-4c91-b4f5-049275d33cfb-kube-api-access-f7qsg\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151426 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " 
pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151451 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-k8s-cni-cncf-io\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151571 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac946049-4b93-4d8c-bfa6-eacf83160ed1-config\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151570 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60f846d9-1b3d-4b9d-b3e0-90a238e02640-service-ca\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151681 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bptbc\" (UniqueName: \"kubernetes.io/projected/e617e515-a899-41e8-9961-df68e49da4d3-kube-api-access-bptbc\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151684 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/367ea4e8-8172-4730-8f26-499ed7c167bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151711 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151745 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/131e2573-ef74-4963-bdcf-901310e7a486-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151785 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgr4k\" (UniqueName: \"kubernetes.io/projected/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-kube-api-access-mgr4k\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151807 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 
11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151852 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rhj\" (UniqueName: \"kubernetes.io/projected/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-kube-api-access-m6rhj\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151897 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cni-binary-copy\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.151948 master-0 kubenswrapper[7642]: I0319 11:54:03.151955 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.151989 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152002 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac946049-4b93-4d8c-bfa6-eacf83160ed1-config\") pod 
\"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152042 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-ovnkube-identity-cm\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152101 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152370 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152418 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bea42722-d7af-4938-9f3a-da57bd056f96-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:54:03.153510 
master-0 kubenswrapper[7642]: I0319 11:54:03.152446 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-netns\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152478 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38df702-a256-4f83-90bb-0c0e477a2ce6-trusted-ca\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152510 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152540 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cni-binary-copy\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152544 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07804be1-d37c-474e-a179-01ac541d9705-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " 
pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152589 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c0234-248a-4d6f-9aca-6582a481a911-config\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152625 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-etc-kubernetes\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152690 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fmb\" (UniqueName: \"kubernetes.io/projected/39f479b3-45ff-43f9-ae85-083554a54918-kube-api-access-79fmb\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152704 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-ovnkube-identity-cm\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152715 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8mds\" (UniqueName: 
\"kubernetes.io/projected/51e2c1d1-a50f-4a0d-8273-440e699b3907-kube-api-access-j8mds\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152733 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152944 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152997 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f3c0234-248a-4d6f-9aca-6582a481a911-serving-cert\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.153038 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 
11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.152908 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c0234-248a-4d6f-9aca-6582a481a911-config\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.153110 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/367ea4e8-8172-4730-8f26-499ed7c167bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.153152 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.153250 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.153327 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bea42722-d7af-4938-9f3a-da57bd056f96-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.153376 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d017409-a362-4e58-87af-d02b3834fdd4-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.153407 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-systemd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.153510 master-0 kubenswrapper[7642]: I0319 11:54:03.153488 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-log-socket\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.153707 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.153969 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea42722-d7af-4938-9f3a-da57bd056f96-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.153980 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d017409-a362-4e58-87af-d02b3834fdd4-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.154043 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38df702-a256-4f83-90bb-0c0e477a2ce6-trusted-ca\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.154118 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-os-release\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.154193 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e617e515-a899-41e8-9961-df68e49da4d3-ovn-node-metrics-cert\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.154250 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.154331 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2v9q\" (UniqueName: \"kubernetes.io/projected/5ae2d717-506b-4d8e-803b-689c1cc3557c-kube-api-access-s2v9q\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.154354 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07804be1-d37c-474e-a179-01ac541d9705-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.154381 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b0133cb5-e425-40af-a051-71b2362a89c3-host-etc-kube\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " 
pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:54:03.154472 master-0 kubenswrapper[7642]: I0319 11:54:03.154420 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e617e515-a899-41e8-9961-df68e49da4d3-ovn-node-metrics-cert\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.154850 master-0 kubenswrapper[7642]: I0319 11:54:03.154581 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-system-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.154850 master-0 kubenswrapper[7642]: I0319 11:54:03.154681 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-daemon-config\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.154850 master-0 kubenswrapper[7642]: I0319 11:54:03.154709 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:54:03.154850 master-0 kubenswrapper[7642]: I0319 11:54:03.154731 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cvpt\" (UniqueName: 
\"kubernetes.io/projected/3f3c0234-248a-4d6f-9aca-6582a481a911-kube-api-access-8cvpt\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:54:03.154850 master-0 kubenswrapper[7642]: I0319 11:54:03.154750 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d017409-a362-4e58-87af-d02b3834fdd4-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:54:03.154850 master-0 kubenswrapper[7642]: I0319 11:54:03.154771 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-config\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.155046 master-0 kubenswrapper[7642]: I0319 11:54:03.154939 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:54:03.155046 master-0 kubenswrapper[7642]: I0319 11:54:03.154980 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cnibin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " 
pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.155046 master-0 kubenswrapper[7642]: I0319 11:54:03.155008 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-config\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.155046 master-0 kubenswrapper[7642]: I0319 11:54:03.155012 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.155190 master-0 kubenswrapper[7642]: I0319 11:54:03.155058 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.155190 master-0 kubenswrapper[7642]: I0319 11:54:03.155072 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d017409-a362-4e58-87af-d02b3834fdd4-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:54:03.155190 master-0 kubenswrapper[7642]: I0319 11:54:03.155010 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-daemon-config\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.155190 master-0 kubenswrapper[7642]: I0319 11:54:03.155080 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:54:03.155190 master-0 kubenswrapper[7642]: I0319 11:54:03.155129 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:54:03.155190 master-0 kubenswrapper[7642]: I0319 11:54:03.155158 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-script-lib\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.155382 master-0 kubenswrapper[7642]: I0319 11:54:03.155199 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-os-release\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.155382 master-0 kubenswrapper[7642]: I0319 11:54:03.155249 7642 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-hostroot\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.155382 master-0 kubenswrapper[7642]: I0319 11:54:03.155275 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:03.155382 master-0 kubenswrapper[7642]: I0319 11:54:03.155351 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-env-overrides\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 11:54:03.155382 master-0 kubenswrapper[7642]: I0319 11:54:03.155377 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-bin\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.155544 master-0 kubenswrapper[7642]: I0319 11:54:03.155381 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " 
pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:54:03.155544 master-0 kubenswrapper[7642]: I0319 11:54:03.155402 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sprd\" (UniqueName: \"kubernetes.io/projected/131e2573-ef74-4963-bdcf-901310e7a486-kube-api-access-4sprd\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:54:03.155544 master-0 kubenswrapper[7642]: I0319 11:54:03.155417 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-script-lib\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.155544 master-0 kubenswrapper[7642]: I0319 11:54:03.155427 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:03.155544 master-0 kubenswrapper[7642]: I0319 11:54:03.155491 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-kubelet\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.155544 master-0 kubenswrapper[7642]: I0319 11:54:03.155517 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/78d3b2df-3d81-413e-b039-80dc20111eae-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:03.155763 master-0 kubenswrapper[7642]: I0319 11:54:03.155596 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-env-overrides\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 11:54:03.155763 master-0 kubenswrapper[7642]: I0319 11:54:03.155630 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 11:54:03.155763 master-0 kubenswrapper[7642]: I0319 11:54:03.155679 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51e2c1d1-a50f-4a0d-8273-440e699b3907-webhook-cert\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:54:03.155763 master-0 kubenswrapper[7642]: I0319 11:54:03.155705 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0133cb5-e425-40af-a051-71b2362a89c3-metrics-tls\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:54:03.155763 master-0 
kubenswrapper[7642]: I0319 11:54:03.155730 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-netd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.155933 master-0 kubenswrapper[7642]: I0319 11:54:03.155764 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-multus\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.155933 master-0 kubenswrapper[7642]: I0319 11:54:03.155796 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-node-log\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.155933 master-0 kubenswrapper[7642]: E0319 11:54:03.155813 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:54:03.155933 master-0 kubenswrapper[7642]: I0319 11:54:03.155842 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-kubelet\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.155933 master-0 kubenswrapper[7642]: I0319 11:54:03.155886 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cgtrh\" (UniqueName: \"kubernetes.io/projected/64136215-8539-4349-977f-6d59deb132ea-kube-api-access-cgtrh\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:54:03.155933 master-0 kubenswrapper[7642]: I0319 11:54:03.155879 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d3b2df-3d81-413e-b039-80dc20111eae-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:03.155933 master-0 kubenswrapper[7642]: I0319 11:54:03.155931 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-bound-sa-token\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:03.156154 master-0 kubenswrapper[7642]: E0319 11:54:03.155974 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.655945937 +0000 UTC m=+1.773386914 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found Mar 19 11:54:03.156154 master-0 kubenswrapper[7642]: I0319 11:54:03.155999 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-images\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.156154 master-0 kubenswrapper[7642]: I0319 11:54:03.156034 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2knz\" (UniqueName: \"kubernetes.io/projected/e90c10f6-36d9-44bd-a4c6-174f12135434-kube-api-access-r2knz\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.156154 master-0 kubenswrapper[7642]: I0319 11:54:03.156057 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-systemd-units\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.156154 master-0 kubenswrapper[7642]: I0319 11:54:03.156080 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-slash\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.156154 
master-0 kubenswrapper[7642]: I0319 11:54:03.156105 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea42722-d7af-4938-9f3a-da57bd056f96-config\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:54:03.156154 master-0 kubenswrapper[7642]: I0319 11:54:03.156128 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:54:03.156391 master-0 kubenswrapper[7642]: I0319 11:54:03.156155 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0133cb5-e425-40af-a051-71b2362a89c3-metrics-tls\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:54:03.156391 master-0 kubenswrapper[7642]: I0319 11:54:03.156150 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.156391 master-0 kubenswrapper[7642]: I0319 11:54:03.156204 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-ovn\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.156391 master-0 kubenswrapper[7642]: I0319 11:54:03.156278 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51e2c1d1-a50f-4a0d-8273-440e699b3907-webhook-cert\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:54:03.156391 master-0 kubenswrapper[7642]: I0319 11:54:03.156279 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kf8c\" (UniqueName: \"kubernetes.io/projected/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-kube-api-access-9kf8c\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.156391 master-0 kubenswrapper[7642]: I0319 11:54:03.156289 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-images\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.156391 master-0 kubenswrapper[7642]: I0319 11:54:03.156309 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64136215-8539-4349-977f-6d59deb132ea-iptables-alerter-script\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:54:03.156391 master-0 kubenswrapper[7642]: I0319 11:54:03.156364 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac946049-4b93-4d8c-bfa6-eacf83160ed1-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:54:03.156672 master-0 kubenswrapper[7642]: I0319 11:54:03.156414 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-env-overrides\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:54:03.156672 master-0 kubenswrapper[7642]: I0319 11:54:03.156439 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4ms4\" (UniqueName: \"kubernetes.io/projected/b0133cb5-e425-40af-a051-71b2362a89c3-kube-api-access-w4ms4\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:54:03.156672 master-0 kubenswrapper[7642]: I0319 11:54:03.156567 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea42722-d7af-4938-9f3a-da57bd056f96-config\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 11:54:03.156672 master-0 kubenswrapper[7642]: I0319 11:54:03.156586 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-env-overrides\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " 
pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 11:54:03.156818 master-0 kubenswrapper[7642]: I0319 11:54:03.156741 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-conf-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.156818 master-0 kubenswrapper[7642]: I0319 11:54:03.156773 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac946049-4b93-4d8c-bfa6-eacf83160ed1-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:54:03.156818 master-0 kubenswrapper[7642]: I0319 11:54:03.156796 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxm2c\" (UniqueName: \"kubernetes.io/projected/c479beae-6d34-4d49-81c8-39f6a53ff6ca-kube-api-access-xxm2c\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.156924 master-0 kubenswrapper[7642]: I0319 11:54:03.156825 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-etc-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.156924 master-0 kubenswrapper[7642]: I0319 11:54:03.156834 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/64136215-8539-4349-977f-6d59deb132ea-iptables-alerter-script\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:54:03.156924 master-0 kubenswrapper[7642]: I0319 11:54:03.156850 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-env-overrides\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.157099 master-0 kubenswrapper[7642]: I0319 11:54:03.156935 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac946049-4b93-4d8c-bfa6-eacf83160ed1-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:54:03.157099 master-0 kubenswrapper[7642]: I0319 11:54:03.156949 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.157099 master-0 kubenswrapper[7642]: I0319 11:54:03.156985 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 
11:54:03.157099 master-0 kubenswrapper[7642]: I0319 11:54:03.157017 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/367ea4e8-8172-4730-8f26-499ed7c167bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:54:03.157099 master-0 kubenswrapper[7642]: I0319 11:54:03.157044 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4qb9\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-kube-api-access-g4qb9\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:03.157099 master-0 kubenswrapper[7642]: I0319 11:54:03.157059 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-env-overrides\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.157099 master-0 kubenswrapper[7642]: I0319 11:54:03.157065 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-system-cni-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.157334 master-0 kubenswrapper[7642]: I0319 11:54:03.157108 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.157334 master-0 kubenswrapper[7642]: I0319 11:54:03.157133 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f99wt\" (UniqueName: \"kubernetes.io/projected/6d017409-a362-4e58-87af-d02b3834fdd4-kube-api-access-f99wt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:54:03.157334 master-0 kubenswrapper[7642]: I0319 11:54:03.157165 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 11:54:03.157334 master-0 kubenswrapper[7642]: I0319 11:54:03.157239 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64136215-8539-4349-977f-6d59deb132ea-host-slash\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:54:03.157334 master-0 kubenswrapper[7642]: I0319 11:54:03.157276 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cnibin\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.157334 master-0 kubenswrapper[7642]: I0319 11:54:03.157304 
7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-var-lib-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.157334 master-0 kubenswrapper[7642]: I0319 11:54:03.157198 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.157504 master-0 kubenswrapper[7642]: I0319 11:54:03.157242 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/367ea4e8-8172-4730-8f26-499ed7c167bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 11:54:03.157504 master-0 kubenswrapper[7642]: E0319 11:54:03.157371 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:54:03.157504 master-0 kubenswrapper[7642]: I0319 11:54:03.157384 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f846d9-1b3d-4b9d-b3e0-90a238e02640-kube-api-access\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:03.157582 master-0 kubenswrapper[7642]: E0319 11:54:03.157551 7642 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.65753187 +0000 UTC m=+1.774972657 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found Mar 19 11:54:03.167708 master-0 kubenswrapper[7642]: I0319 11:54:03.167650 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 11:54:03.174772 master-0 kubenswrapper[7642]: I0319 11:54:03.174665 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f3c0234-248a-4d6f-9aca-6582a481a911-serving-cert\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:54:03.205075 master-0 kubenswrapper[7642]: I0319 11:54:03.204998 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98qv\" (UniqueName: \"kubernetes.io/projected/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-kube-api-access-s98qv\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 11:54:03.207335 master-0 kubenswrapper[7642]: I0319 11:54:03.207295 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 11:54:03.208592 master-0 kubenswrapper[7642]: I0319 11:54:03.207718 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.212828 master-0 kubenswrapper[7642]: E0319 11:54:03.212782 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 19 11:54:03.213234 master-0 kubenswrapper[7642]: E0319 11:54:03.213200 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 11:54:03.213616 master-0 kubenswrapper[7642]: W0319 11:54:03.213532 7642 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 19 11:54:03.213706 master-0 kubenswrapper[7642]: E0319 11:54:03.213625 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:03.213925 master-0 kubenswrapper[7642]: E0319 11:54:03.213890 7642 kubelet.go:1929] "Failed 
creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:54:03.214208 master-0 kubenswrapper[7642]: E0319 11:54:03.214179 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:03.227519 master-0 kubenswrapper[7642]: I0319 11:54:03.227390 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" Mar 19 11:54:03.236374 master-0 kubenswrapper[7642]: I0319 11:54:03.236319 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.248191 master-0 kubenswrapper[7642]: I0319 11:54:03.248136 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 11:54:03.259374 master-0 kubenswrapper[7642]: I0319 11:54:03.259118 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-bin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.259374 master-0 kubenswrapper[7642]: I0319 11:54:03.259218 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-multus-certs\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " 
pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.259374 master-0 kubenswrapper[7642]: I0319 11:54:03.259306 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-multus-certs\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.259374 master-0 kubenswrapper[7642]: I0319 11:54:03.259328 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-bin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.259374 master-0 kubenswrapper[7642]: I0319 11:54:03.259375 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:03.259822 master-0 kubenswrapper[7642]: I0319 11:54:03.259450 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-netns\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.259822 master-0 kubenswrapper[7642]: I0319 11:54:03.259495 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" 
(UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:03.259822 master-0 kubenswrapper[7642]: E0319 11:54:03.259501 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:54:03.259822 master-0 kubenswrapper[7642]: I0319 11:54:03.259522 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-socket-dir-parent\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.259822 master-0 kubenswrapper[7642]: E0319 11:54:03.259580 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.759558393 +0000 UTC m=+1.876999180 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found Mar 19 11:54:03.259822 master-0 kubenswrapper[7642]: I0319 11:54:03.259680 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-netns\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.260054 master-0 kubenswrapper[7642]: I0319 11:54:03.259808 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.260054 master-0 kubenswrapper[7642]: I0319 11:54:03.259857 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-socket-dir-parent\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.260054 master-0 kubenswrapper[7642]: I0319 11:54:03.259904 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " 
pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:03.260054 master-0 kubenswrapper[7642]: I0319 11:54:03.259944 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-k8s-cni-cncf-io\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.260054 master-0 kubenswrapper[7642]: I0319 11:54:03.259992 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-k8s-cni-cncf-io\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.260331 master-0 kubenswrapper[7642]: E0319 11:54:03.260056 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 19 11:54:03.260331 master-0 kubenswrapper[7642]: I0319 11:54:03.260121 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.260331 master-0 kubenswrapper[7642]: E0319 11:54:03.260159 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.760121768 +0000 UTC m=+1.877562735 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found Mar 19 11:54:03.260331 master-0 kubenswrapper[7642]: I0319 11:54:03.260173 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.260331 master-0 kubenswrapper[7642]: E0319 11:54:03.260202 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:54:03.260331 master-0 kubenswrapper[7642]: E0319 11:54:03.260268 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.760247252 +0000 UTC m=+1.877688209 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:54:03.260331 master-0 kubenswrapper[7642]: I0319 11:54:03.260205 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-ssl-certs\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:03.260331 master-0 kubenswrapper[7642]: I0319 11:54:03.260297 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: I0319 11:54:03.260346 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: I0319 11:54:03.260391 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " 
pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: E0319 11:54:03.260419 7642 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: I0319 11:54:03.260458 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-netns\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: E0319 11:54:03.260477 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.760453838 +0000 UTC m=+1.877894645 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: I0319 11:54:03.260421 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: I0319 11:54:03.260503 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: E0319 11:54:03.260544 7642 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: I0319 11:54:03.260556 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-netns\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: I0319 11:54:03.260562 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-etc-kubernetes\") pod \"multus-p5wll\" (UID: 
\"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: E0319 11:54:03.260575 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.760565241 +0000 UTC m=+1.878006028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: I0319 11:54:03.260617 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-etc-kubernetes\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: E0319 11:54:03.260663 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:54:03.260660 master-0 kubenswrapper[7642]: I0319 11:54:03.260666 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: E0319 11:54:03.260736 7642 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.760710225 +0000 UTC m=+1.878151182 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: I0319 11:54:03.260756 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: I0319 11:54:03.260785 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: E0319 11:54:03.260840 7642 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: E0319 11:54:03.260868 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.760859279 +0000 UTC m=+1.878300066 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : secret "metrics-daemon-secret" not found
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: I0319 11:54:03.260891 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: I0319 11:54:03.260945 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-systemd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: I0319 11:54:03.260985 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-log-socket\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: I0319 11:54:03.261034 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-os-release\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: I0319 11:54:03.261052 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-log-socket\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: I0319 11:54:03.261146 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-os-release\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: I0319 11:54:03.261162 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:54:03.261187 master-0 kubenswrapper[7642]: I0319 11:54:03.261039 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-systemd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.261715 master-0 kubenswrapper[7642]: I0319 11:54:03.261236 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b0133cb5-e425-40af-a051-71b2362a89c3-host-etc-kube\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7"
Mar 19 11:54:03.261715 master-0 kubenswrapper[7642]: E0319 11:54:03.261255 7642 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 19 11:54:03.261715 master-0 kubenswrapper[7642]: I0319 11:54:03.261276 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-system-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.261715 master-0 kubenswrapper[7642]: E0319 11:54:03.261283 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.76127598 +0000 UTC m=+1.878716767 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found
Mar 19 11:54:03.261715 master-0 kubenswrapper[7642]: I0319 11:54:03.261327 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:54:03.261715 master-0 kubenswrapper[7642]: I0319 11:54:03.261392 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b0133cb5-e425-40af-a051-71b2362a89c3-host-etc-kube\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7"
Mar 19 11:54:03.261715 master-0 kubenswrapper[7642]: I0319 11:54:03.261433 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-system-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.261715 master-0 kubenswrapper[7642]: I0319 11:54:03.261541 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:54:03.261715 master-0 kubenswrapper[7642]: E0319 11:54:03.261560 7642 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 19 11:54:03.261715 master-0 kubenswrapper[7642]: E0319 11:54:03.261599 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.761589329 +0000 UTC m=+1.879030116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found
Mar 19 11:54:03.263424 master-0 kubenswrapper[7642]: I0319 11:54:03.263366 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cnibin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.263424 master-0 kubenswrapper[7642]: E0319 11:54:03.263420 7642 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: I0319 11:54:03.263427 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: E0319 11:54:03.263479 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.76346153 +0000 UTC m=+1.880902337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: I0319 11:54:03.263512 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: I0319 11:54:03.263527 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: I0319 11:54:03.263564 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-os-release\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: I0319 11:54:03.263565 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cnibin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: I0319 11:54:03.263613 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-hostroot\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: I0319 11:54:03.263653 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-os-release\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: I0319 11:54:03.263659 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-bin\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: E0319 11:54:03.263670 7642 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: I0319 11:54:03.263682 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-bin\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: I0319 11:54:03.263711 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-hostroot\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.263766 master-0 kubenswrapper[7642]: E0319 11:54:03.263725 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.763708777 +0000 UTC m=+1.881149584 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.263767 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: E0319 11:54:03.263809 7642 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.263810 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-kubelet\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.263848 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-netd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.263879 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-kubelet\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.263901 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-netd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: E0319 11:54:03.263914 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.763899762 +0000 UTC m=+1.881340569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.263951 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-multus\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.263974 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-node-log\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.264010 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-systemd-units\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.264037 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-kubelet\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.264072 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-kubelet\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.264086 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-node-log\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.264109 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-multus\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.264110 master-0 kubenswrapper[7642]: I0319 11:54:03.264092 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-systemd-units\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: I0319 11:54:03.264142 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-slash\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: I0319 11:54:03.264170 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: I0319 11:54:03.264210 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-slash\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: I0319 11:54:03.264209 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: I0319 11:54:03.264238 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-ovn\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: I0319 11:54:03.264278 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: I0319 11:54:03.264281 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-conf-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: I0319 11:54:03.264369 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-ovn\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: I0319 11:54:03.264419 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-conf-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: I0319 11:54:03.264474 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-etc-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264508 master-0 kubenswrapper[7642]: E0319 11:54:03.264480 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 11:54:03.264888 master-0 kubenswrapper[7642]: I0319 11:54:03.264541 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:54:03.264888 master-0 kubenswrapper[7642]: I0319 11:54:03.264510 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-cvo-updatepayloads\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:54:03.264888 master-0 kubenswrapper[7642]: I0319 11:54:03.264509 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-etc-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264888 master-0 kubenswrapper[7642]: E0319 11:54:03.264610 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:03.764582701 +0000 UTC m=+1.882023488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found
Mar 19 11:54:03.264888 master-0 kubenswrapper[7642]: I0319 11:54:03.264732 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-system-cni-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 11:54:03.264888 master-0 kubenswrapper[7642]: I0319 11:54:03.264758 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cnibin\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 11:54:03.264888 master-0 kubenswrapper[7642]: I0319 11:54:03.264779 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-var-lib-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.264888 master-0 kubenswrapper[7642]: I0319 11:54:03.264829 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cnibin\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 11:54:03.264888 master-0 kubenswrapper[7642]: I0319 11:54:03.264865 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64136215-8539-4349-977f-6d59deb132ea-host-slash\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j"
Mar 19 11:54:03.265214 master-0 kubenswrapper[7642]: I0319 11:54:03.264924 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-var-lib-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.265214 master-0 kubenswrapper[7642]: I0319 11:54:03.264954 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64136215-8539-4349-977f-6d59deb132ea-host-slash\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j"
Mar 19 11:54:03.265214 master-0 kubenswrapper[7642]: I0319 11:54:03.264965 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-system-cni-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 11:54:03.266271 master-0 kubenswrapper[7642]: I0319 11:54:03.266239 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 11:54:03.306844 master-0 kubenswrapper[7642]: I0319 11:54:03.306771 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9vzl\" (UniqueName: \"kubernetes.io/projected/84e5577e-df8b-46a2-aff7-0bf90b5010d9-kube-api-access-k9vzl\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 11:54:03.319838 master-0 kubenswrapper[7642]: I0319 11:54:03.319763 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssz9\" (UniqueName: \"kubernetes.io/projected/4705c8f3-89d4-42f2-ad14-e591c5380b9e-kube-api-access-bssz9\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:54:03.327581 master-0 kubenswrapper[7642]: I0319 11:54:03.326346 7642 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 11:54:03.344956 master-0 kubenswrapper[7642]: I0319 11:54:03.344128 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h95pm\" (UniqueName: \"kubernetes.io/projected/162b513f-2f67-4946-816b-48f6232dfca0-kube-api-access-h95pm\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:54:03.360471 master-0 kubenswrapper[7642]: I0319 11:54:03.360430 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7dp9\" (UniqueName: \"kubernetes.io/projected/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-kube-api-access-f7dp9\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 11:54:03.399668 master-0 kubenswrapper[7642]: I0319 11:54:03.399568 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdmj8\" (UniqueName: \"kubernetes.io/projected/6e8c4676-05af-4478-a44b-1bdcbefdea0f-kube-api-access-pdmj8\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:03.400308 master-0 kubenswrapper[7642]: I0319 11:54:03.400267 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwcn\" (UniqueName: \"kubernetes.io/projected/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-kube-api-access-rbwcn\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht"
Mar 19 11:54:03.423062 master-0 kubenswrapper[7642]: I0319 11:54:03.422985 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd875\" (UniqueName: \"kubernetes.io/projected/99ffcc9a-4b01-439a-93e0-0674bdfd32ec-kube-api-access-jd875\") pod \"csi-snapshot-controller-operator-5f5d689c6b-8nptp\" (UID: \"99ffcc9a-4b01-439a-93e0-0674bdfd32ec\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp"
Mar 19 11:54:03.440922 master-0 kubenswrapper[7642]: I0319 11:54:03.440849 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scp9p\" (UniqueName: \"kubernetes.io/projected/aab47a39-a18b-4350-b141-4d5de2d8e96f-kube-api-access-scp9p\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 11:54:03.469578 master-0 kubenswrapper[7642]: I0319 11:54:03.469205 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwp24\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-kube-api-access-bwp24\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:54:03.479256 master-0 kubenswrapper[7642]: I0319 11:54:03.479217 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khlg\" (UniqueName: \"kubernetes.io/projected/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-kube-api-access-4khlg\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 11:54:03.498595 master-0 kubenswrapper[7642]: I0319 11:54:03.498525 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4kn\" (UniqueName: \"kubernetes.io/projected/78d3b2df-3d81-413e-b039-80dc20111eae-kube-api-access-mm4kn\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:03.520001 master-0 kubenswrapper[7642]: I0319 11:54:03.519836 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7qsg\" (UniqueName: \"kubernetes.io/projected/33f6e54f-a750-4c91-b4f5-049275d33cfb-kube-api-access-f7qsg\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:54:03.545141 master-0 kubenswrapper[7642]: I0319 11:54:03.545098 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bptbc\" (UniqueName: \"kubernetes.io/projected/e617e515-a899-41e8-9961-df68e49da4d3-kube-api-access-bptbc\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 11:54:03.558694 master-0 kubenswrapper[7642]: I0319 11:54:03.558628 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgr4k\" (UniqueName: \"kubernetes.io/projected/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-kube-api-access-mgr4k\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:54:03.579935 master-0 kubenswrapper[7642]: I0319 11:54:03.579859 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6rhj\" (UniqueName: \"kubernetes.io/projected/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-kube-api-access-m6rhj\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 11:54:03.607552 master-0 kubenswrapper[7642]: I0319 11:54:03.607336 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fmb\" (UniqueName: \"kubernetes.io/projected/39f479b3-45ff-43f9-ae85-083554a54918-kube-api-access-79fmb\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:54:03.619669 master-0 kubenswrapper[7642]: I0319 11:54:03.619606 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8mds\" (UniqueName: \"kubernetes.io/projected/51e2c1d1-a50f-4a0d-8273-440e699b3907-kube-api-access-j8mds\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68"
Mar 19 11:54:03.647577 master-0 kubenswrapper[7642]: I0319 11:54:03.647522 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bea42722-d7af-4938-9f3a-da57bd056f96-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj"
Mar 19 11:54:03.663385 master-0 kubenswrapper[7642]: I0319 11:54:03.663344 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/367ea4e8-8172-4730-8f26-499ed7c167bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh"
Mar 19 11:54:03.679547 master-0 kubenswrapper[7642]: I0319 11:54:03.679482 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2v9q\" (UniqueName: \"kubernetes.io/projected/5ae2d717-506b-4d8e-803b-689c1cc3557c-kube-api-access-s2v9q\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:54:03.680165 master-0 kubenswrapper[7642]: I0319 11:54:03.680135 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:03.680441 master-0 kubenswrapper[7642]: E0319 11:54:03.680368 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 11:54:03.680558 master-0 kubenswrapper[7642]: E0319 11:54:03.680534 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.680507981 +0000 UTC m=+2.797948948 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found Mar 19 11:54:03.680619 master-0 kubenswrapper[7642]: I0319 11:54:03.680598 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 11:54:03.680883 master-0 kubenswrapper[7642]: E0319 11:54:03.680837 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:54:03.680976 master-0 kubenswrapper[7642]: E0319 11:54:03.680942 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.680916382 +0000 UTC m=+2.798357179 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found Mar 19 11:54:03.703239 master-0 kubenswrapper[7642]: I0319 11:54:03.703193 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cvpt\" (UniqueName: \"kubernetes.io/projected/3f3c0234-248a-4d6f-9aca-6582a481a911-kube-api-access-8cvpt\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 11:54:03.721252 master-0 kubenswrapper[7642]: I0319 11:54:03.721190 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:03.742148 master-0 kubenswrapper[7642]: I0319 11:54:03.740766 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sprd\" (UniqueName: \"kubernetes.io/projected/131e2573-ef74-4963-bdcf-901310e7a486-kube-api-access-4sprd\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:54:03.758204 master-0 kubenswrapper[7642]: I0319 11:54:03.758157 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgtrh\" (UniqueName: \"kubernetes.io/projected/64136215-8539-4349-977f-6d59deb132ea-kube-api-access-cgtrh\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " 
pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 11:54:03.778980 master-0 kubenswrapper[7642]: I0319 11:54:03.778793 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-bound-sa-token\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:03.781740 master-0 kubenswrapper[7642]: I0319 11:54:03.781703 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:54:03.781836 master-0 kubenswrapper[7642]: I0319 11:54:03.781746 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:54:03.781836 master-0 kubenswrapper[7642]: I0319 11:54:03.781778 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:03.781943 master-0 kubenswrapper[7642]: E0319 11:54:03.781916 7642 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:54:03.782005 master-0 kubenswrapper[7642]: E0319 11:54:03.781994 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.781972829 +0000 UTC m=+2.899413806 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:54:03.782071 master-0 kubenswrapper[7642]: I0319 11:54:03.782046 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:54:03.782190 master-0 kubenswrapper[7642]: E0319 11:54:03.782134 7642 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:54:03.782269 master-0 kubenswrapper[7642]: E0319 11:54:03.782250 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.782226846 +0000 UTC m=+2.899667813 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found Mar 19 11:54:03.782325 master-0 kubenswrapper[7642]: E0319 11:54:03.782285 7642 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:54:03.782452 master-0 kubenswrapper[7642]: E0319 11:54:03.782422 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.782392151 +0000 UTC m=+2.899833078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found Mar 19 11:54:03.782557 master-0 kubenswrapper[7642]: I0319 11:54:03.782532 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:03.782592 master-0 kubenswrapper[7642]: E0319 11:54:03.782540 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:54:03.782592 master-0 kubenswrapper[7642]: I0319 11:54:03.782579 7642 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:03.782682 master-0 kubenswrapper[7642]: E0319 11:54:03.782624 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:54:03.782682 master-0 kubenswrapper[7642]: E0319 11:54:03.782670 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.782620167 +0000 UTC m=+2.900060954 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found Mar 19 11:54:03.782794 master-0 kubenswrapper[7642]: E0319 11:54:03.782776 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.782763071 +0000 UTC m=+2.900203858 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found Mar 19 11:54:03.783834 master-0 kubenswrapper[7642]: I0319 11:54:03.783801 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.783908 master-0 kubenswrapper[7642]: I0319 11:54:03.783854 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:03.783908 master-0 kubenswrapper[7642]: E0319 11:54:03.783884 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:54:03.783988 master-0 kubenswrapper[7642]: E0319 11:54:03.783945 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.783928463 +0000 UTC m=+2.901369430 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:54:03.783988 master-0 kubenswrapper[7642]: I0319 11:54:03.783886 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:54:03.783988 master-0 kubenswrapper[7642]: E0319 11:54:03.783972 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 19 11:54:03.784125 master-0 kubenswrapper[7642]: I0319 11:54:03.784036 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.784125 master-0 kubenswrapper[7642]: E0319 11:54:03.784059 7642 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:54:03.784125 master-0 kubenswrapper[7642]: E0319 11:54:03.784098 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.784087267 +0000 UTC m=+2.901528054 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found Mar 19 11:54:03.784237 master-0 kubenswrapper[7642]: I0319 11:54:03.784134 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:54:03.784237 master-0 kubenswrapper[7642]: E0319 11:54:03.784165 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:54:03.784237 master-0 kubenswrapper[7642]: I0319 11:54:03.784178 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:03.784237 master-0 kubenswrapper[7642]: E0319 11:54:03.784188 7642 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:54:03.784237 master-0 kubenswrapper[7642]: E0319 11:54:03.784205 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.78419288 +0000 UTC m=+2.901633857 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:54:03.784237 master-0 kubenswrapper[7642]: E0319 11:54:03.784228 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.784217831 +0000 UTC m=+2.901658618 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found Mar 19 11:54:03.784237 master-0 kubenswrapper[7642]: E0319 11:54:03.784234 7642 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:54:03.784605 master-0 kubenswrapper[7642]: E0319 11:54:03.784285 7642 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:54:03.784605 master-0 kubenswrapper[7642]: I0319 11:54:03.784327 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:54:03.784605 master-0 kubenswrapper[7642]: E0319 11:54:03.784337 7642 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.784324704 +0000 UTC m=+2.901765721 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : secret "metrics-daemon-secret" not found Mar 19 11:54:03.784605 master-0 kubenswrapper[7642]: E0319 11:54:03.784415 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.784401826 +0000 UTC m=+2.901842783 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:54:03.784605 master-0 kubenswrapper[7642]: E0319 11:54:03.784414 7642 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:54:03.784605 master-0 kubenswrapper[7642]: E0319 11:54:03.784435 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.784424747 +0000 UTC m=+2.901865534 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found Mar 19 11:54:03.784605 master-0 kubenswrapper[7642]: E0319 11:54:03.784475 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:04.784459548 +0000 UTC m=+2.901900545 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found Mar 19 11:54:03.803777 master-0 kubenswrapper[7642]: I0319 11:54:03.803706 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2knz\" (UniqueName: \"kubernetes.io/projected/e90c10f6-36d9-44bd-a4c6-174f12135434-kube-api-access-r2knz\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:03.817583 master-0 kubenswrapper[7642]: I0319 11:54:03.817516 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac946049-4b93-4d8c-bfa6-eacf83160ed1-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 11:54:03.842979 master-0 
kubenswrapper[7642]: I0319 11:54:03.842291 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4ms4\" (UniqueName: \"kubernetes.io/projected/b0133cb5-e425-40af-a051-71b2362a89c3-kube-api-access-w4ms4\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 11:54:03.861548 master-0 kubenswrapper[7642]: I0319 11:54:03.861488 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kf8c\" (UniqueName: \"kubernetes.io/projected/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-kube-api-access-9kf8c\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 11:54:03.879620 master-0 kubenswrapper[7642]: I0319 11:54:03.879571 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxm2c\" (UniqueName: \"kubernetes.io/projected/c479beae-6d34-4d49-81c8-39f6a53ff6ca-kube-api-access-xxm2c\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 11:54:03.897654 master-0 kubenswrapper[7642]: I0319 11:54:03.897605 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f99wt\" (UniqueName: \"kubernetes.io/projected/6d017409-a362-4e58-87af-d02b3834fdd4-kube-api-access-f99wt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 11:54:03.919608 master-0 kubenswrapper[7642]: I0319 11:54:03.919544 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4qb9\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-kube-api-access-g4qb9\") pod 
\"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:03.942909 master-0 kubenswrapper[7642]: I0319 11:54:03.942864 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f846d9-1b3d-4b9d-b3e0-90a238e02640-kube-api-access\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:03.949228 master-0 kubenswrapper[7642]: I0319 11:54:03.949197 7642 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 19 11:54:03.949440 master-0 kubenswrapper[7642]: I0319 11:54:03.949276 7642 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 19 11:54:03.999877 master-0 kubenswrapper[7642]: I0319 11:54:03.999800 7642 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 11:54:04.007491 master-0 kubenswrapper[7642]: I0319 11:54:04.007124 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:54:04.047267 master-0 kubenswrapper[7642]: E0319 11:54:04.047099 7642 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263" Mar 19 11:54:04.047882 master-0 kubenswrapper[7642]: E0319 11:54:04.047419 7642 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:service-ca-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263,Command:[service-ca-operator operator],Args:[--config=/var/run/configmaps/config/operator-config.yaml -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{83886080 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cvpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-ca-operator-b865698dc-czzvw_openshift-service-ca-operator(3f3c0234-248a-4d6f-9aca-6582a481a911): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 11:54:04.048722 master-0 kubenswrapper[7642]: E0319 11:54:04.048653 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" podUID="3f3c0234-248a-4d6f-9aca-6582a481a911" Mar 19 11:54:04.058382 master-0 kubenswrapper[7642]: I0319 11:54:04.058316 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:04.244507 master-0 kubenswrapper[7642]: I0319 11:54:04.244415 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:54:04.663169 master-0 kubenswrapper[7642]: E0319 11:54:04.662691 7642 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69" Mar 19 11:54:04.663271 master-0 kubenswrapper[7642]: E0319 11:54:04.663169 7642 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openshift-config-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69,Command:[cluster-config-operator operator --operator-version=$(OPERATOR_IMAGE_VERSION) --authoritative-feature-gate-dir=/available-featuregates],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:available-featuregates,ReadOnly:false,MountPath:/available-featuregates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bssz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:1,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:1,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-config-operator-95bf4f4d-9bqjr_openshift-config-operator(4705c8f3-89d4-42f2-ad14-e591c5380b9e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Mar 19 11:54:04.664898 master-0 kubenswrapper[7642]: E0319 11:54:04.664579 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" podUID="4705c8f3-89d4-42f2-ad14-e591c5380b9e" Mar 19 11:54:04.694524 master-0 kubenswrapper[7642]: I0319 11:54:04.694433 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 11:54:04.694524 master-0 kubenswrapper[7642]: I0319 11:54:04.694530 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 11:54:04.694847 master-0 kubenswrapper[7642]: E0319 11:54:04.694819 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:54:04.694941 master-0 kubenswrapper[7642]: E0319 11:54:04.694882 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.694859836 +0000 UTC m=+4.812300633 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found Mar 19 11:54:04.695000 master-0 kubenswrapper[7642]: E0319 11:54:04.694959 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:54:04.695149 master-0 kubenswrapper[7642]: E0319 11:54:04.695101 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.695064511 +0000 UTC m=+4.812505298 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found Mar 19 11:54:04.796318 master-0 kubenswrapper[7642]: I0319 11:54:04.796246 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:04.796734 master-0 kubenswrapper[7642]: I0319 11:54:04.796442 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: 
\"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:54:04.796734 master-0 kubenswrapper[7642]: I0319 11:54:04.796492 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:04.796734 master-0 kubenswrapper[7642]: E0319 11:54:04.796494 7642 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:54:04.796734 master-0 kubenswrapper[7642]: E0319 11:54:04.796597 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:54:04.796734 master-0 kubenswrapper[7642]: I0319 11:54:04.796517 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:04.796734 master-0 kubenswrapper[7642]: E0319 11:54:04.796669 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:54:04.796734 master-0 kubenswrapper[7642]: E0319 11:54:04.796676 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls 
podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.796624962 +0000 UTC m=+4.914065749 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found Mar 19 11:54:04.796734 master-0 kubenswrapper[7642]: E0319 11:54:04.796738 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.796719665 +0000 UTC m=+4.914160452 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: E0319 11:54:04.796759 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.796748566 +0000 UTC m=+4.914189353 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: I0319 11:54:04.796812 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: E0319 11:54:04.796843 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: I0319 11:54:04.796880 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: I0319 11:54:04.796940 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: E0319 11:54:04.796973 7642 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.796938811 +0000 UTC m=+4.914379778 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: E0319 11:54:04.797016 7642 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: I0319 11:54:04.797016 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: E0319 11:54:04.797046 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.797036374 +0000 UTC m=+4.914477161 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: I0319 11:54:04.797071 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:54:04.797082 master-0 kubenswrapper[7642]: E0319 11:54:04.797095 7642 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: I0319 11:54:04.797117 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797145 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.797137446 +0000 UTC m=+4.914578233 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: I0319 11:54:04.797163 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797192 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: I0319 11:54:04.797203 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797214 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.797208208 +0000 UTC m=+4.914648995 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: I0319 11:54:04.797239 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797265 7642 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797296 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.79728067 +0000 UTC m=+4.914721457 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : secret "metrics-daemon-secret" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797341 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797363 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.797355252 +0000 UTC m=+4.914796029 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797405 7642 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797408 7642 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797437 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. 
No retries permitted until 2026-03-19 11:54:06.797428224 +0000 UTC m=+4.914869011 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797472 7642 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797472 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.797463035 +0000 UTC m=+4.914904012 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797498 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.797491496 +0000 UTC m=+4.914932283 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797496 7642 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:54:04.797624 master-0 kubenswrapper[7642]: E0319 11:54:04.797546 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. No retries permitted until 2026-03-19 11:54:06.797530957 +0000 UTC m=+4.914971754 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found Mar 19 11:54:04.902922 master-0 kubenswrapper[7642]: I0319 11:54:04.902856 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:04.940203 master-0 kubenswrapper[7642]: I0319 11:54:04.940070 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:05.170988 master-0 kubenswrapper[7642]: I0319 11:54:05.170937 7642 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:54:05.170988 master-0 kubenswrapper[7642]: I0319 11:54:05.170977 7642 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:54:05.423557 master-0 kubenswrapper[7642]: E0319 11:54:05.423442 7642 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a" Mar 19 11:54:05.424353 master-0 kubenswrapper[7642]: E0319 11:54:05.423808 7642 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:etcd-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a,Command:[cluster-etcd-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml --terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key --terminate-on-files=/var/run/secrets/etcd-client/tls.crt --terminate-on-files=/var/run/secrets/etcd-client/tls.key --terminate-on-files=/var/run/configmaps/etcd-ca/ca-bundle.crt --terminate-on-files=/var/run/configmaps/etcd-service-ca/service-ca.crt],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:OPENSHIFT_PROFILE,Value:web,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-service-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-client,ReadOnly:false,MountPath:/var/run/secrets/etcd-client,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-scp9p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
etcd-operator-8544cbcf9c-bj75g_openshift-etcd-operator(aab47a39-a18b-4350-b141-4d5de2d8e96f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 11:54:05.425503 master-0 kubenswrapper[7642]: E0319 11:54:05.425412 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" podUID="aab47a39-a18b-4350-b141-4d5de2d8e96f" Mar 19 11:54:05.575528 master-0 kubenswrapper[7642]: I0319 11:54:05.575438 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:05.581871 master-0 kubenswrapper[7642]: I0319 11:54:05.581820 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:06.003416 master-0 kubenswrapper[7642]: E0319 11:54:06.003343 7642 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023" Mar 19 11:54:06.003607 master-0 kubenswrapper[7642]: E0319 11:54:06.003536 7642 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:copy-operator-controller-manifests,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023,Command:[/bin/sh],Args:[-c cp -a /openshift/manifests 
/operand-assets/operator-controller],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:operand-assets,ReadOnly:false,MountPath:/operand-assets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f7dp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000380000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cluster-olm-operator-67dcd4998-54pv6_openshift-cluster-olm-operator(50adfdd3-1d4d-47d2-a35b-cd028d66b4c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 11:54:06.005169 master-0 kubenswrapper[7642]: E0319 11:54:06.005128 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"copy-operator-controller-manifests\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" podUID="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" Mar 19 11:54:06.204695 master-0 kubenswrapper[7642]: I0319 11:54:06.204115 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-target-tz4qg"] Mar 19 11:54:06.589053 master-0 kubenswrapper[7642]: I0319 11:54:06.588843 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:06.594381 master-0 kubenswrapper[7642]: I0319 11:54:06.594324 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 11:54:06.723374 master-0 kubenswrapper[7642]: I0319 11:54:06.723263 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 11:54:06.723660 master-0 kubenswrapper[7642]: E0319 11:54:06.723517 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:54:06.723837 master-0 kubenswrapper[7642]: E0319 11:54:06.723766 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.72373417 +0000 UTC m=+8.841175137 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found Mar 19 11:54:06.724126 master-0 kubenswrapper[7642]: I0319 11:54:06.723871 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 11:54:06.724126 master-0 kubenswrapper[7642]: E0319 11:54:06.724037 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 19 11:54:06.724126 master-0 kubenswrapper[7642]: E0319 11:54:06.724104 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.72409548 +0000 UTC m=+8.841536467 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found Mar 19 11:54:06.825214 master-0 kubenswrapper[7642]: I0319 11:54:06.825086 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:06.825545 master-0 kubenswrapper[7642]: E0319 11:54:06.825326 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 19 11:54:06.825545 master-0 kubenswrapper[7642]: E0319 11:54:06.825436 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.825413023 +0000 UTC m=+8.942853810 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found Mar 19 11:54:06.825545 master-0 kubenswrapper[7642]: I0319 11:54:06.825481 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:06.825545 master-0 kubenswrapper[7642]: I0319 11:54:06.825535 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:54:06.825758 master-0 kubenswrapper[7642]: I0319 11:54:06.825567 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:06.825758 master-0 kubenswrapper[7642]: I0319 11:54:06.825593 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " 
pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:54:06.826384 master-0 kubenswrapper[7642]: I0319 11:54:06.826333 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:06.826464 master-0 kubenswrapper[7642]: I0319 11:54:06.826413 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:54:06.826464 master-0 kubenswrapper[7642]: I0319 11:54:06.826454 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:54:06.826550 master-0 kubenswrapper[7642]: I0319 11:54:06.826483 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:54:06.826550 master-0 kubenswrapper[7642]: E0319 11:54:06.826520 7642 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" 
not found Mar 19 11:54:06.826550 master-0 kubenswrapper[7642]: I0319 11:54:06.826531 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:06.826728 master-0 kubenswrapper[7642]: E0319 11:54:06.826607 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.826585545 +0000 UTC m=+8.944026332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found Mar 19 11:54:06.826728 master-0 kubenswrapper[7642]: I0319 11:54:06.826655 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:54:06.826728 master-0 kubenswrapper[7642]: E0319 11:54:06.826693 7642 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:54:06.826864 master-0 kubenswrapper[7642]: E0319 11:54:06.826737 7642 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.826720259 +0000 UTC m=+8.944161046 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found Mar 19 11:54:06.826864 master-0 kubenswrapper[7642]: E0319 11:54:06.826810 7642 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:54:06.826864 master-0 kubenswrapper[7642]: E0319 11:54:06.826841 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.826832832 +0000 UTC m=+8.944273619 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:54:06.826864 master-0 kubenswrapper[7642]: E0319 11:54:06.826801 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:54:06.827083 master-0 kubenswrapper[7642]: E0319 11:54:06.826875 7642 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:54:06.827083 master-0 kubenswrapper[7642]: E0319 11:54:06.826882 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.826875843 +0000 UTC m=+8.944316630 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found Mar 19 11:54:06.827083 master-0 kubenswrapper[7642]: E0319 11:54:06.826908 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.826895474 +0000 UTC m=+8.944336251 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:54:06.827083 master-0 kubenswrapper[7642]: E0319 11:54:06.826954 7642 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:54:06.827083 master-0 kubenswrapper[7642]: E0319 11:54:06.826981 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.826973866 +0000 UTC m=+8.944414653 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found Mar 19 11:54:06.827083 master-0 kubenswrapper[7642]: E0319 11:54:06.826956 7642 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:54:06.827083 master-0 kubenswrapper[7642]: E0319 11:54:06.827006 7642 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:54:06.827083 master-0 kubenswrapper[7642]: E0319 11:54:06.827014 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.827007627 +0000 UTC m=+8.944448414 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found Mar 19 11:54:06.827083 master-0 kubenswrapper[7642]: E0319 11:54:06.827041 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.827030438 +0000 UTC m=+8.944471225 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found Mar 19 11:54:06.827083 master-0 kubenswrapper[7642]: E0319 11:54:06.827074 7642 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:54:06.827618 master-0 kubenswrapper[7642]: E0319 11:54:06.827108 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.827092459 +0000 UTC m=+8.944533246 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : secret "metrics-daemon-secret" not found Mar 19 11:54:06.827618 master-0 kubenswrapper[7642]: I0319 11:54:06.827212 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:06.827618 master-0 kubenswrapper[7642]: I0319 11:54:06.827242 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:06.827618 master-0 kubenswrapper[7642]: E0319 11:54:06.827319 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:54:06.827618 master-0 kubenswrapper[7642]: E0319 11:54:06.827354 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.827344936 +0000 UTC m=+8.944785723 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:54:06.827618 master-0 kubenswrapper[7642]: E0319 11:54:06.827363 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:54:06.827618 master-0 kubenswrapper[7642]: E0319 11:54:06.827404 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.827388607 +0000 UTC m=+8.944829394 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:54:06.827618 master-0 kubenswrapper[7642]: E0319 11:54:06.827421 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:54:06.827618 master-0 kubenswrapper[7642]: E0319 11:54:06.827452 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.827445739 +0000 UTC m=+8.944886526 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found Mar 19 11:54:07.075352 master-0 kubenswrapper[7642]: I0319 11:54:07.074903 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:07.182645 master-0 kubenswrapper[7642]: I0319 11:54:07.182552 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerStarted","Data":"d611dd027fd146af69473e5540a6fda94c8c6de39cd053174307e17bd337e0ce"} Mar 19 11:54:07.184468 master-0 kubenswrapper[7642]: I0319 11:54:07.184390 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerStarted","Data":"dd2920bde9ceef973c55e7d085b8480289be3211f9caba169d896a74981242f4"} Mar 19 11:54:07.185737 master-0 kubenswrapper[7642]: I0319 11:54:07.185671 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" event={"ID":"162b513f-2f67-4946-816b-48f6232dfca0","Type":"ContainerStarted","Data":"a82bb27ad062e4cf63ae115331a67a108d2b42bc563ab5c88f2c0b48ab45fdcd"} Mar 19 11:54:07.186900 master-0 kubenswrapper[7642]: I0319 11:54:07.186842 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" event={"ID":"bea42722-d7af-4938-9f3a-da57bd056f96","Type":"ContainerStarted","Data":"15fc833de80b2cae87e3ef1229c5aae30a9051318a9e0831bf339fc664c1dd1c"} Mar 
19 11:54:07.188006 master-0 kubenswrapper[7642]: I0319 11:54:07.187964 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" event={"ID":"d263ddac-1ede-474f-b3bb-1f1f4f7846a3","Type":"ContainerStarted","Data":"1bd7cb8f485e09b49dad795958f587384c11e7992bcea8c49f1cbb606f9399d9"} Mar 19 11:54:07.189449 master-0 kubenswrapper[7642]: I0319 11:54:07.189374 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tz4qg" event={"ID":"65b629a3-ea10-4517-b762-b18b22fab2a9","Type":"ContainerStarted","Data":"a1b2ed2ffd2327cf8b909688fcb33a183368625a173ed7a0fd40651897a3bae2"} Mar 19 11:54:07.189449 master-0 kubenswrapper[7642]: I0319 11:54:07.189450 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tz4qg" event={"ID":"65b629a3-ea10-4517-b762-b18b22fab2a9","Type":"ContainerStarted","Data":"ef597ecc3cb239a3f4a10a1a4b99299b44a320ef76f5f894771c1233b3937cda"} Mar 19 11:54:07.190487 master-0 kubenswrapper[7642]: I0319 11:54:07.190438 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp" event={"ID":"99ffcc9a-4b01-439a-93e0-0674bdfd32ec","Type":"ContainerStarted","Data":"a148f6c3b7285c6b23a088afec6a5093a43e8ac079cf43adfcb33eec79989b03"} Mar 19 11:54:07.195332 master-0 kubenswrapper[7642]: I0319 11:54:07.195253 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" event={"ID":"6d017409-a362-4e58-87af-d02b3834fdd4","Type":"ContainerStarted","Data":"57848ee43bd88e9ed1ccc7c15c3a6279821a813bcb2340426181dccb4cafcc2a"} Mar 19 11:54:07.602179 master-0 kubenswrapper[7642]: I0319 11:54:07.602116 7642 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw"] Mar 19 11:54:07.603246 master-0 kubenswrapper[7642]: E0319 11:54:07.602324 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6568c544-a539-4d50-89f0-1705a5f63411" containerName="prober" Mar 19 11:54:07.603246 master-0 kubenswrapper[7642]: I0319 11:54:07.602337 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="6568c544-a539-4d50-89f0-1705a5f63411" containerName="prober" Mar 19 11:54:07.603246 master-0 kubenswrapper[7642]: E0319 11:54:07.602348 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2452a909-6aa1-4750-9399-5945d380427a" containerName="assisted-installer-controller" Mar 19 11:54:07.603246 master-0 kubenswrapper[7642]: I0319 11:54:07.602355 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="2452a909-6aa1-4750-9399-5945d380427a" containerName="assisted-installer-controller" Mar 19 11:54:07.603246 master-0 kubenswrapper[7642]: I0319 11:54:07.602426 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="6568c544-a539-4d50-89f0-1705a5f63411" containerName="prober" Mar 19 11:54:07.603246 master-0 kubenswrapper[7642]: I0319 11:54:07.602435 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="2452a909-6aa1-4750-9399-5945d380427a" containerName="assisted-installer-controller" Mar 19 11:54:07.603246 master-0 kubenswrapper[7642]: I0319 11:54:07.602788 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" Mar 19 11:54:07.614592 master-0 kubenswrapper[7642]: I0319 11:54:07.614523 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw"] Mar 19 11:54:07.694824 master-0 kubenswrapper[7642]: I0319 11:54:07.694739 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-52lv2"] Mar 19 11:54:07.695347 master-0 kubenswrapper[7642]: I0319 11:54:07.695323 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2" Mar 19 11:54:07.703624 master-0 kubenswrapper[7642]: I0319 11:54:07.703425 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 11:54:07.704758 master-0 kubenswrapper[7642]: I0319 11:54:07.704731 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 11:54:07.710742 master-0 kubenswrapper[7642]: I0319 11:54:07.710696 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-52lv2"] Mar 19 11:54:07.738223 master-0 kubenswrapper[7642]: I0319 11:54:07.738155 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxjkt\" (UniqueName: \"kubernetes.io/projected/0c804ea0-3483-4506-b138-3bf30016d8d8-kube-api-access-pxjkt\") pod \"csi-snapshot-controller-64854d9cff-6dlpw\" (UID: \"0c804ea0-3483-4506-b138-3bf30016d8d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" Mar 19 11:54:07.839387 master-0 kubenswrapper[7642]: I0319 11:54:07.839301 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-pxjkt\" (UniqueName: \"kubernetes.io/projected/0c804ea0-3483-4506-b138-3bf30016d8d8-kube-api-access-pxjkt\") pod \"csi-snapshot-controller-64854d9cff-6dlpw\" (UID: \"0c804ea0-3483-4506-b138-3bf30016d8d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" Mar 19 11:54:07.839856 master-0 kubenswrapper[7642]: I0319 11:54:07.839804 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh5cs\" (UniqueName: \"kubernetes.io/projected/4d790b42-2182-4b9c-8391-285e938715cc-kube-api-access-zh5cs\") pod \"migrator-8487694857-52lv2\" (UID: \"4d790b42-2182-4b9c-8391-285e938715cc\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2" Mar 19 11:54:07.865165 master-0 kubenswrapper[7642]: I0319 11:54:07.865034 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxjkt\" (UniqueName: \"kubernetes.io/projected/0c804ea0-3483-4506-b138-3bf30016d8d8-kube-api-access-pxjkt\") pod \"csi-snapshot-controller-64854d9cff-6dlpw\" (UID: \"0c804ea0-3483-4506-b138-3bf30016d8d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" Mar 19 11:54:07.921209 master-0 kubenswrapper[7642]: I0319 11:54:07.921123 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw"
Mar 19 11:54:07.942717 master-0 kubenswrapper[7642]: I0319 11:54:07.941356 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5cs\" (UniqueName: \"kubernetes.io/projected/4d790b42-2182-4b9c-8391-285e938715cc-kube-api-access-zh5cs\") pod \"migrator-8487694857-52lv2\" (UID: \"4d790b42-2182-4b9c-8391-285e938715cc\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2"
Mar 19 11:54:07.967409 master-0 kubenswrapper[7642]: I0319 11:54:07.967339 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5cs\" (UniqueName: \"kubernetes.io/projected/4d790b42-2182-4b9c-8391-285e938715cc-kube-api-access-zh5cs\") pod \"migrator-8487694857-52lv2\" (UID: \"4d790b42-2182-4b9c-8391-285e938715cc\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2"
Mar 19 11:54:08.014660 master-0 kubenswrapper[7642]: I0319 11:54:08.012155 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2"
Mar 19 11:54:08.150890 master-0 kubenswrapper[7642]: I0319 11:54:08.150511 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw"]
Mar 19 11:54:08.160821 master-0 kubenswrapper[7642]: W0319 11:54:08.160756 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c804ea0_3483_4506_b138_3bf30016d8d8.slice/crio-731fd6a0a7a380afb3c5fb06f8e5e559e768934564ab7fa113864f10ea901855 WatchSource:0}: Error finding container 731fd6a0a7a380afb3c5fb06f8e5e559e768934564ab7fa113864f10ea901855: Status 404 returned error can't find the container with id 731fd6a0a7a380afb3c5fb06f8e5e559e768934564ab7fa113864f10ea901855
Mar 19 11:54:08.208024 master-0 kubenswrapper[7642]: I0319 11:54:08.207907 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerStarted","Data":"731fd6a0a7a380afb3c5fb06f8e5e559e768934564ab7fa113864f10ea901855"}
Mar 19 11:54:08.208024 master-0 kubenswrapper[7642]: I0319 11:54:08.207955 7642 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 11:54:08.236975 master-0 kubenswrapper[7642]: I0319 11:54:08.236896 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-8487694857-52lv2"]
Mar 19 11:54:08.458449 master-0 kubenswrapper[7642]: I0319 11:54:08.458388 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f5df8899c-wt47m"]
Mar 19 11:54:08.459137 master-0 kubenswrapper[7642]: I0319 11:54:08.459111 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.461978 master-0 kubenswrapper[7642]: I0319 11:54:08.461918 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 11:54:08.462056 master-0 kubenswrapper[7642]: I0319 11:54:08.461917 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 11:54:08.462254 master-0 kubenswrapper[7642]: I0319 11:54:08.461926 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 11:54:08.462499 master-0 kubenswrapper[7642]: I0319 11:54:08.462332 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 11:54:08.462722 master-0 kubenswrapper[7642]: I0319 11:54:08.462692 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 11:54:08.463910 master-0 kubenswrapper[7642]: I0319 11:54:08.463850 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 11:54:08.469735 master-0 kubenswrapper[7642]: I0319 11:54:08.469695 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f5df8899c-wt47m"]
Mar 19 11:54:08.550055 master-0 kubenswrapper[7642]: I0319 11:54:08.549863 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.550055 master-0 kubenswrapper[7642]: I0319 11:54:08.549948 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.550055 master-0 kubenswrapper[7642]: I0319 11:54:08.550010 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.550055 master-0 kubenswrapper[7642]: I0319 11:54:08.550068 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hphv4\" (UniqueName: \"kubernetes.io/projected/3720eb7a-4be6-4e1e-b147-337179eace81-kube-api-access-hphv4\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.550466 master-0 kubenswrapper[7642]: I0319 11:54:08.550106 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.651464 master-0 kubenswrapper[7642]: I0319 11:54:08.651369 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.652281 master-0 kubenswrapper[7642]: E0319 11:54:08.651597 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:08.652281 master-0 kubenswrapper[7642]: E0319 11:54:08.651745 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca podName:3720eb7a-4be6-4e1e-b147-337179eace81 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:09.15171693 +0000 UTC m=+7.269157717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca") pod "controller-manager-f5df8899c-wt47m" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81") : configmap "client-ca" not found
Mar 19 11:54:08.652281 master-0 kubenswrapper[7642]: I0319 11:54:08.652109 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.652281 master-0 kubenswrapper[7642]: I0319 11:54:08.652149 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.652281 master-0 kubenswrapper[7642]: I0319 11:54:08.652260 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hphv4\" (UniqueName: \"kubernetes.io/projected/3720eb7a-4be6-4e1e-b147-337179eace81-kube-api-access-hphv4\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.652506 master-0 kubenswrapper[7642]: I0319 11:54:08.652329 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:08.652506 master-0 kubenswrapper[7642]: E0319 11:54:08.652438 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 19 11:54:08.652506 master-0 kubenswrapper[7642]: E0319 11:54:08.652461 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config podName:3720eb7a-4be6-4e1e-b147-337179eace81 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:09.152454341 +0000 UTC m=+7.269895128 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config") pod "controller-manager-f5df8899c-wt47m" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81") : configmap "config" not found
Mar 19 11:54:08.652506 master-0 kubenswrapper[7642]: E0319 11:54:08.652491 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found
Mar 19 11:54:08.652506 master-0 kubenswrapper[7642]: E0319 11:54:08.652507 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles podName:3720eb7a-4be6-4e1e-b147-337179eace81 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:09.152502022 +0000 UTC m=+7.269942799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles") pod "controller-manager-f5df8899c-wt47m" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81") : configmap "openshift-global-ca" not found
Mar 19 11:54:08.652718 master-0 kubenswrapper[7642]: E0319 11:54:08.652570 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:08.652718 master-0 kubenswrapper[7642]: E0319 11:54:08.652590 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert podName:3720eb7a-4be6-4e1e-b147-337179eace81 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:09.152585064 +0000 UTC m=+7.270025851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert") pod "controller-manager-f5df8899c-wt47m" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81") : secret "serving-cert" not found
Mar 19 11:54:08.677453 master-0 kubenswrapper[7642]: I0319 11:54:08.677354 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hphv4\" (UniqueName: \"kubernetes.io/projected/3720eb7a-4be6-4e1e-b147-337179eace81-kube-api-access-hphv4\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: I0319 11:54:09.160789 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: I0319 11:54:09.160886 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: I0319 11:54:09.160911 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: I0319 11:54:09.161020 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: E0319 11:54:09.161172 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: E0319 11:54:09.161238 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config podName:3720eb7a-4be6-4e1e-b147-337179eace81 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.161215731 +0000 UTC m=+8.278656518 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config") pod "controller-manager-f5df8899c-wt47m" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81") : configmap "config" not found
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: E0319 11:54:09.161658 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: E0319 11:54:09.161686 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca podName:3720eb7a-4be6-4e1e-b147-337179eace81 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.161677784 +0000 UTC m=+8.279118571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca") pod "controller-manager-f5df8899c-wt47m" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81") : configmap "client-ca" not found
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: E0319 11:54:09.161751 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: E0319 11:54:09.161774 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert podName:3720eb7a-4be6-4e1e-b147-337179eace81 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.161767496 +0000 UTC m=+8.279208293 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert") pod "controller-manager-f5df8899c-wt47m" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81") : secret "serving-cert" not found
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: E0319 11:54:09.161828 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found
Mar 19 11:54:09.164101 master-0 kubenswrapper[7642]: E0319 11:54:09.161855 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles podName:3720eb7a-4be6-4e1e-b147-337179eace81 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.161841078 +0000 UTC m=+8.279281865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles") pod "controller-manager-f5df8899c-wt47m" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81") : configmap "openshift-global-ca" not found
Mar 19 11:54:09.213214 master-0 kubenswrapper[7642]: I0319 11:54:09.213158 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2" event={"ID":"4d790b42-2182-4b9c-8391-285e938715cc","Type":"ContainerStarted","Data":"6b5887fbc5e32ebc2f94544f4a59e3db7e9426484412d0d7e1662fd8761f75d2"}
Mar 19 11:54:09.560756 master-0 kubenswrapper[7642]: I0319 11:54:09.560690 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:54:09.566861 master-0 kubenswrapper[7642]: I0319 11:54:09.566829 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:54:09.695055 master-0 kubenswrapper[7642]: I0319 11:54:09.694975 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f5df8899c-wt47m"]
Mar 19 11:54:09.696052 master-0 kubenswrapper[7642]: E0319 11:54:09.695876 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m" podUID="3720eb7a-4be6-4e1e-b147-337179eace81"
Mar 19 11:54:09.702422 master-0 kubenswrapper[7642]: I0319 11:54:09.702370 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"]
Mar 19 11:54:09.703515 master-0 kubenswrapper[7642]: I0319 11:54:09.703497 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:09.707228 master-0 kubenswrapper[7642]: I0319 11:54:09.707194 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 11:54:09.707290 master-0 kubenswrapper[7642]: I0319 11:54:09.707239 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 11:54:09.707421 master-0 kubenswrapper[7642]: I0319 11:54:09.707365 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 11:54:09.707697 master-0 kubenswrapper[7642]: I0319 11:54:09.707666 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 11:54:09.710749 master-0 kubenswrapper[7642]: I0319 11:54:09.710697 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 11:54:09.721400 master-0 kubenswrapper[7642]: I0319 11:54:09.718916 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"]
Mar 19 11:54:09.770030 master-0 kubenswrapper[7642]: I0319 11:54:09.769966 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:09.770030 master-0 kubenswrapper[7642]: I0319 11:54:09.770014 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8224\" (UniqueName: \"kubernetes.io/projected/9c3ceff4-e4f6-4149-b31b-bbce53d704be-kube-api-access-m8224\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:09.770283 master-0 kubenswrapper[7642]: I0319 11:54:09.770252 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:09.770570 master-0 kubenswrapper[7642]: I0319 11:54:09.770511 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-config\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:09.871781 master-0 kubenswrapper[7642]: I0319 11:54:09.871629 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8224\" (UniqueName: \"kubernetes.io/projected/9c3ceff4-e4f6-4149-b31b-bbce53d704be-kube-api-access-m8224\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:09.871781 master-0 kubenswrapper[7642]: I0319 11:54:09.871781 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:09.872392 master-0 kubenswrapper[7642]: E0319 11:54:09.872014 7642 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:09.872392 master-0 kubenswrapper[7642]: E0319 11:54:09.872112 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.372087595 +0000 UTC m=+8.489528382 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : secret "serving-cert" not found
Mar 19 11:54:09.872606 master-0 kubenswrapper[7642]: I0319 11:54:09.872543 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:09.872865 master-0 kubenswrapper[7642]: I0319 11:54:09.872805 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-config\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:09.873618 master-0 kubenswrapper[7642]: E0319 11:54:09.872687 7642 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:09.873900 master-0 kubenswrapper[7642]: E0319 11:54:09.873741 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:10.37369802 +0000 UTC m=+8.491138807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : configmap "client-ca" not found
Mar 19 11:54:09.874269 master-0 kubenswrapper[7642]: I0319 11:54:09.874216 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-config\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:09.890222 master-0 kubenswrapper[7642]: I0319 11:54:09.890144 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8224\" (UniqueName: \"kubernetes.io/projected/9c3ceff4-e4f6-4149-b31b-bbce53d704be-kube-api-access-m8224\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:10.186291 master-0 kubenswrapper[7642]: I0319 11:54:10.186245 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:10.186653 master-0 kubenswrapper[7642]: I0319 11:54:10.186583 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:10.186836 master-0 kubenswrapper[7642]: E0319 11:54:10.186781 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:10.186909 master-0 kubenswrapper[7642]: I0319 11:54:10.186862 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:10.186909 master-0 kubenswrapper[7642]: E0319 11:54:10.186891 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca podName:3720eb7a-4be6-4e1e-b147-337179eace81 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:12.186868386 +0000 UTC m=+10.304309173 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca") pod "controller-manager-f5df8899c-wt47m" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81") : configmap "client-ca" not found
Mar 19 11:54:10.187035 master-0 kubenswrapper[7642]: I0319 11:54:10.186928 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:10.187093 master-0 kubenswrapper[7642]: E0319 11:54:10.187079 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:10.187191 master-0 kubenswrapper[7642]: E0319 11:54:10.187165 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert podName:3720eb7a-4be6-4e1e-b147-337179eace81 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:12.187140773 +0000 UTC m=+10.304581730 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert") pod "controller-manager-f5df8899c-wt47m" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81") : secret "serving-cert" not found
Mar 19 11:54:10.187264 master-0 kubenswrapper[7642]: I0319 11:54:10.187228 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:10.188530 master-0 kubenswrapper[7642]: I0319 11:54:10.188485 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles\") pod \"controller-manager-f5df8899c-wt47m\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") " pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:10.221106 master-0 kubenswrapper[7642]: I0319 11:54:10.221046 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j6n2j" event={"ID":"64136215-8539-4349-977f-6d59deb132ea","Type":"ContainerStarted","Data":"eaa4320256fd1e5782890c1a2e18f3c707f378b1f8c9a5b0ab85a3c02fed3649"}
Mar 19 11:54:10.221207 master-0 kubenswrapper[7642]: I0319 11:54:10.221064 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:10.338981 master-0 kubenswrapper[7642]: I0319 11:54:10.338926 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m"
Mar 19 11:54:10.390599 master-0 kubenswrapper[7642]: I0319 11:54:10.390521 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:10.390937 master-0 kubenswrapper[7642]: E0319 11:54:10.390819 7642 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:10.390937 master-0 kubenswrapper[7642]: I0319 11:54:10.390876 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:10.390937 master-0 kubenswrapper[7642]: E0319 11:54:10.390931 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:11.390904133 +0000 UTC m=+9.508344990 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : configmap "client-ca" not found
Mar 19 11:54:10.391082 master-0 kubenswrapper[7642]: E0319 11:54:10.391045 7642 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:10.391157 master-0 kubenswrapper[7642]: E0319 11:54:10.391127 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:11.391103588 +0000 UTC m=+9.508544445 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : secret "serving-cert" not found
Mar 19 11:54:10.492755 master-0 kubenswrapper[7642]: I0319 11:54:10.492318 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles\") pod \"3720eb7a-4be6-4e1e-b147-337179eace81\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") "
Mar 19 11:54:10.492755 master-0 kubenswrapper[7642]: I0319 11:54:10.492393 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hphv4\" (UniqueName: \"kubernetes.io/projected/3720eb7a-4be6-4e1e-b147-337179eace81-kube-api-access-hphv4\") pod \"3720eb7a-4be6-4e1e-b147-337179eace81\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") "
Mar 19 11:54:10.492755 master-0 kubenswrapper[7642]: I0319 11:54:10.492461 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config\") pod \"3720eb7a-4be6-4e1e-b147-337179eace81\" (UID: \"3720eb7a-4be6-4e1e-b147-337179eace81\") "
Mar 19 11:54:10.493224 master-0 kubenswrapper[7642]: I0319 11:54:10.492828 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3720eb7a-4be6-4e1e-b147-337179eace81" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:10.493224 master-0 kubenswrapper[7642]: I0319 11:54:10.492973 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config" (OuterVolumeSpecName: "config") pod "3720eb7a-4be6-4e1e-b147-337179eace81" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:10.493224 master-0 kubenswrapper[7642]: I0319 11:54:10.493007 7642 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:10.497412 master-0 kubenswrapper[7642]: I0319 11:54:10.497101 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3720eb7a-4be6-4e1e-b147-337179eace81-kube-api-access-hphv4" (OuterVolumeSpecName: "kube-api-access-hphv4") pod "3720eb7a-4be6-4e1e-b147-337179eace81" (UID: "3720eb7a-4be6-4e1e-b147-337179eace81"). InnerVolumeSpecName "kube-api-access-hphv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:54:10.594962 master-0 kubenswrapper[7642]: I0319 11:54:10.594386 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hphv4\" (UniqueName: \"kubernetes.io/projected/3720eb7a-4be6-4e1e-b147-337179eace81-kube-api-access-hphv4\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:10.595116 master-0 kubenswrapper[7642]: I0319 11:54:10.594977 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:10.797933 master-0 kubenswrapper[7642]: I0319 11:54:10.797747 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:10.798928 master-0 kubenswrapper[7642]: I0319 11:54:10.798048 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:10.798928 master-0 kubenswrapper[7642]: E0319 11:54:10.798235 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 11:54:10.798928 master-0 kubenswrapper[7642]: E0319 11:54:10.798322 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed.
No retries permitted until 2026-03-19 11:54:18.798293479 +0000 UTC m=+16.915734276 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found Mar 19 11:54:10.798928 master-0 kubenswrapper[7642]: E0319 11:54:10.798842 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 19 11:54:10.798928 master-0 kubenswrapper[7642]: E0319 11:54:10.798877 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.798865895 +0000 UTC m=+16.916306702 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found Mar 19 11:54:10.898961 master-0 kubenswrapper[7642]: I0319 11:54:10.898867 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:54:10.898961 master-0 kubenswrapper[7642]: I0319 11:54:10.898946 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod 
\"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:54:10.899320 master-0 kubenswrapper[7642]: I0319 11:54:10.899006 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:10.899320 master-0 kubenswrapper[7642]: I0319 11:54:10.899048 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:54:10.899320 master-0 kubenswrapper[7642]: I0319 11:54:10.899094 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:10.899320 master-0 kubenswrapper[7642]: I0319 11:54:10.899123 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 11:54:10.899320 master-0 kubenswrapper[7642]: I0319 11:54:10.899152 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:10.899320 master-0 kubenswrapper[7642]: I0319 11:54:10.899214 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 11:54:10.899320 master-0 kubenswrapper[7642]: I0319 11:54:10.899243 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" Mar 19 11:54:10.899320 master-0 kubenswrapper[7642]: I0319 11:54:10.899268 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 11:54:10.899320 master-0 kubenswrapper[7642]: I0319 11:54:10.899292 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 11:54:10.899320 master-0 kubenswrapper[7642]: I0319 11:54:10.899320 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:10.899743 master-0 kubenswrapper[7642]: I0319 11:54:10.899356 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:54:10.899743 master-0 kubenswrapper[7642]: E0319 11:54:10.899512 7642 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:54:10.899743 master-0 kubenswrapper[7642]: E0319 11:54:10.899586 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.899561477 +0000 UTC m=+17.017002264 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found Mar 19 11:54:10.899743 master-0 kubenswrapper[7642]: E0319 11:54:10.899673 7642 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:54:10.899743 master-0 kubenswrapper[7642]: E0319 11:54:10.899701 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.899692521 +0000 UTC m=+17.017133308 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:54:10.899743 master-0 kubenswrapper[7642]: E0319 11:54:10.899746 7642 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:54:10.899936 master-0 kubenswrapper[7642]: E0319 11:54:10.899775 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.899765923 +0000 UTC m=+17.017206710 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found Mar 19 11:54:10.900185 master-0 kubenswrapper[7642]: E0319 11:54:10.900114 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:54:10.900303 master-0 kubenswrapper[7642]: E0319 11:54:10.900279 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.900221765 +0000 UTC m=+17.017662552 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found Mar 19 11:54:10.900344 master-0 kubenswrapper[7642]: E0319 11:54:10.900276 7642 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:54:10.900401 master-0 kubenswrapper[7642]: E0319 11:54:10.900352 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 19 11:54:10.900439 master-0 kubenswrapper[7642]: E0319 11:54:10.900403 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. 
No retries permitted until 2026-03-19 11:54:18.90037693 +0000 UTC m=+17.017817717 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found Mar 19 11:54:10.900439 master-0 kubenswrapper[7642]: E0319 11:54:10.900422 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 19 11:54:10.900498 master-0 kubenswrapper[7642]: E0319 11:54:10.900477 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.900446862 +0000 UTC m=+17.017887819 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found Mar 19 11:54:10.900535 master-0 kubenswrapper[7642]: E0319 11:54:10.900503 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.900493943 +0000 UTC m=+17.017934950 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found Mar 19 11:54:10.900535 master-0 kubenswrapper[7642]: E0319 11:54:10.900513 7642 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:54:10.900594 master-0 kubenswrapper[7642]: E0319 11:54:10.900555 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.900543714 +0000 UTC m=+17.017984501 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : secret "metrics-daemon-secret" not found Mar 19 11:54:10.900659 master-0 kubenswrapper[7642]: E0319 11:54:10.900624 7642 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:54:10.900700 master-0 kubenswrapper[7642]: E0319 11:54:10.900678 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.900670228 +0000 UTC m=+17.018111015 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:54:10.900782 master-0 kubenswrapper[7642]: E0319 11:54:10.900747 7642 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 19 11:54:10.900819 master-0 kubenswrapper[7642]: E0319 11:54:10.900794 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.900782631 +0000 UTC m=+17.018223598 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found Mar 19 11:54:10.900855 master-0 kubenswrapper[7642]: E0319 11:54:10.900833 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:54:10.900887 master-0 kubenswrapper[7642]: E0319 11:54:10.900860 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.900853603 +0000 UTC m=+17.018294390 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:54:10.900921 master-0 kubenswrapper[7642]: E0319 11:54:10.900911 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 19 11:54:10.900958 master-0 kubenswrapper[7642]: E0319 11:54:10.900938 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.900929425 +0000 UTC m=+17.018370202 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found Mar 19 11:54:10.900992 master-0 kubenswrapper[7642]: E0319 11:54:10.900982 7642 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:54:10.901035 master-0 kubenswrapper[7642]: E0319 11:54:10.901018 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:18.901005127 +0000 UTC m=+17.018445914 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found Mar 19 11:54:11.145408 master-0 kubenswrapper[7642]: I0319 11:54:11.143590 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:11.145408 master-0 kubenswrapper[7642]: I0319 11:54:11.143780 7642 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:54:11.145408 master-0 kubenswrapper[7642]: I0319 11:54:11.143792 7642 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:54:11.161427 master-0 kubenswrapper[7642]: I0319 11:54:11.161386 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:11.226623 master-0 kubenswrapper[7642]: I0319 11:54:11.226555 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerStarted","Data":"53038deea78787ba444b0934c52753748ca5659dc5194e58a710a65971f4c475"} Mar 19 11:54:11.229100 master-0 kubenswrapper[7642]: I0319 11:54:11.229032 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2" event={"ID":"4d790b42-2182-4b9c-8391-285e938715cc","Type":"ContainerStarted","Data":"7d607bc604afb38bbd70378afd53bf1d6787e2bd2b574ea5a541d63fd9c62121"} Mar 19 11:54:11.229100 master-0 kubenswrapper[7642]: I0319 11:54:11.229076 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f5df8899c-wt47m" Mar 19 11:54:11.229100 master-0 kubenswrapper[7642]: I0319 11:54:11.229101 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2" event={"ID":"4d790b42-2182-4b9c-8391-285e938715cc","Type":"ContainerStarted","Data":"c0cb9d988f8fe88f88f11a1292927add52343057f8dfc3f622fbc5e70249c9ae"} Mar 19 11:54:11.229412 master-0 kubenswrapper[7642]: I0319 11:54:11.229152 7642 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:54:11.242891 master-0 kubenswrapper[7642]: I0319 11:54:11.242809 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" podStartSLOduration=2.224150622 podStartE2EDuration="4.242790231s" podCreationTimestamp="2026-03-19 11:54:07 +0000 UTC" firstStartedPulling="2026-03-19 11:54:08.162751394 +0000 UTC m=+6.280192181" lastFinishedPulling="2026-03-19 11:54:10.181391003 +0000 UTC m=+8.298831790" observedRunningTime="2026-03-19 11:54:11.240782805 +0000 UTC m=+9.358223602" watchObservedRunningTime="2026-03-19 11:54:11.242790231 +0000 UTC m=+9.360231018" Mar 19 11:54:11.262663 master-0 kubenswrapper[7642]: I0319 11:54:11.262555 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2" podStartSLOduration=2.336786449 podStartE2EDuration="4.262515032s" podCreationTimestamp="2026-03-19 11:54:07 +0000 UTC" firstStartedPulling="2026-03-19 11:54:08.250194377 +0000 UTC m=+6.367635164" lastFinishedPulling="2026-03-19 11:54:10.17592296 +0000 UTC m=+8.293363747" observedRunningTime="2026-03-19 11:54:11.2617231 +0000 UTC m=+9.379163907" watchObservedRunningTime="2026-03-19 11:54:11.262515032 +0000 UTC m=+9.379955819" Mar 19 11:54:11.279104 master-0 kubenswrapper[7642]: I0319 11:54:11.279046 7642 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:11.306226 master-0 kubenswrapper[7642]: I0319 11:54:11.306160 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 11:54:11.317754 master-0 kubenswrapper[7642]: I0319 11:54:11.317698 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5576c989bb-kw2h6"] Mar 19 11:54:11.318357 master-0 kubenswrapper[7642]: I0319 11:54:11.318183 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f5df8899c-wt47m"] Mar 19 11:54:11.318357 master-0 kubenswrapper[7642]: I0319 11:54:11.318280 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.322623 master-0 kubenswrapper[7642]: I0319 11:54:11.322578 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 11:54:11.323760 master-0 kubenswrapper[7642]: I0319 11:54:11.322743 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 11:54:11.323760 master-0 kubenswrapper[7642]: I0319 11:54:11.323174 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 11:54:11.323760 master-0 kubenswrapper[7642]: I0319 11:54:11.323531 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 11:54:11.323760 master-0 kubenswrapper[7642]: I0319 11:54:11.323665 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 11:54:11.326674 master-0 kubenswrapper[7642]: I0319 11:54:11.326615 7642 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:11.326843 master-0 kubenswrapper[7642]: I0319 11:54:11.326761 7642 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 11:54:11.328393 master-0 kubenswrapper[7642]: I0319 11:54:11.328354 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f5df8899c-wt47m"] Mar 19 11:54:11.329964 master-0 kubenswrapper[7642]: I0319 11:54:11.329936 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 11:54:11.332489 master-0 kubenswrapper[7642]: I0319 11:54:11.332348 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:54:11.332489 master-0 kubenswrapper[7642]: I0319 11:54:11.332394 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5576c989bb-kw2h6"] Mar 19 11:54:11.406646 master-0 kubenswrapper[7642]: I0319 11:54:11.406568 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwxww\" (UniqueName: \"kubernetes.io/projected/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-kube-api-access-wwxww\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.406874 master-0 kubenswrapper[7642]: I0319 11:54:11.406710 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-config\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.406874 
master-0 kubenswrapper[7642]: I0319 11:54:11.406747 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-proxy-ca-bundles\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.406874 master-0 kubenswrapper[7642]: I0319 11:54:11.406784 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.407063 master-0 kubenswrapper[7642]: I0319 11:54:11.407003 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp" Mar 19 11:54:11.407145 master-0 kubenswrapper[7642]: I0319 11:54:11.407105 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.407377 master-0 kubenswrapper[7642]: E0319 11:54:11.407346 7642 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:54:11.407524 master-0 
kubenswrapper[7642]: I0319 11:54:11.407490 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp" Mar 19 11:54:11.407571 master-0 kubenswrapper[7642]: E0319 11:54:11.407507 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:13.407483091 +0000 UTC m=+11.524923878 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : secret "serving-cert" not found Mar 19 11:54:11.407613 master-0 kubenswrapper[7642]: I0319 11:54:11.407572 7642 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3720eb7a-4be6-4e1e-b147-337179eace81-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:11.407613 master-0 kubenswrapper[7642]: I0319 11:54:11.407589 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3720eb7a-4be6-4e1e-b147-337179eace81-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:11.407721 master-0 kubenswrapper[7642]: E0319 11:54:11.407697 7642 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 19 11:54:11.407777 master-0 kubenswrapper[7642]: E0319 11:54:11.407759 7642 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:13.407741948 +0000 UTC m=+11.525182735 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : configmap "client-ca" not found Mar 19 11:54:11.475002 master-0 kubenswrapper[7642]: I0319 11:54:11.474946 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5576c989bb-kw2h6"] Mar 19 11:54:11.475309 master-0 kubenswrapper[7642]: E0319 11:54:11.475273 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-wwxww proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" podUID="b5abcb72-3bf1-4263-9cf1-69d2c19a575c" Mar 19 11:54:11.509025 master-0 kubenswrapper[7642]: I0319 11:54:11.508965 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-proxy-ca-bundles\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.509351 master-0 kubenswrapper[7642]: I0319 11:54:11.509333 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " 
pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.509501 master-0 kubenswrapper[7642]: I0319 11:54:11.509486 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.509566 master-0 kubenswrapper[7642]: E0319 11:54:11.509518 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:54:11.509716 master-0 kubenswrapper[7642]: E0319 11:54:11.509667 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 11:54:11.509771 master-0 kubenswrapper[7642]: E0319 11:54:11.509689 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert podName:b5abcb72-3bf1-4263-9cf1-69d2c19a575c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:12.009663854 +0000 UTC m=+10.127104641 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert") pod "controller-manager-5576c989bb-kw2h6" (UID: "b5abcb72-3bf1-4263-9cf1-69d2c19a575c") : secret "serving-cert" not found Mar 19 11:54:11.509900 master-0 kubenswrapper[7642]: E0319 11:54:11.509874 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca podName:b5abcb72-3bf1-4263-9cf1-69d2c19a575c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:12.009848169 +0000 UTC m=+10.127288956 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca") pod "controller-manager-5576c989bb-kw2h6" (UID: "b5abcb72-3bf1-4263-9cf1-69d2c19a575c") : configmap "client-ca" not found Mar 19 11:54:11.510012 master-0 kubenswrapper[7642]: I0319 11:54:11.509980 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwxww\" (UniqueName: \"kubernetes.io/projected/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-kube-api-access-wwxww\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.510206 master-0 kubenswrapper[7642]: I0319 11:54:11.510178 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-config\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.510984 master-0 kubenswrapper[7642]: I0319 11:54:11.510932 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-proxy-ca-bundles\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.511155 master-0 kubenswrapper[7642]: I0319 11:54:11.511135 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-config\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:11.532832 
master-0 kubenswrapper[7642]: I0319 11:54:11.532759 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwxww\" (UniqueName: \"kubernetes.io/projected/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-kube-api-access-wwxww\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6"
Mar 19 11:54:12.017358 master-0 kubenswrapper[7642]: I0319 11:54:12.017297 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6"
Mar 19 11:54:12.018178 master-0 kubenswrapper[7642]: E0319 11:54:12.017497 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:12.018307 master-0 kubenswrapper[7642]: I0319 11:54:12.018269 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6"
Mar 19 11:54:12.018397 master-0 kubenswrapper[7642]: E0319 11:54:12.018309 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:12.018511 master-0 kubenswrapper[7642]: E0319 11:54:12.018376 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert podName:b5abcb72-3bf1-4263-9cf1-69d2c19a575c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:13.018337069 +0000 UTC m=+11.135777856 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert") pod "controller-manager-5576c989bb-kw2h6" (UID: "b5abcb72-3bf1-4263-9cf1-69d2c19a575c") : secret "serving-cert" not found
Mar 19 11:54:12.018806 master-0 kubenswrapper[7642]: E0319 11:54:12.018730 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca podName:b5abcb72-3bf1-4263-9cf1-69d2c19a575c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:13.018667168 +0000 UTC m=+11.136107955 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca") pod "controller-manager-5576c989bb-kw2h6" (UID: "b5abcb72-3bf1-4263-9cf1-69d2c19a575c") : configmap "client-ca" not found
Mar 19 11:54:12.079337 master-0 kubenswrapper[7642]: I0319 11:54:12.079227 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3720eb7a-4be6-4e1e-b147-337179eace81" path="/var/lib/kubelet/pods/3720eb7a-4be6-4e1e-b147-337179eace81/volumes"
Mar 19 11:54:12.234052 master-0 kubenswrapper[7642]: I0319 11:54:12.232927 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6"
Mar 19 11:54:12.245519 master-0 kubenswrapper[7642]: I0319 11:54:12.245448 7642 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:12.321817 master-0 kubenswrapper[7642]: I0319 11:54:12.321609 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwxww\" (UniqueName: \"kubernetes.io/projected/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-kube-api-access-wwxww\") pod \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " Mar 19 11:54:12.321817 master-0 kubenswrapper[7642]: I0319 11:54:12.321681 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-proxy-ca-bundles\") pod \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " Mar 19 11:54:12.321817 master-0 kubenswrapper[7642]: I0319 11:54:12.321733 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-config\") pod \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " Mar 19 11:54:12.322875 master-0 kubenswrapper[7642]: I0319 11:54:12.322816 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b5abcb72-3bf1-4263-9cf1-69d2c19a575c" (UID: "b5abcb72-3bf1-4263-9cf1-69d2c19a575c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:12.323730 master-0 kubenswrapper[7642]: I0319 11:54:12.323695 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-config" (OuterVolumeSpecName: "config") pod "b5abcb72-3bf1-4263-9cf1-69d2c19a575c" (UID: "b5abcb72-3bf1-4263-9cf1-69d2c19a575c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:12.325938 master-0 kubenswrapper[7642]: I0319 11:54:12.325862 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-kube-api-access-wwxww" (OuterVolumeSpecName: "kube-api-access-wwxww") pod "b5abcb72-3bf1-4263-9cf1-69d2c19a575c" (UID: "b5abcb72-3bf1-4263-9cf1-69d2c19a575c"). InnerVolumeSpecName "kube-api-access-wwxww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:12.423543 master-0 kubenswrapper[7642]: I0319 11:54:12.423450 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwxww\" (UniqueName: \"kubernetes.io/projected/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-kube-api-access-wwxww\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:12.423543 master-0 kubenswrapper[7642]: I0319 11:54:12.423495 7642 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:12.423543 master-0 kubenswrapper[7642]: I0319 11:54:12.423505 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:13.030849 master-0 kubenswrapper[7642]: I0319 11:54:13.030758 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:13.031503 master-0 kubenswrapper[7642]: I0319 11:54:13.030911 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca\") pod \"controller-manager-5576c989bb-kw2h6\" (UID: \"b5abcb72-3bf1-4263-9cf1-69d2c19a575c\") " pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:13.031503 master-0 kubenswrapper[7642]: E0319 11:54:13.031183 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 11:54:13.031503 master-0 kubenswrapper[7642]: E0319 11:54:13.031263 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca podName:b5abcb72-3bf1-4263-9cf1-69d2c19a575c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:15.031240694 +0000 UTC m=+13.148681481 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca") pod "controller-manager-5576c989bb-kw2h6" (UID: "b5abcb72-3bf1-4263-9cf1-69d2c19a575c") : configmap "client-ca" not found Mar 19 11:54:13.031503 master-0 kubenswrapper[7642]: E0319 11:54:13.031302 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:54:13.031503 master-0 kubenswrapper[7642]: E0319 11:54:13.031358 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert podName:b5abcb72-3bf1-4263-9cf1-69d2c19a575c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:15.031341227 +0000 UTC m=+13.148782014 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert") pod "controller-manager-5576c989bb-kw2h6" (UID: "b5abcb72-3bf1-4263-9cf1-69d2c19a575c") : secret "serving-cert" not found Mar 19 11:54:13.210549 master-0 kubenswrapper[7642]: I0319 11:54:13.210459 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:54:13.236402 master-0 kubenswrapper[7642]: I0319 11:54:13.236340 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5576c989bb-kw2h6" Mar 19 11:54:13.438225 master-0 kubenswrapper[7642]: I0319 11:54:13.438164 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp" Mar 19 11:54:13.438461 master-0 kubenswrapper[7642]: E0319 11:54:13.438391 7642 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:54:13.438461 master-0 kubenswrapper[7642]: I0319 11:54:13.438436 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp" Mar 19 11:54:13.438587 master-0 kubenswrapper[7642]: E0319 11:54:13.438473 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert 
podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:17.438447206 +0000 UTC m=+15.555887993 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : secret "serving-cert" not found Mar 19 11:54:13.438587 master-0 kubenswrapper[7642]: E0319 11:54:13.438532 7642 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 19 11:54:13.438587 master-0 kubenswrapper[7642]: E0319 11:54:13.438581 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:17.43857055 +0000 UTC m=+15.556011337 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : configmap "client-ca" not found Mar 19 11:54:13.726167 master-0 kubenswrapper[7642]: I0319 11:54:13.725947 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"] Mar 19 11:54:13.726536 master-0 kubenswrapper[7642]: I0319 11:54:13.726480 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.734154 master-0 kubenswrapper[7642]: I0319 11:54:13.731263 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 11:54:13.734154 master-0 kubenswrapper[7642]: I0319 11:54:13.731469 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 11:54:13.734154 master-0 kubenswrapper[7642]: I0319 11:54:13.732404 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 11:54:13.734154 master-0 kubenswrapper[7642]: I0319 11:54:13.732513 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 11:54:13.734154 master-0 kubenswrapper[7642]: I0319 11:54:13.732586 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 11:54:13.746161 master-0 kubenswrapper[7642]: I0319 11:54:13.745214 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 11:54:13.758990 master-0 kubenswrapper[7642]: I0319 11:54:13.758915 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5576c989bb-kw2h6"] Mar 19 11:54:13.760052 master-0 kubenswrapper[7642]: I0319 11:54:13.759995 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"] Mar 19 11:54:13.776983 master-0 kubenswrapper[7642]: I0319 11:54:13.776911 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5576c989bb-kw2h6"] Mar 19 11:54:13.844249 master-0 kubenswrapper[7642]: I0319 11:54:13.844149 7642 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.844463 master-0 kubenswrapper[7642]: I0319 11:54:13.844275 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.844463 master-0 kubenswrapper[7642]: I0319 11:54:13.844324 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h7dc\" (UniqueName: \"kubernetes.io/projected/ee60c527-0fff-43d5-825d-230971422b38-kube-api-access-8h7dc\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.844463 master-0 kubenswrapper[7642]: I0319 11:54:13.844362 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-config\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.844463 master-0 kubenswrapper[7642]: I0319 11:54:13.844388 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-proxy-ca-bundles\") pod 
\"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.844600 master-0 kubenswrapper[7642]: I0319 11:54:13.844475 7642 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:13.844600 master-0 kubenswrapper[7642]: I0319 11:54:13.844490 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5abcb72-3bf1-4263-9cf1-69d2c19a575c-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:13.946249 master-0 kubenswrapper[7642]: I0319 11:54:13.946181 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.946758 master-0 kubenswrapper[7642]: E0319 11:54:13.946371 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 19 11:54:13.946869 master-0 kubenswrapper[7642]: E0319 11:54:13.946841 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:14.446811642 +0000 UTC m=+12.564252429 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : configmap "client-ca" not found Mar 19 11:54:13.947039 master-0 kubenswrapper[7642]: I0319 11:54:13.947005 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.947391 master-0 kubenswrapper[7642]: E0319 11:54:13.947333 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:54:13.947477 master-0 kubenswrapper[7642]: E0319 11:54:13.947466 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:14.447431099 +0000 UTC m=+12.564872086 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : secret "serving-cert" not found Mar 19 11:54:13.947541 master-0 kubenswrapper[7642]: I0319 11:54:13.947338 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h7dc\" (UniqueName: \"kubernetes.io/projected/ee60c527-0fff-43d5-825d-230971422b38-kube-api-access-8h7dc\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.947851 master-0 kubenswrapper[7642]: I0319 11:54:13.947807 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-config\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.947932 master-0 kubenswrapper[7642]: I0319 11:54:13.947877 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-proxy-ca-bundles\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:13.949381 master-0 kubenswrapper[7642]: I0319 11:54:13.949312 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-proxy-ca-bundles\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " 
pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:13.949800 master-0 kubenswrapper[7642]: I0319 11:54:13.949751 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-config\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:13.967224 master-0 kubenswrapper[7642]: I0319 11:54:13.967136 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h7dc\" (UniqueName: \"kubernetes.io/projected/ee60c527-0fff-43d5-825d-230971422b38-kube-api-access-8h7dc\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:14.074759 master-0 kubenswrapper[7642]: I0319 11:54:14.074568 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5abcb72-3bf1-4263-9cf1-69d2c19a575c" path="/var/lib/kubelet/pods/b5abcb72-3bf1-4263-9cf1-69d2c19a575c/volumes"
Mar 19 11:54:14.075486 master-0 kubenswrapper[7642]: I0319 11:54:14.075007 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:54:14.080655 master-0 kubenswrapper[7642]: I0319 11:54:14.080569 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:54:14.455621 master-0 kubenswrapper[7642]: I0319 11:54:14.455414 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:14.456137 master-0 kubenswrapper[7642]: E0319 11:54:14.455727 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:14.456137 master-0 kubenswrapper[7642]: E0319 11:54:14.455875 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:15.455840787 +0000 UTC m=+13.573281624 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : configmap "client-ca" not found
Mar 19 11:54:14.456137 master-0 kubenswrapper[7642]: I0319 11:54:14.455977 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:14.456381 master-0 kubenswrapper[7642]: E0319 11:54:14.456250 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:14.456381 master-0 kubenswrapper[7642]: E0319 11:54:14.456338 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:15.45630877 +0000 UTC m=+13.573749597 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : secret "serving-cert" not found
Mar 19 11:54:15.471475 master-0 kubenswrapper[7642]: I0319 11:54:15.471353 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:15.472532 master-0 kubenswrapper[7642]: E0319 11:54:15.471606 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:15.472532 master-0 kubenswrapper[7642]: E0319 11:54:15.471750 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:17.471718676 +0000 UTC m=+15.589159613 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : configmap "client-ca" not found
Mar 19 11:54:15.472532 master-0 kubenswrapper[7642]: I0319 11:54:15.471896 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:15.472532 master-0 kubenswrapper[7642]: E0319 11:54:15.472261 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:15.472532 master-0 kubenswrapper[7642]: E0319 11:54:15.472301 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:17.472290682 +0000 UTC m=+15.589731639 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : secret "serving-cert" not found
Mar 19 11:54:17.497612 master-0 kubenswrapper[7642]: I0319 11:54:17.497499 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:17.498614 master-0 kubenswrapper[7642]: E0319 11:54:17.497776 7642 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:17.498614 master-0 kubenswrapper[7642]: E0319 11:54:17.497966 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:25.497891077 +0000 UTC m=+23.615331904 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : configmap "client-ca" not found
Mar 19 11:54:17.498946 master-0 kubenswrapper[7642]: I0319 11:54:17.498626 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:17.498946 master-0 kubenswrapper[7642]: I0319 11:54:17.498910 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:17.499091 master-0 kubenswrapper[7642]: I0319 11:54:17.499018 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:17.499274 master-0 kubenswrapper[7642]: E0319 11:54:17.499225 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:17.499355 master-0 kubenswrapper[7642]: E0319 11:54:17.499293 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:21.499272176 +0000 UTC m=+19.616713003 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : configmap "client-ca" not found
Mar 19 11:54:17.499425 master-0 kubenswrapper[7642]: E0319 11:54:17.499408 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:17.499518 master-0 kubenswrapper[7642]: E0319 11:54:17.499448 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:21.499435571 +0000 UTC m=+19.616876398 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : secret "serving-cert" not found
Mar 19 11:54:17.499518 master-0 kubenswrapper[7642]: E0319 11:54:17.499517 7642 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:17.499694 master-0 kubenswrapper[7642]: E0319 11:54:17.499555 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:25.499543744 +0000 UTC m=+23.616984571 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : secret "serving-cert" not found
Mar 19 11:54:18.262562 master-0 kubenswrapper[7642]: I0319 11:54:18.262275 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerStarted","Data":"b9318bc6665c43cc333e2980698d6c509e0e61367154c5acfc00fbbe7da6a9de"}
Mar 19 11:54:18.572459 master-0 kubenswrapper[7642]: I0319 11:54:18.572326 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-rrcj6"]
Mar 19 11:54:18.573367 master-0 kubenswrapper[7642]: I0319 11:54:18.573341 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.577143 master-0 kubenswrapper[7642]: I0319 11:54:18.577119 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 11:54:18.577586 master-0 kubenswrapper[7642]: I0319 11:54:18.577571 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0"
Mar 19 11:54:18.577743 master-0 kubenswrapper[7642]: I0319 11:54:18.577721 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 11:54:18.578019 master-0 kubenswrapper[7642]: I0319 11:54:18.577999 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 11:54:18.578207 master-0 kubenswrapper[7642]: I0319 11:54:18.578026 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 11:54:18.578314 master-0 kubenswrapper[7642]: I0319 11:54:18.578300 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 11:54:18.578578 master-0 kubenswrapper[7642]: I0319 11:54:18.578566 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0"
Mar 19 11:54:18.579758 master-0 kubenswrapper[7642]: I0319 11:54:18.579740 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 11:54:18.583713 master-0 kubenswrapper[7642]: I0319 11:54:18.583698 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 11:54:18.587724 master-0 kubenswrapper[7642]: I0319 11:54:18.587705 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 11:54:18.591693 master-0 kubenswrapper[7642]: I0319 11:54:18.591667 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-rrcj6"]
Mar 19 11:54:18.717909 master-0 kubenswrapper[7642]: I0319 11:54:18.717807 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-encryption-config\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.718362 master-0 kubenswrapper[7642]: I0319 11:54:18.718331 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.718606 master-0 kubenswrapper[7642]: I0319 11:54:18.718580 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxr5\" (UniqueName: \"kubernetes.io/projected/6317d697-61f6-4383-9362-7bd1404e19f3-kube-api-access-fnxr5\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.718810 master-0 kubenswrapper[7642]: I0319 11:54:18.718789 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-node-pullsecrets\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.718966 master-0 kubenswrapper[7642]: I0319 11:54:18.718913 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.719027 master-0 kubenswrapper[7642]: I0319 11:54:18.718974 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.719075 master-0 kubenswrapper[7642]: I0319 11:54:18.719067 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-config\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.719134 master-0 kubenswrapper[7642]: I0319 11:54:18.719111 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-image-import-ca\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.719181 master-0 kubenswrapper[7642]: I0319 11:54:18.719171 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-trusted-ca-bundle\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.719219 master-0 kubenswrapper[7642]: I0319 11:54:18.719196 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-audit-dir\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.719264 master-0 kubenswrapper[7642]: I0319 11:54:18.719227 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.820733 master-0 kubenswrapper[7642]: I0319 11:54:18.820648 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnxr5\" (UniqueName: \"kubernetes.io/projected/6317d697-61f6-4383-9362-7bd1404e19f3-kube-api-access-fnxr5\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.820733 master-0 kubenswrapper[7642]: I0319 11:54:18.820713 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.820733 master-0 kubenswrapper[7642]: I0319 11:54:18.820747 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-node-pullsecrets\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.820733 master-0 kubenswrapper[7642]: I0319 11:54:18.820768 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.821174 master-0 kubenswrapper[7642]: I0319 11:54:18.820851 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-config\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.821174 master-0 kubenswrapper[7642]: I0319 11:54:18.820877 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-image-import-ca\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.821174 master-0 kubenswrapper[7642]: I0319 11:54:18.820919 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-trusted-ca-bundle\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.821174 master-0 kubenswrapper[7642]: I0319 11:54:18.820947 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-audit-dir\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.821174 master-0 kubenswrapper[7642]: I0319 11:54:18.820968 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:18.821174 master-0 kubenswrapper[7642]: I0319 11:54:18.820988 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.821174 master-0 kubenswrapper[7642]: I0319 11:54:18.821014 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-encryption-config\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.821174 master-0 kubenswrapper[7642]: I0319 11:54:18.821034 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.821174 master-0 kubenswrapper[7642]: I0319 11:54:18.821057 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:18.821412 master-0 kubenswrapper[7642]: E0319 11:54:18.821229 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 19 11:54:18.821412 master-0 kubenswrapper[7642]: E0319 11:54:18.821288 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert podName:6e8c4676-05af-4478-a44b-1bdcbefdea0f nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.821273123 +0000 UTC m=+32.938713910 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert") pod "catalog-operator-68f85b4d6c-qldxf" (UID: "6e8c4676-05af-4478-a44b-1bdcbefdea0f") : secret "catalog-operator-serving-cert" not found
Mar 19 11:54:18.821785 master-0 kubenswrapper[7642]: E0319 11:54:18.821746 7642 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 19 11:54:18.821926 master-0 kubenswrapper[7642]: I0319 11:54:18.821887 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-audit-dir\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.822030 master-0 kubenswrapper[7642]: E0319 11:54:18.822017 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:19.321991893 +0000 UTC m=+17.439432680 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : secret "etcd-client" not found
Mar 19 11:54:18.822099 master-0 kubenswrapper[7642]: E0319 11:54:18.822017 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 19 11:54:18.822201 master-0 kubenswrapper[7642]: E0319 11:54:18.822186 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert podName:5b46c6b4-f701-4a7f-bf74-90afe14ad89a nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.822173378 +0000 UTC m=+32.939614345 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert") pod "olm-operator-5c9796789-jsp5b" (UID: "5b46c6b4-f701-4a7f-bf74-90afe14ad89a") : secret "olm-operator-serving-cert" not found
Mar 19 11:54:18.822288 master-0 kubenswrapper[7642]: E0319 11:54:18.822255 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 19 11:54:18.822984 master-0 kubenswrapper[7642]: E0319 11:54:18.822413 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:19.322400925 +0000 UTC m=+17.439841892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : configmap "etcd-serving-ca" not found
Mar 19 11:54:18.823092 master-0 kubenswrapper[7642]: E0319 11:54:18.822124 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 19 11:54:18.823221 master-0 kubenswrapper[7642]: E0319 11:54:18.823198 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:19.323188237 +0000 UTC m=+17.440629024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : configmap "audit-0" not found
Mar 19 11:54:18.823332 master-0 kubenswrapper[7642]: E0319 11:54:18.822861 7642 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 11:54:18.823427 master-0 kubenswrapper[7642]: E0319 11:54:18.823417 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:19.323409673 +0000 UTC m=+17.440850460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : secret "serving-cert" not found
Mar 19 11:54:18.823480 master-0 kubenswrapper[7642]: I0319 11:54:18.823218 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-trusted-ca-bundle\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.823566 master-0 kubenswrapper[7642]: I0319 11:54:18.822915 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-image-import-ca\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.823672 master-0 kubenswrapper[7642]: I0319 11:54:18.822966 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-config\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.823760 master-0 kubenswrapper[7642]: I0319 11:54:18.822072 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-node-pullsecrets\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.827473 master-0 kubenswrapper[7642]: I0319 11:54:18.827425 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-encryption-config\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.841411 master-0 kubenswrapper[7642]: I0319 11:54:18.841308 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnxr5\" (UniqueName: \"kubernetes.io/projected/6317d697-61f6-4383-9362-7bd1404e19f3-kube-api-access-fnxr5\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:18.922366 master-0 kubenswrapper[7642]: I0319 11:54:18.922299 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:18.922744 master-0 kubenswrapper[7642]: I0319 11:54:18.922726 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:18.922905 master-0 kubenswrapper[7642]: E0319 11:54:18.922543 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 19 11:54:18.922998 master-0 kubenswrapper[7642]: E0319 11:54:18.922976 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.922940882 +0000 UTC m=+33.040381669 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "node-tuning-operator-tls" not found
Mar 19 11:54:18.923168 master-0 kubenswrapper[7642]: E0319 11:54:18.923142 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 19 11:54:18.923305 master-0 kubenswrapper[7642]: E0319 11:54:18.923293 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.923271552 +0000 UTC m=+33.040712339 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-operator-tls" not found
Mar 19 11:54:18.923369 master-0 kubenswrapper[7642]: E0319 11:54:18.922853 7642 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:54:18.923480 master-0 kubenswrapper[7642]: E0319 11:54:18.923464 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert podName:78d3b2df-3d81-413e-b039-80dc20111eae nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.923449337 +0000 UTC m=+33.040890124 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert") pod "cluster-node-tuning-operator-598fbc5f8f-vcxdb" (UID: "78d3b2df-3d81-413e-b039-80dc20111eae") : secret "performance-addon-operator-webhook-cert" not found
Mar 19 11:54:18.923567 master-0 kubenswrapper[7642]: I0319 11:54:18.922866 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:54:18.923712 master-0 kubenswrapper[7642]: I0319 11:54:18.923698 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:54:18.923796 master-0 kubenswrapper[7642]: E0319 11:54:18.923771 7642 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 19 11:54:18.923852 master-0 kubenswrapper[7642]: E0319 11:54:18.923814 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls podName:f38df702-a256-4f83-90bb-0c0e477a2ce6 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.923803727 +0000 UTC m=+33.041244514 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls") pod "ingress-operator-66b84d69b-59ptg" (UID: "f38df702-a256-4f83-90bb-0c0e477a2ce6") : secret "metrics-tls" not found
Mar 19 11:54:18.923928 master-0 kubenswrapper[7642]: I0319 11:54:18.923785 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:54:18.924037 master-0 kubenswrapper[7642]: I0319 11:54:18.924018 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:54:18.924161 master-0 kubenswrapper[7642]: I0319 11:54:18.924139 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:54:18.924260 master-0 kubenswrapper[7642]: E0319 11:54:18.923957 7642 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 19 11:54:18.924308 master-0 kubenswrapper[7642]: E0319 11:54:18.924273 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs podName:5ae2d717-506b-4d8e-803b-689c1cc3557c nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.924264279 +0000 UTC m=+33.041705066 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs") pod "multus-admission-controller-5dbbb8b86f-z7wgn" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c") : secret "multus-admission-controller-secret" not found
Mar 19 11:54:18.924308 master-0 kubenswrapper[7642]: E0319 11:54:18.924109 7642 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 19 11:54:18.924308 master-0 kubenswrapper[7642]: E0319 11:54:18.924311 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert podName:e90c10f6-36d9-44bd-a4c6-174f12135434 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.924305481 +0000 UTC m=+33.041746268 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert") pod "cluster-baremetal-operator-6f69995874-cwncj" (UID: "e90c10f6-36d9-44bd-a4c6-174f12135434") : secret "cluster-baremetal-webhook-server-cert" not found Mar 19 11:54:18.924450 master-0 kubenswrapper[7642]: E0319 11:54:18.924219 7642 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 19 11:54:18.924450 master-0 kubenswrapper[7642]: E0319 11:54:18.924335 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs podName:3e1b71a7-c9fd-4f03-8d0c-f092dac80989 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.924330511 +0000 UTC m=+33.041771298 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs") pod "network-metrics-daemon-nkd77" (UID: "3e1b71a7-c9fd-4f03-8d0c-f092dac80989") : secret "metrics-daemon-secret" not found Mar 19 11:54:18.924450 master-0 kubenswrapper[7642]: E0319 11:54:18.924452 7642 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 19 11:54:18.924558 master-0 kubenswrapper[7642]: E0319 11:54:18.924475 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert podName:60f846d9-1b3d-4b9d-b3e0-90a238e02640 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.924468895 +0000 UTC m=+33.041909682 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert") pod "cluster-version-operator-56d8475767-nvbgc" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640") : secret "cluster-version-operator-serving-cert" not found Mar 19 11:54:18.924619 master-0 kubenswrapper[7642]: I0319 11:54:18.924246 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" Mar 19 11:54:18.924737 master-0 kubenswrapper[7642]: I0319 11:54:18.924722 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 11:54:18.924818 master-0 kubenswrapper[7642]: I0319 11:54:18.924806 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 11:54:18.924898 master-0 kubenswrapper[7642]: E0319 11:54:18.924834 7642 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 19 11:54:18.924938 master-0 kubenswrapper[7642]: E0319 11:54:18.924910 7642 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics podName:39f479b3-45ff-43f9-ae85-083554a54918 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.924903487 +0000 UTC m=+33.042344274 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics") pod "marketplace-operator-89ccd998f-4h6xn" (UID: "39f479b3-45ff-43f9-ae85-083554a54918") : secret "marketplace-operator-metrics" not found Mar 19 11:54:18.924938 master-0 kubenswrapper[7642]: I0319 11:54:18.924881 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 11:54:18.925027 master-0 kubenswrapper[7642]: I0319 11:54:18.925000 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 11:54:18.925097 master-0 kubenswrapper[7642]: I0319 11:54:18.925075 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:54:18.925178 master-0 
kubenswrapper[7642]: E0319 11:54:18.925164 7642 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 19 11:54:18.925296 master-0 kubenswrapper[7642]: E0319 11:54:18.925278 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls podName:33f6e54f-a750-4c91-b4f5-049275d33cfb nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.925260027 +0000 UTC m=+33.042701004 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls") pod "dns-operator-9c5679d8f-m5dt4" (UID: "33f6e54f-a750-4c91-b4f5-049275d33cfb") : secret "metrics-tls" not found Mar 19 11:54:18.925391 master-0 kubenswrapper[7642]: E0319 11:54:18.925198 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 19 11:54:18.925513 master-0 kubenswrapper[7642]: E0319 11:54:18.925499 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.925490034 +0000 UTC m=+33.042930821 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found Mar 19 11:54:18.925594 master-0 kubenswrapper[7642]: E0319 11:54:18.924869 7642 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 19 11:54:18.925692 master-0 kubenswrapper[7642]: E0319 11:54:18.925680 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls podName:131e2573-ef74-4963-bdcf-901310e7a486 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.925671609 +0000 UTC m=+33.043112396 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-58845fbb57-j5rwg" (UID: "131e2573-ef74-4963-bdcf-901310e7a486") : secret "cluster-monitoring-operator-tls" not found Mar 19 11:54:18.925773 master-0 kubenswrapper[7642]: E0319 11:54:18.925228 7642 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 19 11:54:18.925899 master-0 kubenswrapper[7642]: E0319 11:54:18.925883 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls podName:07804be1-d37c-474e-a179-01ac541d9705 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.925870484 +0000 UTC m=+33.043311481 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls") pod "cluster-image-registry-operator-5549dc66cb-5twg4" (UID: "07804be1-d37c-474e-a179-01ac541d9705") : secret "image-registry-operator-tls" not found Mar 19 11:54:19.332758 master-0 kubenswrapper[7642]: I0319 11:54:19.332564 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6" Mar 19 11:54:19.333020 master-0 kubenswrapper[7642]: E0319 11:54:19.332814 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 19 11:54:19.333020 master-0 kubenswrapper[7642]: E0319 11:54:19.332961 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:20.332933922 +0000 UTC m=+18.450374889 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : configmap "audit-0" not found Mar 19 11:54:19.333365 master-0 kubenswrapper[7642]: E0319 11:54:19.333319 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Mar 19 11:54:19.333464 master-0 kubenswrapper[7642]: E0319 11:54:19.333437 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. 
No retries permitted until 2026-03-19 11:54:20.333409035 +0000 UTC m=+18.450849852 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : configmap "etcd-serving-ca" not found Mar 19 11:54:19.333520 master-0 kubenswrapper[7642]: I0319 11:54:19.333174 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6" Mar 19 11:54:19.333560 master-0 kubenswrapper[7642]: I0319 11:54:19.333530 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6" Mar 19 11:54:19.333728 master-0 kubenswrapper[7642]: E0319 11:54:19.333699 7642 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 11:54:19.333884 master-0 kubenswrapper[7642]: E0319 11:54:19.333858 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:20.333823697 +0000 UTC m=+18.451264604 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : secret "serving-cert" not found Mar 19 11:54:19.333979 master-0 kubenswrapper[7642]: I0319 11:54:19.333957 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6" Mar 19 11:54:19.334126 master-0 kubenswrapper[7642]: E0319 11:54:19.334093 7642 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 19 11:54:19.334188 master-0 kubenswrapper[7642]: E0319 11:54:19.334161 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:20.334146616 +0000 UTC m=+18.451587443 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : secret "etcd-client" not found Mar 19 11:54:20.274274 master-0 kubenswrapper[7642]: I0319 11:54:20.272543 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" event={"ID":"4705c8f3-89d4-42f2-ad14-e591c5380b9e","Type":"ContainerStarted","Data":"8cf7c92799c042639046f9139208de62b93f5c3c4be8bc7881686ff116d251d3"} Mar 19 11:54:20.274274 master-0 kubenswrapper[7642]: I0319 11:54:20.274004 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 11:54:20.351266 master-0 kubenswrapper[7642]: I0319 11:54:20.351117 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6" Mar 19 11:54:20.351266 master-0 kubenswrapper[7642]: I0319 11:54:20.351206 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6" Mar 19 11:54:20.351510 master-0 kubenswrapper[7642]: I0319 11:54:20.351284 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " 
pod="openshift-apiserver/apiserver-c765cd67b-rrcj6" Mar 19 11:54:20.351510 master-0 kubenswrapper[7642]: I0319 11:54:20.351321 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6" Mar 19 11:54:20.351510 master-0 kubenswrapper[7642]: E0319 11:54:20.351498 7642 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 19 11:54:20.351602 master-0 kubenswrapper[7642]: E0319 11:54:20.351557 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:22.351538126 +0000 UTC m=+20.468978913 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : secret "serving-cert" not found Mar 19 11:54:20.352046 master-0 kubenswrapper[7642]: E0319 11:54:20.352021 7642 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 19 11:54:20.352122 master-0 kubenswrapper[7642]: E0319 11:54:20.352054 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:22.352046231 +0000 UTC m=+20.469487008 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : secret "etcd-client" not found Mar 19 11:54:20.352122 master-0 kubenswrapper[7642]: E0319 11:54:20.352090 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 19 11:54:20.352122 master-0 kubenswrapper[7642]: E0319 11:54:20.352113 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:22.352105852 +0000 UTC m=+20.469546639 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : configmap "audit-0" not found Mar 19 11:54:20.352237 master-0 kubenswrapper[7642]: E0319 11:54:20.352147 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Mar 19 11:54:20.352237 master-0 kubenswrapper[7642]: E0319 11:54:20.352173 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:22.352165204 +0000 UTC m=+20.469605991 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : configmap "etcd-serving-ca" not found Mar 19 11:54:20.907051 master-0 kubenswrapper[7642]: I0319 11:54:20.906981 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-pdjtv"] Mar 19 11:54:20.907605 master-0 kubenswrapper[7642]: I0319 11:54:20.907575 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 11:54:20.909904 master-0 kubenswrapper[7642]: I0319 11:54:20.909853 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 11:54:20.910288 master-0 kubenswrapper[7642]: I0319 11:54:20.910226 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 11:54:20.910552 master-0 kubenswrapper[7642]: I0319 11:54:20.910522 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 11:54:20.910787 master-0 kubenswrapper[7642]: I0319 11:54:20.910759 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 11:54:20.917069 master-0 kubenswrapper[7642]: I0319 11:54:20.917001 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-pdjtv"] Mar 19 11:54:21.061251 master-0 kubenswrapper[7642]: I0319 11:54:21.061177 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-cabundle\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " 
pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 11:54:21.061588 master-0 kubenswrapper[7642]: I0319 11:54:21.061274 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-key\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 11:54:21.061588 master-0 kubenswrapper[7642]: I0319 11:54:21.061430 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h4cz\" (UniqueName: \"kubernetes.io/projected/4e7b04dc-e85d-41b8-959c-87b1b24731a1-kube-api-access-9h4cz\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 11:54:21.163024 master-0 kubenswrapper[7642]: I0319 11:54:21.162836 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h4cz\" (UniqueName: \"kubernetes.io/projected/4e7b04dc-e85d-41b8-959c-87b1b24731a1-kube-api-access-9h4cz\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 11:54:21.164204 master-0 kubenswrapper[7642]: I0319 11:54:21.164156 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-key\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 11:54:21.164204 master-0 kubenswrapper[7642]: I0319 11:54:21.164192 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-cabundle\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 11:54:21.167024 master-0 kubenswrapper[7642]: I0319 11:54:21.166488 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-cabundle\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 11:54:21.177887 master-0 kubenswrapper[7642]: I0319 11:54:21.177784 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-key\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 11:54:21.187924 master-0 kubenswrapper[7642]: I0319 11:54:21.187858 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h4cz\" (UniqueName: \"kubernetes.io/projected/4e7b04dc-e85d-41b8-959c-87b1b24731a1-kube-api-access-9h4cz\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 11:54:21.237802 master-0 kubenswrapper[7642]: I0319 11:54:21.237696 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv"
Mar 19 11:54:21.279358 master-0 kubenswrapper[7642]: I0319 11:54:21.279290 7642 generic.go:334] "Generic (PLEG): container finished" podID="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" containerID="47183b768a7e3ff5da057dffa4e751f7869a36ae402360723f1788dfd4f7f643" exitCode=0
Mar 19 11:54:21.280390 master-0 kubenswrapper[7642]: I0319 11:54:21.279397 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerDied","Data":"47183b768a7e3ff5da057dffa4e751f7869a36ae402360723f1788dfd4f7f643"}
Mar 19 11:54:21.497789 master-0 kubenswrapper[7642]: I0319 11:54:21.497293 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-79bc6b8d76-pdjtv"]
Mar 19 11:54:21.503687 master-0 kubenswrapper[7642]: W0319 11:54:21.503625 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e7b04dc_e85d_41b8_959c_87b1b24731a1.slice/crio-3e041b219895e4a22c84b8979a6ff95eaead826c80bb84644fce5e3855731eba WatchSource:0}: Error finding container 3e041b219895e4a22c84b8979a6ff95eaead826c80bb84644fce5e3855731eba: Status 404 returned error can't find the container with id 3e041b219895e4a22c84b8979a6ff95eaead826c80bb84644fce5e3855731eba
Mar 19 11:54:21.570736 master-0 kubenswrapper[7642]: I0319 11:54:21.570682 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:21.570916 master-0 kubenswrapper[7642]: I0319 11:54:21.570772 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"
Mar 19 11:54:21.570966 master-0 kubenswrapper[7642]: E0319 11:54:21.570930 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:21.571068 master-0 kubenswrapper[7642]: E0319 11:54:21.571042 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:29.57101184 +0000 UTC m=+27.688452827 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : configmap "client-ca" not found
Mar 19 11:54:21.571267 master-0 kubenswrapper[7642]: E0319 11:54:21.571186 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:21.571322 master-0 kubenswrapper[7642]: E0319 11:54:21.571303 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:29.571279047 +0000 UTC m=+27.688719834 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : secret "serving-cert" not found
Mar 19 11:54:22.293672 master-0 kubenswrapper[7642]: I0319 11:54:22.293531 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" event={"ID":"aab47a39-a18b-4350-b141-4d5de2d8e96f","Type":"ContainerStarted","Data":"7ed370adba82a4b171cad234b3d380f3d393808daf0a0a8fd7c7e46ac33f3ee6"}
Mar 19 11:54:22.310730 master-0 kubenswrapper[7642]: I0319 11:54:22.309823 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" event={"ID":"4e7b04dc-e85d-41b8-959c-87b1b24731a1","Type":"ContainerStarted","Data":"b0d21e5f0486b201b3c06d890cda932f5d618ab09a01f42f7cc5e1ab3e144300"}
Mar 19 11:54:22.310730 master-0 kubenswrapper[7642]: I0319 11:54:22.309881 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" event={"ID":"4e7b04dc-e85d-41b8-959c-87b1b24731a1","Type":"ContainerStarted","Data":"3e041b219895e4a22c84b8979a6ff95eaead826c80bb84644fce5e3855731eba"}
Mar 19 11:54:22.388532 master-0 kubenswrapper[7642]: I0319 11:54:22.387980 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:22.389164 master-0 kubenswrapper[7642]: I0319 11:54:22.389143 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:22.389476 master-0 kubenswrapper[7642]: I0319 11:54:22.389458 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:22.389589 master-0 kubenswrapper[7642]: I0319 11:54:22.389574 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert\") pod \"apiserver-c765cd67b-rrcj6\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") " pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:22.390714 master-0 kubenswrapper[7642]: E0319 11:54:22.388336 7642 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 19 11:54:22.390801 master-0 kubenswrapper[7642]: E0319 11:54:22.390782 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:26.390757002 +0000 UTC m=+24.508197789 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : secret "etcd-client" not found
Mar 19 11:54:22.391773 master-0 kubenswrapper[7642]: E0319 11:54:22.391278 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 19 11:54:22.391773 master-0 kubenswrapper[7642]: E0319 11:54:22.391312 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:26.391301977 +0000 UTC m=+24.508742764 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : configmap "audit-0" not found
Mar 19 11:54:22.391773 master-0 kubenswrapper[7642]: E0319 11:54:22.391483 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 19 11:54:22.391773 master-0 kubenswrapper[7642]: E0319 11:54:22.391509 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:26.391500862 +0000 UTC m=+24.508941649 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : configmap "etcd-serving-ca" not found
Mar 19 11:54:22.391773 master-0 kubenswrapper[7642]: E0319 11:54:22.391565 7642 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 11:54:22.391773 master-0 kubenswrapper[7642]: E0319 11:54:22.391589 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert podName:6317d697-61f6-4383-9362-7bd1404e19f3 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:26.391581235 +0000 UTC m=+24.509022022 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert") pod "apiserver-c765cd67b-rrcj6" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3") : secret "serving-cert" not found
Mar 19 11:54:23.995378 master-0 kubenswrapper[7642]: I0319 11:54:23.995275 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" podStartSLOduration=3.995251307 podStartE2EDuration="3.995251307s" podCreationTimestamp="2026-03-19 11:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:22.359112098 +0000 UTC m=+20.476552885" watchObservedRunningTime="2026-03-19 11:54:23.995251307 +0000 UTC m=+22.112692084"
Mar 19 11:54:23.996837 master-0 kubenswrapper[7642]: I0319 11:54:23.995993 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-rrcj6"]
Mar 19 11:54:23.996837 master-0 kubenswrapper[7642]: E0319 11:54:23.996372 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit etcd-client etcd-serving-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-c765cd67b-rrcj6" podUID="6317d697-61f6-4383-9362-7bd1404e19f3"
Mar 19 11:54:24.322018 master-0 kubenswrapper[7642]: I0319 11:54:24.321859 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:24.335424 master-0 kubenswrapper[7642]: I0319 11:54:24.335354 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:24.525773 master-0 kubenswrapper[7642]: I0319 11:54:24.525698 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-audit-dir\") pod \"6317d697-61f6-4383-9362-7bd1404e19f3\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") "
Mar 19 11:54:24.525773 master-0 kubenswrapper[7642]: I0319 11:54:24.525775 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-image-import-ca\") pod \"6317d697-61f6-4383-9362-7bd1404e19f3\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") "
Mar 19 11:54:24.525773 master-0 kubenswrapper[7642]: I0319 11:54:24.525796 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-node-pullsecrets\") pod \"6317d697-61f6-4383-9362-7bd1404e19f3\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") "
Mar 19 11:54:24.526171 master-0 kubenswrapper[7642]: I0319 11:54:24.525818 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnxr5\" (UniqueName: \"kubernetes.io/projected/6317d697-61f6-4383-9362-7bd1404e19f3-kube-api-access-fnxr5\") pod \"6317d697-61f6-4383-9362-7bd1404e19f3\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") "
Mar 19 11:54:24.526171 master-0 kubenswrapper[7642]: I0319 11:54:24.525847 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-encryption-config\") pod \"6317d697-61f6-4383-9362-7bd1404e19f3\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") "
Mar 19 11:54:24.526171 master-0 kubenswrapper[7642]: I0319 11:54:24.525871 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-trusted-ca-bundle\") pod \"6317d697-61f6-4383-9362-7bd1404e19f3\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") "
Mar 19 11:54:24.526171 master-0 kubenswrapper[7642]: I0319 11:54:24.525890 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-config\") pod \"6317d697-61f6-4383-9362-7bd1404e19f3\" (UID: \"6317d697-61f6-4383-9362-7bd1404e19f3\") "
Mar 19 11:54:24.526660 master-0 kubenswrapper[7642]: I0319 11:54:24.526597 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6317d697-61f6-4383-9362-7bd1404e19f3" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:54:24.527703 master-0 kubenswrapper[7642]: I0319 11:54:24.526987 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6317d697-61f6-4383-9362-7bd1404e19f3" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:54:24.528550 master-0 kubenswrapper[7642]: I0319 11:54:24.527404 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "6317d697-61f6-4383-9362-7bd1404e19f3" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:24.528604 master-0 kubenswrapper[7642]: I0319 11:54:24.527931 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6317d697-61f6-4383-9362-7bd1404e19f3" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:24.528772 master-0 kubenswrapper[7642]: I0319 11:54:24.528747 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-config" (OuterVolumeSpecName: "config") pod "6317d697-61f6-4383-9362-7bd1404e19f3" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:24.534028 master-0 kubenswrapper[7642]: I0319 11:54:24.533968 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6317d697-61f6-4383-9362-7bd1404e19f3-kube-api-access-fnxr5" (OuterVolumeSpecName: "kube-api-access-fnxr5") pod "6317d697-61f6-4383-9362-7bd1404e19f3" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3"). InnerVolumeSpecName "kube-api-access-fnxr5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:54:24.536610 master-0 kubenswrapper[7642]: I0319 11:54:24.536196 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "6317d697-61f6-4383-9362-7bd1404e19f3" (UID: "6317d697-61f6-4383-9362-7bd1404e19f3"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:54:24.627310 master-0 kubenswrapper[7642]: I0319 11:54:24.627238 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:24.627310 master-0 kubenswrapper[7642]: I0319 11:54:24.627284 7642 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:24.627310 master-0 kubenswrapper[7642]: I0319 11:54:24.627295 7642 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-image-import-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:24.627310 master-0 kubenswrapper[7642]: I0319 11:54:24.627305 7642 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6317d697-61f6-4383-9362-7bd1404e19f3-node-pullsecrets\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:24.627310 master-0 kubenswrapper[7642]: I0319 11:54:24.627314 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnxr5\" (UniqueName: \"kubernetes.io/projected/6317d697-61f6-4383-9362-7bd1404e19f3-kube-api-access-fnxr5\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:24.627310 master-0 kubenswrapper[7642]: I0319 11:54:24.627326 7642 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-encryption-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:24.627716 master-0 kubenswrapper[7642]: I0319 11:54:24.627338 7642 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:24.877502 master-0 kubenswrapper[7642]: I0319 11:54:24.877380 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:54:25.330443 master-0 kubenswrapper[7642]: I0319 11:54:25.329937 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerStarted","Data":"63b90693b783d2344a0175569017503f87f560c1b75623dba968fbe7562bf39a"}
Mar 19 11:54:25.335019 master-0 kubenswrapper[7642]: I0319 11:54:25.334964 7642 generic.go:334] "Generic (PLEG): container finished" podID="4705c8f3-89d4-42f2-ad14-e591c5380b9e" containerID="8cf7c92799c042639046f9139208de62b93f5c3c4be8bc7881686ff116d251d3" exitCode=0
Mar 19 11:54:25.335212 master-0 kubenswrapper[7642]: I0319 11:54:25.335085 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-c765cd67b-rrcj6"
Mar 19 11:54:25.338225 master-0 kubenswrapper[7642]: I0319 11:54:25.338176 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" event={"ID":"4705c8f3-89d4-42f2-ad14-e591c5380b9e","Type":"ContainerDied","Data":"8cf7c92799c042639046f9139208de62b93f5c3c4be8bc7881686ff116d251d3"}
Mar 19 11:54:25.338592 master-0 kubenswrapper[7642]: I0319 11:54:25.338460 7642 scope.go:117] "RemoveContainer" containerID="8cf7c92799c042639046f9139208de62b93f5c3c4be8bc7881686ff116d251d3"
Mar 19 11:54:25.472906 master-0 kubenswrapper[7642]: I0319 11:54:25.470916 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-ff96cdc79-z9zjw"]
Mar 19 11:54:25.472906 master-0 kubenswrapper[7642]: I0319 11:54:25.471698 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-rrcj6"]
Mar 19 11:54:25.472906 master-0 kubenswrapper[7642]: I0319 11:54:25.471801 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.478040 master-0 kubenswrapper[7642]: I0319 11:54:25.477661 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 11:54:25.478040 master-0 kubenswrapper[7642]: I0319 11:54:25.477910 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 11:54:25.478221 master-0 kubenswrapper[7642]: I0319 11:54:25.478056 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 11:54:25.478221 master-0 kubenswrapper[7642]: I0319 11:54:25.478179 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 11:54:25.478707 master-0 kubenswrapper[7642]: I0319 11:54:25.478332 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 11:54:25.478707 master-0 kubenswrapper[7642]: I0319 11:54:25.478536 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 11:54:25.478815 master-0 kubenswrapper[7642]: I0319 11:54:25.478726 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 11:54:25.479142 master-0 kubenswrapper[7642]: I0319 11:54:25.479070 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 11:54:25.479142 master-0 kubenswrapper[7642]: I0319 11:54:25.479081 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 11:54:25.483671 master-0 kubenswrapper[7642]: I0319 11:54:25.482219 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-ff96cdc79-z9zjw"]
Mar 19 11:54:25.492251 master-0 kubenswrapper[7642]: I0319 11:54:25.492211 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 11:54:25.518102 master-0 kubenswrapper[7642]: I0319 11:54:25.515913 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-c765cd67b-rrcj6"]
Mar 19 11:54:25.551177 master-0 kubenswrapper[7642]: I0319 11:54:25.550880 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-trusted-ca-bundle\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551177 master-0 kubenswrapper[7642]: I0319 11:54:25.550958 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551177 master-0 kubenswrapper[7642]: I0319 11:54:25.551033 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:25.551177 master-0 kubenswrapper[7642]: I0319 11:54:25.551104 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551177 master-0 kubenswrapper[7642]: I0319 11:54:25.551142 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdf5k\" (UniqueName: \"kubernetes.io/projected/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-kube-api-access-vdf5k\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551177 master-0 kubenswrapper[7642]: I0319 11:54:25.551177 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551201 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-config\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551227 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-serving-cert\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551252 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-image-import-ca\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551278 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-encryption-config\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551313 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit-dir\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551370 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-node-pullsecrets\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551420 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert\") pod \"route-controller-manager-86458b9cf7-7kkbp\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551489 7642 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-serving-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551507 7642 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-etcd-client\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551519 7642 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6317d697-61f6-4383-9362-7bd1404e19f3-audit\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:25.551592 master-0 kubenswrapper[7642]: I0319 11:54:25.551553 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6317d697-61f6-4383-9362-7bd1404e19f3-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:25.551910 master-0 kubenswrapper[7642]: E0319 11:54:25.551692 7642 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:25.551910 master-0 kubenswrapper[7642]: E0319 11:54:25.551760 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:41.551738322 +0000 UTC m=+39.669179109 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : secret "serving-cert" not found
Mar 19 11:54:25.551910 master-0 kubenswrapper[7642]: E0319 11:54:25.551834 7642 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 19 11:54:25.551910 master-0 kubenswrapper[7642]: E0319 11:54:25.551857 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca podName:9c3ceff4-e4f6-4149-b31b-bbce53d704be nodeName:}" failed. No retries permitted until 2026-03-19 11:54:41.551849855 +0000 UTC m=+39.669290642 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca") pod "route-controller-manager-86458b9cf7-7kkbp" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be") : configmap "client-ca" not found
Mar 19 11:54:25.589049 master-0 kubenswrapper[7642]: I0319 11:54:25.588960 7642 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 11:54:25.652508 master-0 kubenswrapper[7642]: I0319 11:54:25.652445 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-serving-cert\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.653553 master-0 kubenswrapper[7642]: I0319 11:54:25.653500 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-image-import-ca\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.653605 master-0 kubenswrapper[7642]: I0319 11:54:25.653564 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-encryption-config\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.653654 master-0 kubenswrapper[7642]: I0319 11:54:25.653606 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit-dir\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.653719 master-0 kubenswrapper[7642]: I0319 11:54:25.653693 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-node-pullsecrets\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.653803 master-0 kubenswrapper[7642]: I0319 11:54:25.653772 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-node-pullsecrets\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.653860 master-0 kubenswrapper[7642]: I0319 11:54:25.653772 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit-dir\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.654055 master-0 kubenswrapper[7642]: I0319 11:54:25.653997 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-trusted-ca-bundle\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.654093 master-0 kubenswrapper[7642]: I0319 11:54:25.654072 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.654217 master-0 kubenswrapper[7642]: I0319 11:54:25.654184 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.654260 master-0 kubenswrapper[7642]: I0319 11:54:25.654230 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdf5k\" (UniqueName: \"kubernetes.io/projected/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-kube-api-access-vdf5k\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.654293 master-0 kubenswrapper[7642]: I0319 11:54:25.654269 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.654329 master-0 kubenswrapper[7642]: I0319 11:54:25.654295 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-config\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.654565 master-0 kubenswrapper[7642]: E0319 11:54:25.654533 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 19 11:54:25.654621 master-0 kubenswrapper[7642]: E0319 11:54:25.654605 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca podName:d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:26.154586044 +0000 UTC m=+24.272026831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca") pod "apiserver-ff96cdc79-z9zjw" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26") : configmap "etcd-serving-ca" not found
Mar 19 11:54:25.654857 master-0 kubenswrapper[7642]: E0319 11:54:25.654808 7642 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 19 11:54:25.654963 master-0 kubenswrapper[7642]: E0319 11:54:25.654943 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client podName:d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:26.154904683 +0000 UTC m=+24.272345670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client") pod "apiserver-ff96cdc79-z9zjw" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26") : secret "etcd-client" not found
Mar 19 11:54:25.655263 master-0 kubenswrapper[7642]: I0319 11:54:25.655231 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-config\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.655576 master-0 kubenswrapper[7642]: I0319 11:54:25.655518 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.655804 master-0 kubenswrapper[7642]: I0319 11:54:25.655766 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-trusted-ca-bundle\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.656118 master-0 kubenswrapper[7642]: I0319 11:54:25.656079 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-image-import-ca\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:25.657804 master-0 kubenswrapper[7642]: I0319
11:54:25.657765 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-serving-cert\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:25.658866 master-0 kubenswrapper[7642]: I0319 11:54:25.658830 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-encryption-config\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:25.854945 master-0 kubenswrapper[7642]: I0319 11:54:25.854793 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdf5k\" (UniqueName: \"kubernetes.io/projected/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-kube-api-access-vdf5k\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:26.072555 master-0 kubenswrapper[7642]: I0319 11:54:26.072456 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6317d697-61f6-4383-9362-7bd1404e19f3" path="/var/lib/kubelet/pods/6317d697-61f6-4383-9362-7bd1404e19f3/volumes" Mar 19 11:54:26.160274 master-0 kubenswrapper[7642]: I0319 11:54:26.160197 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:26.160520 master-0 kubenswrapper[7642]: E0319 11:54:26.160408 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" 
not found Mar 19 11:54:26.160520 master-0 kubenswrapper[7642]: E0319 11:54:26.160519 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca podName:d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:27.160495841 +0000 UTC m=+25.277936618 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca") pod "apiserver-ff96cdc79-z9zjw" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26") : configmap "etcd-serving-ca" not found Mar 19 11:54:26.160704 master-0 kubenswrapper[7642]: I0319 11:54:26.160671 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:26.160882 master-0 kubenswrapper[7642]: E0319 11:54:26.160857 7642 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 19 11:54:26.160882 master-0 kubenswrapper[7642]: E0319 11:54:26.160892 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client podName:d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:27.160884952 +0000 UTC m=+25.278325739 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client") pod "apiserver-ff96cdc79-z9zjw" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26") : secret "etcd-client" not found Mar 19 11:54:26.341549 master-0 kubenswrapper[7642]: I0319 11:54:26.341490 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" event={"ID":"4705c8f3-89d4-42f2-ad14-e591c5380b9e","Type":"ContainerStarted","Data":"6247fd87460d07ebeb5aa88a818055c6dd5c137e669ed6cad1d8a519f682618d"} Mar 19 11:54:26.342535 master-0 kubenswrapper[7642]: I0319 11:54:26.342510 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 11:54:26.856956 master-0 kubenswrapper[7642]: I0319 11:54:26.856409 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:54:26.857751 master-0 kubenswrapper[7642]: I0319 11:54:26.857716 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:26.864605 master-0 kubenswrapper[7642]: I0319 11:54:26.864571 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 11:54:26.867036 master-0 kubenswrapper[7642]: I0319 11:54:26.866950 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:54:26.969273 master-0 kubenswrapper[7642]: I0319 11:54:26.969202 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:26.969763 master-0 kubenswrapper[7642]: I0319 11:54:26.969731 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-var-lock\") pod \"installer-1-master-0\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:26.970031 master-0 kubenswrapper[7642]: I0319 11:54:26.970000 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6660a89a-cb33-4e20-aa58-93c4838e44c3-kube-api-access\") pod \"installer-1-master-0\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:27.071667 master-0 kubenswrapper[7642]: I0319 11:54:27.071586 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-kubelet-dir\") pod \"installer-1-master-0\" (UID: 
\"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:27.071962 master-0 kubenswrapper[7642]: I0319 11:54:27.071692 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-var-lock\") pod \"installer-1-master-0\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:27.071962 master-0 kubenswrapper[7642]: I0319 11:54:27.071741 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6660a89a-cb33-4e20-aa58-93c4838e44c3-kube-api-access\") pod \"installer-1-master-0\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:27.071962 master-0 kubenswrapper[7642]: I0319 11:54:27.071933 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-var-lock\") pod \"installer-1-master-0\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:27.072285 master-0 kubenswrapper[7642]: I0319 11:54:27.072236 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:27.102041 master-0 kubenswrapper[7642]: I0319 11:54:27.101947 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6660a89a-cb33-4e20-aa58-93c4838e44c3-kube-api-access\") pod \"installer-1-master-0\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " 
pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:27.173478 master-0 kubenswrapper[7642]: I0319 11:54:27.173383 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:27.173692 master-0 kubenswrapper[7642]: I0319 11:54:27.173543 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:27.173839 master-0 kubenswrapper[7642]: E0319 11:54:27.173806 7642 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Mar 19 11:54:27.173889 master-0 kubenswrapper[7642]: E0319 11:54:27.173880 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca podName:d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:29.17386218 +0000 UTC m=+27.291302967 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca") pod "apiserver-ff96cdc79-z9zjw" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26") : configmap "etcd-serving-ca" not found Mar 19 11:54:27.173980 master-0 kubenswrapper[7642]: E0319 11:54:27.173940 7642 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 19 11:54:27.174091 master-0 kubenswrapper[7642]: E0319 11:54:27.174064 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client podName:d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:29.174033805 +0000 UTC m=+27.291474692 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client") pod "apiserver-ff96cdc79-z9zjw" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26") : secret "etcd-client" not found Mar 19 11:54:27.184801 master-0 kubenswrapper[7642]: I0319 11:54:27.184734 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:54:27.401243 master-0 kubenswrapper[7642]: I0319 11:54:27.401153 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:54:27.703842 master-0 kubenswrapper[7642]: I0319 11:54:27.703317 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"] Mar 19 11:54:27.704122 master-0 kubenswrapper[7642]: E0319 11:54:27.704098 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" podUID="ee60c527-0fff-43d5-825d-230971422b38" Mar 19 11:54:27.720500 master-0 kubenswrapper[7642]: I0319 11:54:27.720435 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"] Mar 19 11:54:27.720828 master-0 kubenswrapper[7642]: E0319 11:54:27.720792 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp" podUID="9c3ceff4-e4f6-4149-b31b-bbce53d704be" Mar 19 11:54:28.370016 master-0 kubenswrapper[7642]: I0319 11:54:28.369948 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:28.370335 master-0 kubenswrapper[7642]: I0319 11:54:28.370018 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp" Mar 19 11:54:28.370335 master-0 kubenswrapper[7642]: I0319 11:54:28.369958 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"6660a89a-cb33-4e20-aa58-93c4838e44c3","Type":"ContainerStarted","Data":"5db8b9dc9974fa7fae69b5ed92758705aaeaec268deaa769d2d9d84f72d9569a"} Mar 19 11:54:28.370335 master-0 kubenswrapper[7642]: I0319 11:54:28.370267 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"6660a89a-cb33-4e20-aa58-93c4838e44c3","Type":"ContainerStarted","Data":"34c8be87db4beba6862175ed9b1b7317ca6d98647db2c96dcbb21d3477fb6608"} Mar 19 11:54:28.377610 master-0 kubenswrapper[7642]: I0319 11:54:28.377550 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 11:54:28.381139 master-0 kubenswrapper[7642]: I0319 11:54:28.381101 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:28.389121 master-0 kubenswrapper[7642]: I0319 11:54:28.389086 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp" Mar 19 11:54:28.406783 master-0 kubenswrapper[7642]: I0319 11:54:28.406617 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=2.406587414 podStartE2EDuration="2.406587414s" podCreationTimestamp="2026-03-19 11:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:28.385416352 +0000 UTC m=+26.502857159" watchObservedRunningTime="2026-03-19 11:54:28.406587414 +0000 UTC m=+26.524028211" Mar 19 11:54:28.493028 master-0 kubenswrapper[7642]: I0319 11:54:28.492981 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h7dc\" (UniqueName: \"kubernetes.io/projected/ee60c527-0fff-43d5-825d-230971422b38-kube-api-access-8h7dc\") pod \"ee60c527-0fff-43d5-825d-230971422b38\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " Mar 19 11:54:28.493346 master-0 kubenswrapper[7642]: I0319 11:54:28.493332 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8224\" (UniqueName: \"kubernetes.io/projected/9c3ceff4-e4f6-4149-b31b-bbce53d704be-kube-api-access-m8224\") pod \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " Mar 19 11:54:28.493488 master-0 kubenswrapper[7642]: I0319 11:54:28.493473 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-proxy-ca-bundles\") pod \"ee60c527-0fff-43d5-825d-230971422b38\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " Mar 19 11:54:28.493606 master-0 kubenswrapper[7642]: I0319 11:54:28.493591 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-config\") pod \"ee60c527-0fff-43d5-825d-230971422b38\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " Mar 19 11:54:28.493755 master-0 kubenswrapper[7642]: I0319 11:54:28.493739 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-config\") pod \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\" (UID: \"9c3ceff4-e4f6-4149-b31b-bbce53d704be\") " Mar 19 11:54:28.494043 master-0 kubenswrapper[7642]: I0319 11:54:28.494010 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ee60c527-0fff-43d5-825d-230971422b38" (UID: "ee60c527-0fff-43d5-825d-230971422b38"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:28.494195 master-0 kubenswrapper[7642]: I0319 11:54:28.494119 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-config" (OuterVolumeSpecName: "config") pod "ee60c527-0fff-43d5-825d-230971422b38" (UID: "ee60c527-0fff-43d5-825d-230971422b38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:28.494195 master-0 kubenswrapper[7642]: I0319 11:54:28.494179 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-config" (OuterVolumeSpecName: "config") pod "9c3ceff4-e4f6-4149-b31b-bbce53d704be" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:28.494835 master-0 kubenswrapper[7642]: I0319 11:54:28.494808 7642 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:28.494835 master-0 kubenswrapper[7642]: I0319 11:54:28.494834 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:28.495021 master-0 kubenswrapper[7642]: I0319 11:54:28.494847 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:28.498480 master-0 kubenswrapper[7642]: I0319 11:54:28.497981 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3ceff4-e4f6-4149-b31b-bbce53d704be-kube-api-access-m8224" (OuterVolumeSpecName: "kube-api-access-m8224") pod "9c3ceff4-e4f6-4149-b31b-bbce53d704be" (UID: "9c3ceff4-e4f6-4149-b31b-bbce53d704be"). InnerVolumeSpecName "kube-api-access-m8224". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:28.498480 master-0 kubenswrapper[7642]: I0319 11:54:28.498210 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee60c527-0fff-43d5-825d-230971422b38-kube-api-access-8h7dc" (OuterVolumeSpecName: "kube-api-access-8h7dc") pod "ee60c527-0fff-43d5-825d-230971422b38" (UID: "ee60c527-0fff-43d5-825d-230971422b38"). InnerVolumeSpecName "kube-api-access-8h7dc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:28.596494 master-0 kubenswrapper[7642]: I0319 11:54:28.596420 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8224\" (UniqueName: \"kubernetes.io/projected/9c3ceff4-e4f6-4149-b31b-bbce53d704be-kube-api-access-m8224\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:28.596494 master-0 kubenswrapper[7642]: I0319 11:54:28.596461 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h7dc\" (UniqueName: \"kubernetes.io/projected/ee60c527-0fff-43d5-825d-230971422b38-kube-api-access-8h7dc\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:29.202194 master-0 kubenswrapper[7642]: I0319 11:54:29.202094 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:29.202667 master-0 kubenswrapper[7642]: E0319 11:54:29.202317 7642 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 19 11:54:29.202667 master-0 kubenswrapper[7642]: I0319 11:54:29.202435 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:29.202667 master-0 kubenswrapper[7642]: E0319 11:54:29.202449 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client podName:d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26 nodeName:}" failed. 
No retries permitted until 2026-03-19 11:54:33.202424818 +0000 UTC m=+31.319865605 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client") pod "apiserver-ff96cdc79-z9zjw" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26") : secret "etcd-client" not found Mar 19 11:54:29.203377 master-0 kubenswrapper[7642]: I0319 11:54:29.203329 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:29.374319 master-0 kubenswrapper[7642]: I0319 11:54:29.374244 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp" Mar 19 11:54:29.376935 master-0 kubenswrapper[7642]: I0319 11:54:29.376866 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:29.609337 master-0 kubenswrapper[7642]: I0319 11:54:29.609117 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:29.609337 master-0 kubenswrapper[7642]: E0319 11:54:29.609306 7642 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: object "openshift-controller-manager"/"client-ca" not registered Mar 19 11:54:29.610121 master-0 kubenswrapper[7642]: E0319 11:54:29.609423 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:45.609393533 +0000 UTC m=+43.726834500 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : object "openshift-controller-manager"/"client-ca" not registered Mar 19 11:54:29.610121 master-0 kubenswrapper[7642]: I0319 11:54:29.609546 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert\") pod \"controller-manager-7fd69bc8f7-bd8js\" (UID: \"ee60c527-0fff-43d5-825d-230971422b38\") " pod="openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js" Mar 19 11:54:29.610121 master-0 kubenswrapper[7642]: E0319 11:54:29.609784 7642 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: object "openshift-controller-manager"/"serving-cert" not registered Mar 19 11:54:29.610121 master-0 kubenswrapper[7642]: E0319 11:54:29.609811 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert podName:ee60c527-0fff-43d5-825d-230971422b38 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:45.609804074 +0000 UTC m=+43.727244861 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert") pod "controller-manager-7fd69bc8f7-bd8js" (UID: "ee60c527-0fff-43d5-825d-230971422b38") : object "openshift-controller-manager"/"serving-cert" not registered Mar 19 11:54:30.745369 master-0 kubenswrapper[7642]: I0319 11:54:30.745089 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69dc5c9779-942tz"] Mar 19 11:54:30.746161 master-0 kubenswrapper[7642]: I0319 11:54:30.745777 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.748748 master-0 kubenswrapper[7642]: I0319 11:54:30.747932 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 11:54:30.748748 master-0 kubenswrapper[7642]: I0319 11:54:30.748629 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 11:54:30.749124 master-0 kubenswrapper[7642]: I0319 11:54:30.748908 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 11:54:30.749416 master-0 kubenswrapper[7642]: I0319 11:54:30.749385 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 11:54:30.749793 master-0 kubenswrapper[7642]: I0319 11:54:30.749523 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 11:54:30.755092 master-0 kubenswrapper[7642]: I0319 11:54:30.755048 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 11:54:30.827914 master-0 kubenswrapper[7642]: I0319 11:54:30.827825 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-proxy-ca-bundles\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.827914 master-0 kubenswrapper[7642]: I0319 11:54:30.827907 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac8349ad-5063-439f-9785-fa90df81836b-serving-cert\") pod 
\"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.828223 master-0 kubenswrapper[7642]: I0319 11:54:30.828045 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-config\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.828389 master-0 kubenswrapper[7642]: I0319 11:54:30.828353 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-client-ca\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.828435 master-0 kubenswrapper[7642]: I0319 11:54:30.828390 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b48j\" (UniqueName: \"kubernetes.io/projected/ac8349ad-5063-439f-9785-fa90df81836b-kube-api-access-6b48j\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.929726 master-0 kubenswrapper[7642]: I0319 11:54:30.929645 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-client-ca\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.929726 master-0 kubenswrapper[7642]: I0319 
11:54:30.929704 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b48j\" (UniqueName: \"kubernetes.io/projected/ac8349ad-5063-439f-9785-fa90df81836b-kube-api-access-6b48j\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.930766 master-0 kubenswrapper[7642]: I0319 11:54:30.930100 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-proxy-ca-bundles\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.930766 master-0 kubenswrapper[7642]: I0319 11:54:30.930186 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac8349ad-5063-439f-9785-fa90df81836b-serving-cert\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.930766 master-0 kubenswrapper[7642]: I0319 11:54:30.930228 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-config\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.931394 master-0 kubenswrapper[7642]: I0319 11:54:30.931367 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-client-ca\") pod \"controller-manager-69dc5c9779-942tz\" (UID: 
\"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.932374 master-0 kubenswrapper[7642]: I0319 11:54:30.932327 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-config\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.933523 master-0 kubenswrapper[7642]: I0319 11:54:30.933471 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-proxy-ca-bundles\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:30.937287 master-0 kubenswrapper[7642]: I0319 11:54:30.937254 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac8349ad-5063-439f-9785-fa90df81836b-serving-cert\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:31.109611 master-0 kubenswrapper[7642]: I0319 11:54:31.109465 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69dc5c9779-942tz"] Mar 19 11:54:31.168735 master-0 kubenswrapper[7642]: I0319 11:54:31.168660 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"] Mar 19 11:54:31.384833 master-0 kubenswrapper[7642]: I0319 11:54:31.384610 7642 generic.go:334] "Generic (PLEG): container finished" podID="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" 
containerID="63b90693b783d2344a0175569017503f87f560c1b75623dba968fbe7562bf39a" exitCode=0 Mar 19 11:54:31.384833 master-0 kubenswrapper[7642]: I0319 11:54:31.384701 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerDied","Data":"63b90693b783d2344a0175569017503f87f560c1b75623dba968fbe7562bf39a"} Mar 19 11:54:31.385356 master-0 kubenswrapper[7642]: I0319 11:54:31.385333 7642 scope.go:117] "RemoveContainer" containerID="63b90693b783d2344a0175569017503f87f560c1b75623dba968fbe7562bf39a" Mar 19 11:54:31.549556 master-0 kubenswrapper[7642]: I0319 11:54:31.549490 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7fd69bc8f7-bd8js"] Mar 19 11:54:31.586722 master-0 kubenswrapper[7642]: I0319 11:54:31.585520 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b48j\" (UniqueName: \"kubernetes.io/projected/ac8349ad-5063-439f-9785-fa90df81836b-kube-api-access-6b48j\") pod \"controller-manager-69dc5c9779-942tz\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:31.639987 master-0 kubenswrapper[7642]: I0319 11:54:31.639847 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee60c527-0fff-43d5-825d-230971422b38-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:31.639987 master-0 kubenswrapper[7642]: I0319 11:54:31.639880 7642 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee60c527-0fff-43d5-825d-230971422b38-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:31.711799 master-0 kubenswrapper[7642]: I0319 11:54:31.711713 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:32.075053 master-0 kubenswrapper[7642]: I0319 11:54:32.074565 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee60c527-0fff-43d5-825d-230971422b38" path="/var/lib/kubelet/pods/ee60c527-0fff-43d5-825d-230971422b38/volumes" Mar 19 11:54:32.393143 master-0 kubenswrapper[7642]: I0319 11:54:32.392977 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerStarted","Data":"befd94b8f19fd722cf8884718907d3b2c42d5f4a6e300a5487fce3bd4b878dd3"} Mar 19 11:54:33.084472 master-0 kubenswrapper[7642]: I0319 11:54:33.084389 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75"] Mar 19 11:54:33.087137 master-0 kubenswrapper[7642]: I0319 11:54:33.087092 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.088416 master-0 kubenswrapper[7642]: I0319 11:54:33.088343 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69dc5c9779-942tz"] Mar 19 11:54:33.095800 master-0 kubenswrapper[7642]: I0319 11:54:33.091201 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 11:54:33.095800 master-0 kubenswrapper[7642]: I0319 11:54:33.092019 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 11:54:33.098011 master-0 kubenswrapper[7642]: I0319 11:54:33.096539 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 11:54:33.098011 master-0 kubenswrapper[7642]: I0319 11:54:33.097089 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 11:54:33.098011 master-0 kubenswrapper[7642]: I0319 11:54:33.097658 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 11:54:33.177489 master-0 kubenswrapper[7642]: I0319 11:54:33.177365 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-config\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.177489 master-0 kubenswrapper[7642]: I0319 11:54:33.177430 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.178126 master-0 kubenswrapper[7642]: I0319 11:54:33.177969 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fpfl\" (UniqueName: \"kubernetes.io/projected/00d5b0b0-3f0c-4136-9801-631b282999a7-kube-api-access-8fpfl\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.178386 master-0 kubenswrapper[7642]: I0319 11:54:33.178338 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-client-ca\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.280079 master-0 kubenswrapper[7642]: I0319 11:54:33.279997 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-client-ca\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.280522 master-0 kubenswrapper[7642]: I0319 11:54:33.280492 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-config\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " 
pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.280762 master-0 kubenswrapper[7642]: I0319 11:54:33.280733 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.281011 master-0 kubenswrapper[7642]: I0319 11:54:33.280981 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fpfl\" (UniqueName: \"kubernetes.io/projected/00d5b0b0-3f0c-4136-9801-631b282999a7-kube-api-access-8fpfl\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.281191 master-0 kubenswrapper[7642]: I0319 11:54:33.281165 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client\") pod \"apiserver-ff96cdc79-z9zjw\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") " pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" Mar 19 11:54:33.281524 master-0 kubenswrapper[7642]: I0319 11:54:33.281468 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-client-ca\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.281682 master-0 kubenswrapper[7642]: E0319 11:54:33.281614 7642 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret 
"serving-cert" not found Mar 19 11:54:33.281793 master-0 kubenswrapper[7642]: E0319 11:54:33.281703 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert podName:00d5b0b0-3f0c-4136-9801-631b282999a7 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:33.781682062 +0000 UTC m=+31.899122869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert") pod "route-controller-manager-849695ddd4-dgp75" (UID: "00d5b0b0-3f0c-4136-9801-631b282999a7") : secret "serving-cert" not found Mar 19 11:54:33.282179 master-0 kubenswrapper[7642]: E0319 11:54:33.282148 7642 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Mar 19 11:54:33.282401 master-0 kubenswrapper[7642]: E0319 11:54:33.282377 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client podName:d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:41.28234447 +0000 UTC m=+39.399785337 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client") pod "apiserver-ff96cdc79-z9zjw" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26") : secret "etcd-client" not found Mar 19 11:54:33.282538 master-0 kubenswrapper[7642]: I0319 11:54:33.282474 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-config\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.400220 master-0 kubenswrapper[7642]: I0319 11:54:33.400154 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" event={"ID":"ac8349ad-5063-439f-9785-fa90df81836b","Type":"ContainerStarted","Data":"a651f85f636acf31bd26274b37214b661d01e3f97dd8c21150a25f6945b01729"} Mar 19 11:54:33.464018 master-0 kubenswrapper[7642]: I0319 11:54:33.463943 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"] Mar 19 11:54:33.464912 master-0 kubenswrapper[7642]: I0319 11:54:33.464853 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75"] Mar 19 11:54:33.590811 master-0 kubenswrapper[7642]: I0319 11:54:33.590347 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-ff96cdc79-z9zjw"] Mar 19 11:54:33.591420 master-0 kubenswrapper[7642]: E0319 11:54:33.591387 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[etcd-client], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw" podUID="d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" Mar 19 
11:54:33.602972 master-0 kubenswrapper[7642]: I0319 11:54:33.602926 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86458b9cf7-7kkbp"] Mar 19 11:54:33.642697 master-0 kubenswrapper[7642]: I0319 11:54:33.642625 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fpfl\" (UniqueName: \"kubernetes.io/projected/00d5b0b0-3f0c-4136-9801-631b282999a7-kube-api-access-8fpfl\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.692615 master-0 kubenswrapper[7642]: I0319 11:54:33.689921 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"] Mar 19 11:54:33.692615 master-0 kubenswrapper[7642]: I0319 11:54:33.690575 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.692615 master-0 kubenswrapper[7642]: I0319 11:54:33.691673 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3ceff4-e4f6-4149-b31b-bbce53d704be-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:33.692615 master-0 kubenswrapper[7642]: I0319 11:54:33.691711 7642 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3ceff4-e4f6-4149-b31b-bbce53d704be-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:33.698709 master-0 kubenswrapper[7642]: I0319 11:54:33.698692 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 11:54:33.699361 master-0 kubenswrapper[7642]: I0319 11:54:33.698874 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 
11:54:33.699525 master-0 kubenswrapper[7642]: I0319 11:54:33.699019 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 11:54:33.699754 master-0 kubenswrapper[7642]: I0319 11:54:33.699124 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 11:54:33.700166 master-0 kubenswrapper[7642]: I0319 11:54:33.699153 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 11:54:33.700568 master-0 kubenswrapper[7642]: I0319 11:54:33.699178 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 11:54:33.700568 master-0 kubenswrapper[7642]: I0319 11:54:33.699197 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 11:54:33.700568 master-0 kubenswrapper[7642]: I0319 11:54:33.699228 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 11:54:33.707266 master-0 kubenswrapper[7642]: I0319 11:54:33.707247 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"] Mar 19 11:54:33.799682 master-0 kubenswrapper[7642]: I0319 11:54:33.794839 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5qnw\" (UniqueName: \"kubernetes.io/projected/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-kube-api-access-s5qnw\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.799682 master-0 kubenswrapper[7642]: I0319 11:54:33.794910 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-trusted-ca-bundle\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.799682 master-0 kubenswrapper[7642]: I0319 11:54:33.795075 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-serving-ca\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.799682 master-0 kubenswrapper[7642]: I0319 11:54:33.795132 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-dir\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.799682 master-0 kubenswrapper[7642]: I0319 11:54:33.795205 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-policies\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.799682 master-0 kubenswrapper[7642]: I0319 11:54:33.795242 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:33.799682 master-0 
kubenswrapper[7642]: I0319 11:54:33.795287 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.799682 master-0 kubenswrapper[7642]: I0319 11:54:33.795325 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-encryption-config\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.799682 master-0 kubenswrapper[7642]: I0319 11:54:33.795431 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-client\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.799682 master-0 kubenswrapper[7642]: E0319 11:54:33.795782 7642 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 19 11:54:33.799682 master-0 kubenswrapper[7642]: E0319 11:54:33.795884 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert podName:00d5b0b0-3f0c-4136-9801-631b282999a7 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.79586481 +0000 UTC m=+32.913305597 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert") pod "route-controller-manager-849695ddd4-dgp75" (UID: "00d5b0b0-3f0c-4136-9801-631b282999a7") : secret "serving-cert" not found Mar 19 11:54:33.898660 master-0 kubenswrapper[7642]: I0319 11:54:33.898333 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5qnw\" (UniqueName: \"kubernetes.io/projected/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-kube-api-access-s5qnw\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.898660 master-0 kubenswrapper[7642]: I0319 11:54:33.898389 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-trusted-ca-bundle\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.898660 master-0 kubenswrapper[7642]: I0319 11:54:33.898438 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-serving-ca\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.898660 master-0 kubenswrapper[7642]: I0319 11:54:33.898459 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-dir\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.898660 master-0 kubenswrapper[7642]: I0319 
11:54:33.898484 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-policies\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.898660 master-0 kubenswrapper[7642]: I0319 11:54:33.898505 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.898660 master-0 kubenswrapper[7642]: I0319 11:54:33.898534 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-encryption-config\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.898660 master-0 kubenswrapper[7642]: I0319 11:54:33.898583 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-client\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.899116 master-0 kubenswrapper[7642]: I0319 11:54:33.898935 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-dir\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:33.906654 
master-0 kubenswrapper[7642]: E0319 11:54:33.899534 7642 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 11:54:33.906654 master-0 kubenswrapper[7642]: E0319 11:54:33.899745 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert podName:03c3e2ba-8810-48f9-9c77-7597cc2ea6c5 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:34.399678279 +0000 UTC m=+32.517119246 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert") pod "apiserver-5547669f67-7t9wr" (UID: "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5") : secret "serving-cert" not found
Mar 19 11:54:33.906654 master-0 kubenswrapper[7642]: I0319 11:54:33.900154 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-trusted-ca-bundle\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"
Mar 19 11:54:33.906654 master-0 kubenswrapper[7642]: I0319 11:54:33.900781 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-serving-ca\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"
Mar 19 11:54:33.906654 master-0 kubenswrapper[7642]: I0319 11:54:33.901088 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-policies\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"
Mar 19 11:54:33.906654 master-0 kubenswrapper[7642]: I0319 11:54:33.905148 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-client\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"
Mar 19 11:54:33.914998 master-0 kubenswrapper[7642]: I0319 11:54:33.910055 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-encryption-config\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"
Mar 19 11:54:33.948849 master-0 kubenswrapper[7642]: I0319 11:54:33.947506 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5qnw\" (UniqueName: \"kubernetes.io/projected/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-kube-api-access-s5qnw\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"
Mar 19 11:54:34.076508 master-0 kubenswrapper[7642]: I0319 11:54:34.076434 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3ceff4-e4f6-4149-b31b-bbce53d704be" path="/var/lib/kubelet/pods/9c3ceff4-e4f6-4149-b31b-bbce53d704be/volumes"
Mar 19 11:54:34.404200 master-0 kubenswrapper[7642]: I0319 11:54:34.404156 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:34.406239 master-0 kubenswrapper[7642]: I0319 11:54:34.406046 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"
Mar 19 11:54:34.406328 master-0 kubenswrapper[7642]: E0319 11:54:34.406249 7642 secret.go:189] Couldn't get secret openshift-oauth-apiserver/serving-cert: secret "serving-cert" not found
Mar 19 11:54:34.406328 master-0 kubenswrapper[7642]: E0319 11:54:34.406319 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert podName:03c3e2ba-8810-48f9-9c77-7597cc2ea6c5 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:35.406300737 +0000 UTC m=+33.523741524 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert") pod "apiserver-5547669f67-7t9wr" (UID: "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5") : secret "serving-cert" not found
Mar 19 11:54:34.411824 master-0 kubenswrapper[7642]: I0319 11:54:34.411761 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:34.507265 master-0 kubenswrapper[7642]: I0319 11:54:34.507205 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-node-pullsecrets\") pod \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") "
Mar 19 11:54:34.507265 master-0 kubenswrapper[7642]: I0319 11:54:34.507265 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-trusted-ca-bundle\") pod \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") "
Mar 19 11:54:34.507586 master-0 kubenswrapper[7642]: I0319 11:54:34.507345 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-config\") pod \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") "
Mar 19 11:54:34.507586 master-0 kubenswrapper[7642]: I0319 11:54:34.507396 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit-dir\") pod \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") "
Mar 19 11:54:34.507586 master-0 kubenswrapper[7642]: I0319 11:54:34.507440 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdf5k\" (UniqueName: \"kubernetes.io/projected/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-kube-api-access-vdf5k\") pod \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") "
Mar 19 11:54:34.507586 master-0 kubenswrapper[7642]: I0319 11:54:34.507477 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-image-import-ca\") pod \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") "
Mar 19 11:54:34.507586 master-0 kubenswrapper[7642]: I0319 11:54:34.507506 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-encryption-config\") pod \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") "
Mar 19 11:54:34.507586 master-0 kubenswrapper[7642]: I0319 11:54:34.507544 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit\") pod \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") "
Mar 19 11:54:34.507586 master-0 kubenswrapper[7642]: I0319 11:54:34.507584 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-serving-cert\") pod \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") "
Mar 19 11:54:34.507812 master-0 kubenswrapper[7642]: I0319 11:54:34.507662 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca\") pod \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\" (UID: \"d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26\") "
Mar 19 11:54:34.507812 master-0 kubenswrapper[7642]: I0319 11:54:34.507772 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:54:34.508141 master-0 kubenswrapper[7642]: I0319 11:54:34.508103 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:54:34.508556 master-0 kubenswrapper[7642]: I0319 11:54:34.508515 7642 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-node-pullsecrets\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:34.508590 master-0 kubenswrapper[7642]: I0319 11:54:34.508560 7642 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:34.508698 master-0 kubenswrapper[7642]: I0319 11:54:34.508608 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-config" (OuterVolumeSpecName: "config") pod "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:34.508760 master-0 kubenswrapper[7642]: I0319 11:54:34.508625 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:34.509335 master-0 kubenswrapper[7642]: I0319 11:54:34.509308 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:34.509335 master-0 kubenswrapper[7642]: I0319 11:54:34.509322 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit" (OuterVolumeSpecName: "audit") pod "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:34.509813 master-0 kubenswrapper[7642]: I0319 11:54:34.509742 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:34.512831 master-0 kubenswrapper[7642]: I0319 11:54:34.512779 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:54:34.513300 master-0 kubenswrapper[7642]: I0319 11:54:34.513263 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-kube-api-access-vdf5k" (OuterVolumeSpecName: "kube-api-access-vdf5k") pod "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26"). InnerVolumeSpecName "kube-api-access-vdf5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:54:34.516687 master-0 kubenswrapper[7642]: I0319 11:54:34.515940 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" (UID: "d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:54:34.609682 master-0 kubenswrapper[7642]: I0319 11:54:34.609546 7642 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-serving-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:34.609682 master-0 kubenswrapper[7642]: I0319 11:54:34.609598 7642 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:34.609682 master-0 kubenswrapper[7642]: I0319 11:54:34.609613 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:34.609682 master-0 kubenswrapper[7642]: I0319 11:54:34.609626 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdf5k\" (UniqueName: \"kubernetes.io/projected/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-kube-api-access-vdf5k\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:34.609682 master-0 kubenswrapper[7642]: I0319 11:54:34.609651 7642 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-image-import-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:34.609682 master-0 kubenswrapper[7642]: I0319 11:54:34.609662 7642 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-encryption-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:34.609682 master-0 kubenswrapper[7642]: I0319 11:54:34.609672 7642 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-audit\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:34.609682 master-0 kubenswrapper[7642]: I0319 11:54:34.609681 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:34.812540 master-0 kubenswrapper[7642]: I0319 11:54:34.812390 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75"
Mar 19 11:54:34.812782 master-0 kubenswrapper[7642]: E0319 11:54:34.812763 7642 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 19 11:54:34.812863 master-0 kubenswrapper[7642]: E0319 11:54:34.812835 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert podName:00d5b0b0-3f0c-4136-9801-631b282999a7 nodeName:}" failed. No retries permitted until 2026-03-19 11:54:36.812813359 +0000 UTC m=+34.930254146 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert") pod "route-controller-manager-849695ddd4-dgp75" (UID: "00d5b0b0-3f0c-4136-9801-631b282999a7") : secret "serving-cert" not found
Mar 19 11:54:34.913492 master-0 kubenswrapper[7642]: I0319 11:54:34.913424 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:34.913492 master-0 kubenswrapper[7642]: I0319 11:54:34.913482 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:34.917996 master-0 kubenswrapper[7642]: I0319 11:54:34.917959 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:34.967701 master-0 kubenswrapper[7642]: I0319 11:54:34.943446 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.018837 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.018967 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.019000 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.019032 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.019114 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.019190 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.019230 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.019262 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.019300 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.019880 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.019929 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.019964 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:54:35.020112 master-0 kubenswrapper[7642]: I0319 11:54:35.020011 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:54:35.021403 master-0 kubenswrapper[7642]: E0319 11:54:35.021345 7642 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 19 11:54:35.021517 master-0 kubenswrapper[7642]: E0319 11:54:35.021427 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert podName:94fa5656-0561-45ab-9ab1-a6b96b8a47d4 nodeName:}" failed. No retries permitted until 2026-03-19 11:55:07.021402984 +0000 UTC m=+65.138843781 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert") pod "package-server-manager-7b95f86987-ftz5b" (UID: "94fa5656-0561-45ab-9ab1-a6b96b8a47d4") : secret "package-server-manager-serving-cert" not found
Mar 19 11:54:35.026545 master-0 kubenswrapper[7642]: I0319 11:54:35.026356 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:35.026545 master-0 kubenswrapper[7642]: I0319 11:54:35.026421 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:54:35.026545 master-0 kubenswrapper[7642]: I0319 11:54:35.026447 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:54:35.026545 master-0 kubenswrapper[7642]: I0319 11:54:35.026487 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:54:35.026836 master-0 kubenswrapper[7642]: I0319 11:54:35.026806 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:35.026898 master-0 kubenswrapper[7642]: I0319 11:54:35.026811 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:54:35.026898 master-0 kubenswrapper[7642]: I0319 11:54:35.026811 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:54:35.027113 master-0 kubenswrapper[7642]: I0319 11:54:35.026817 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"multus-admission-controller-5dbbb8b86f-z7wgn\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") " pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:54:35.028057 master-0 kubenswrapper[7642]: I0319 11:54:35.028015 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:54:35.028479 master-0 kubenswrapper[7642]: I0319 11:54:35.028441 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"cluster-version-operator-56d8475767-nvbgc\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") " pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:54:35.028532 master-0 kubenswrapper[7642]: I0319 11:54:35.028469 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:54:35.031684 master-0 kubenswrapper[7642]: I0319 11:54:35.031624 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:54:35.128963 master-0 kubenswrapper[7642]: I0319 11:54:35.128790 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 11:54:35.128963 master-0 kubenswrapper[7642]: I0319 11:54:35.128919 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:54:35.129205 master-0 kubenswrapper[7642]: I0319 11:54:35.128953 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 11:54:35.132514 master-0 kubenswrapper[7642]: I0319 11:54:35.132464 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:54:35.141311 master-0 kubenswrapper[7642]: I0319 11:54:35.139130 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:54:35.141311 master-0 kubenswrapper[7642]: I0319 11:54:35.140275 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:35.141311 master-0 kubenswrapper[7642]: I0319 11:54:35.140347 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 11:54:35.141311 master-0 kubenswrapper[7642]: I0319 11:54:35.140788 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 11:54:35.142035 master-0 kubenswrapper[7642]: I0319 11:54:35.142004 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 11:54:35.146324 master-0 kubenswrapper[7642]: I0319 11:54:35.146279 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 11:54:35.146414 master-0 kubenswrapper[7642]: I0319 11:54:35.146330 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:35.146414 master-0 kubenswrapper[7642]: I0319 11:54:35.146361 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 11:54:35.378356 master-0 kubenswrapper[7642]: I0319 11:54:35.377874 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"]
Mar 19 11:54:35.409053 master-0 kubenswrapper[7642]: I0319 11:54:35.408991 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" event={"ID":"60f846d9-1b3d-4b9d-b3e0-90a238e02640","Type":"ContainerStarted","Data":"8e4e3b487bb0051c9d1f17e8f20f2f36c1feaa2ba16b77ca2fa3242c79022839"}
Mar 19 11:54:35.409053 master-0 kubenswrapper[7642]: I0319 11:54:35.409018 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-ff96cdc79-z9zjw"
Mar 19 11:54:35.430447 master-0 kubenswrapper[7642]: I0319 11:54:35.430387 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"
Mar 19 11:54:35.665307 master-0 kubenswrapper[7642]: I0319 11:54:35.665259 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert\") pod \"apiserver-5547669f67-7t9wr\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"
Mar 19 11:54:35.817052 master-0 kubenswrapper[7642]: I0319 11:54:35.816806 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-ff96cdc79-z9zjw"]
Mar 19 11:54:35.817323 master-0 kubenswrapper[7642]: I0319 11:54:35.817088 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-ff96cdc79-z9zjw"]
Mar 19 11:54:35.827452 master-0 kubenswrapper[7642]: I0319 11:54:35.827402 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"
Mar 19 11:54:35.829019 master-0 kubenswrapper[7642]: I0319 11:54:35.828788 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"]
Mar 19 11:54:35.837268 master-0 kubenswrapper[7642]: I0319 11:54:35.837213 7642 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26-etcd-client\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:35.859330 master-0 kubenswrapper[7642]: I0319 11:54:35.858686 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"]
Mar 19 11:54:35.872544 master-0 kubenswrapper[7642]: I0319 11:54:35.872470 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"]
Mar 19 11:54:35.955529 master-0 kubenswrapper[7642]: I0319 11:54:35.955004 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"]
Mar 19 11:54:35.989921 master-0 kubenswrapper[7642]: I0319 11:54:35.989856 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"]
Mar 19 11:54:36.074954 master-0 kubenswrapper[7642]: I0319 11:54:36.074868 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26" path="/var/lib/kubelet/pods/d611cd35-8f2c-4fbc-8f9d-ebe9223e1a26/volumes"
Mar 19 11:54:36.101621 master-0 kubenswrapper[7642]: I0319 11:54:36.100901 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"]
Mar 19 11:54:36.102976 master-0 kubenswrapper[7642]: I0319 11:54:36.102263 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"]
Mar 19 11:54:36.103548 master-0 kubenswrapper[7642]: I0319 11:54:36.103512 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nkd77"]
Mar 19 11:54:36.165216 master-0 kubenswrapper[7642]: I0319 11:54:36.162352 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"]
Mar 19 11:54:36.165216 master-0 kubenswrapper[7642]: I0319 11:54:36.164299 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"]
Mar 19 11:54:36.648611 master-0 kubenswrapper[7642]: I0319 11:54:36.647720 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-66dcf44d9d-46w4r"]
Mar 19 11:54:36.649487 master-0 kubenswrapper[7642]: I0319 11:54:36.648699 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 11:54:36.653677 master-0 kubenswrapper[7642]: I0319 11:54:36.652920 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 11:54:36.653789 master-0 kubenswrapper[7642]: I0319 11:54:36.653710 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 11:54:36.654686 master-0 kubenswrapper[7642]: I0319 11:54:36.654005 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 11:54:36.654686 master-0 kubenswrapper[7642]: I0319 11:54:36.654141 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 11:54:36.654686 master-0 kubenswrapper[7642]: I0319 11:54:36.654360 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 11:54:36.654686 master-0
kubenswrapper[7642]: I0319 11:54:36.654371 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 11:54:36.657347 master-0 kubenswrapper[7642]: I0319 11:54:36.654788 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 11:54:36.657347 master-0 kubenswrapper[7642]: I0319 11:54:36.655410 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 11:54:36.657745 master-0 kubenswrapper[7642]: I0319 11:54:36.657707 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 11:54:36.666570 master-0 kubenswrapper[7642]: I0319 11:54:36.660231 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 11:54:36.669684 master-0 kubenswrapper[7642]: I0319 11:54:36.669628 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-66dcf44d9d-46w4r"] Mar 19 11:54:36.756993 master-0 kubenswrapper[7642]: I0319 11:54:36.756922 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-audit\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.756993 master-0 kubenswrapper[7642]: I0319 11:54:36.756970 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-client\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.756993 master-0 kubenswrapper[7642]: I0319 11:54:36.757006 7642 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m4gg\" (UniqueName: \"kubernetes.io/projected/7af28c70-3ee0-438b-8655-95bc57deaa05-kube-api-access-2m4gg\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.757711 master-0 kubenswrapper[7642]: I0319 11:54:36.757029 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-config\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.757711 master-0 kubenswrapper[7642]: I0319 11:54:36.757048 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-image-import-ca\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.757711 master-0 kubenswrapper[7642]: I0319 11:54:36.757068 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-node-pullsecrets\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.757711 master-0 kubenswrapper[7642]: I0319 11:54:36.757114 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-serving-ca\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: 
\"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.757711 master-0 kubenswrapper[7642]: I0319 11:54:36.757138 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-serving-cert\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.757711 master-0 kubenswrapper[7642]: I0319 11:54:36.757211 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-audit-dir\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.757711 master-0 kubenswrapper[7642]: I0319 11:54:36.757298 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-trusted-ca-bundle\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.757711 master-0 kubenswrapper[7642]: I0319 11:54:36.757324 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-encryption-config\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859206 master-0 kubenswrapper[7642]: I0319 11:54:36.859119 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-client\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859455 master-0 kubenswrapper[7642]: I0319 11:54:36.859214 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m4gg\" (UniqueName: \"kubernetes.io/projected/7af28c70-3ee0-438b-8655-95bc57deaa05-kube-api-access-2m4gg\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859455 master-0 kubenswrapper[7642]: I0319 11:54:36.859261 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-config\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859455 master-0 kubenswrapper[7642]: I0319 11:54:36.859301 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-image-import-ca\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859455 master-0 kubenswrapper[7642]: I0319 11:54:36.859356 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-node-pullsecrets\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859455 master-0 kubenswrapper[7642]: I0319 11:54:36.859389 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-serving-ca\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859455 master-0 kubenswrapper[7642]: I0319 11:54:36.859439 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-serving-cert\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859713 master-0 kubenswrapper[7642]: I0319 11:54:36.859480 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-audit-dir\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859713 master-0 kubenswrapper[7642]: I0319 11:54:36.859520 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-trusted-ca-bundle\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859713 master-0 kubenswrapper[7642]: I0319 11:54:36.859561 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-encryption-config\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.859713 master-0 kubenswrapper[7642]: I0319 11:54:36.859600 7642 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:36.859713 master-0 kubenswrapper[7642]: I0319 11:54:36.859659 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-audit\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.861119 master-0 kubenswrapper[7642]: I0319 11:54:36.861071 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-audit\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.861396 master-0 kubenswrapper[7642]: I0319 11:54:36.861346 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-serving-ca\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.862370 master-0 kubenswrapper[7642]: I0319 11:54:36.861644 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-audit-dir\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.862370 master-0 kubenswrapper[7642]: I0319 11:54:36.861797 7642 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-node-pullsecrets\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.862370 master-0 kubenswrapper[7642]: I0319 11:54:36.862246 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-image-import-ca\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.862865 master-0 kubenswrapper[7642]: I0319 11:54:36.862838 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-config\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.863284 master-0 kubenswrapper[7642]: I0319 11:54:36.863232 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-trusted-ca-bundle\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.869227 master-0 kubenswrapper[7642]: I0319 11:54:36.868760 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert\") pod \"route-controller-manager-849695ddd4-dgp75\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:36.869227 master-0 
kubenswrapper[7642]: I0319 11:54:36.869169 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-client\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.869465 master-0 kubenswrapper[7642]: I0319 11:54:36.869247 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-serving-cert\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.887255 master-0 kubenswrapper[7642]: I0319 11:54:36.882660 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-encryption-config\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.892517 master-0 kubenswrapper[7642]: I0319 11:54:36.892482 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m4gg\" (UniqueName: \"kubernetes.io/projected/7af28c70-3ee0-438b-8655-95bc57deaa05-kube-api-access-2m4gg\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:36.992627 master-0 kubenswrapper[7642]: I0319 11:54:36.992460 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 11:54:37.016452 master-0 kubenswrapper[7642]: W0319 11:54:37.016174 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ae2d717_506b_4d8e_803b_689c1cc3557c.slice/crio-6e6934bb1122fb83174516e208b7a3ad1b4497aff55eda2c80c1a4c886b31150 WatchSource:0}: Error finding container 6e6934bb1122fb83174516e208b7a3ad1b4497aff55eda2c80c1a4c886b31150: Status 404 returned error can't find the container with id 6e6934bb1122fb83174516e208b7a3ad1b4497aff55eda2c80c1a4c886b31150 Mar 19 11:54:37.019951 master-0 kubenswrapper[7642]: W0319 11:54:37.019898 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38df702_a256_4f83_90bb_0c0e477a2ce6.slice/crio-ea9e9d1680b5aa838fb75a46daa08b37fa0baf3eb1a7aa70313d288a46e0ae6c WatchSource:0}: Error finding container ea9e9d1680b5aa838fb75a46daa08b37fa0baf3eb1a7aa70313d288a46e0ae6c: Status 404 returned error can't find the container with id ea9e9d1680b5aa838fb75a46daa08b37fa0baf3eb1a7aa70313d288a46e0ae6c Mar 19 11:54:37.020907 master-0 kubenswrapper[7642]: W0319 11:54:37.020764 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e1b71a7_c9fd_4f03_8d0c_f092dac80989.slice/crio-e5b8532c133a52bef4f3c34e53a76d2b97e7bbb39e5937c556c2dbe0cf940780 WatchSource:0}: Error finding container e5b8532c133a52bef4f3c34e53a76d2b97e7bbb39e5937c556c2dbe0cf940780: Status 404 returned error can't find the container with id e5b8532c133a52bef4f3c34e53a76d2b97e7bbb39e5937c556c2dbe0cf940780 Mar 19 11:54:37.033589 master-0 kubenswrapper[7642]: W0319 11:54:37.033460 7642 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39f479b3_45ff_43f9_ae85_083554a54918.slice/crio-d405214a0101e6f5383ea0ec9dd8d0328f92f63d0275413db715a759bd43a1b8 WatchSource:0}: Error finding container d405214a0101e6f5383ea0ec9dd8d0328f92f63d0275413db715a759bd43a1b8: Status 404 returned error can't find the container with id d405214a0101e6f5383ea0ec9dd8d0328f92f63d0275413db715a759bd43a1b8 Mar 19 11:54:37.035902 master-0 kubenswrapper[7642]: W0319 11:54:37.035865 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f6e54f_a750_4c91_b4f5_049275d33cfb.slice/crio-768bbdfdb22a3faa73743433476fcca355077a35b64380dacae14d6ec58d72ab WatchSource:0}: Error finding container 768bbdfdb22a3faa73743433476fcca355077a35b64380dacae14d6ec58d72ab: Status 404 returned error can't find the container with id 768bbdfdb22a3faa73743433476fcca355077a35b64380dacae14d6ec58d72ab Mar 19 11:54:37.038541 master-0 kubenswrapper[7642]: W0319 11:54:37.038509 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b46c6b4_f701_4a7f_bf74_90afe14ad89a.slice/crio-fb94aad616f605f0c5b2b89174012856582d923dd7eb37448d29cf2ee0f920af WatchSource:0}: Error finding container fb94aad616f605f0c5b2b89174012856582d923dd7eb37448d29cf2ee0f920af: Status 404 returned error can't find the container with id fb94aad616f605f0c5b2b89174012856582d923dd7eb37448d29cf2ee0f920af Mar 19 11:54:37.043182 master-0 kubenswrapper[7642]: W0319 11:54:37.043140 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e8c4676_05af_4478_a44b_1bdcbefdea0f.slice/crio-c88934bb050c8e715da2068469a1a7b6e480c5b3698e644706a803df72d12f5f WatchSource:0}: Error finding container c88934bb050c8e715da2068469a1a7b6e480c5b3698e644706a803df72d12f5f: Status 404 returned error can't find the container with id 
c88934bb050c8e715da2068469a1a7b6e480c5b3698e644706a803df72d12f5f Mar 19 11:54:37.044447 master-0 kubenswrapper[7642]: W0319 11:54:37.044150 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode90c10f6_36d9_44bd_a4c6_174f12135434.slice/crio-cd89b54332e25652976ac12f30441cd06118b9d3055fbb57b6418f135e33839a WatchSource:0}: Error finding container cd89b54332e25652976ac12f30441cd06118b9d3055fbb57b6418f135e33839a: Status 404 returned error can't find the container with id cd89b54332e25652976ac12f30441cd06118b9d3055fbb57b6418f135e33839a Mar 19 11:54:37.045528 master-0 kubenswrapper[7642]: I0319 11:54:37.045494 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:37.047400 master-0 kubenswrapper[7642]: W0319 11:54:37.047171 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131e2573_ef74_4963_bdcf_901310e7a486.slice/crio-6f9d7a4e397f58614971bf45c37c0758f1dd839eda26eeb6acda239cb4547bd3 WatchSource:0}: Error finding container 6f9d7a4e397f58614971bf45c37c0758f1dd839eda26eeb6acda239cb4547bd3: Status 404 returned error can't find the container with id 6f9d7a4e397f58614971bf45c37c0758f1dd839eda26eeb6acda239cb4547bd3 Mar 19 11:54:37.049133 master-0 kubenswrapper[7642]: W0319 11:54:37.049090 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07804be1_d37c_474e_a179_01ac541d9705.slice/crio-8db1413c53086a5c6db2b83441d84f9f315ce4cbb909fde99b5104ffbf2df8f1 WatchSource:0}: Error finding container 8db1413c53086a5c6db2b83441d84f9f315ce4cbb909fde99b5104ffbf2df8f1: Status 404 returned error can't find the container with id 8db1413c53086a5c6db2b83441d84f9f315ce4cbb909fde99b5104ffbf2df8f1 Mar 19 11:54:37.074742 master-0 
kubenswrapper[7642]: E0319 11:54:37.074565 7642 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cluster-image-registry-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89,Command:[],Args:[--files=/var/run/configmaps/trusted-ca/tls-ca-bundle.pem --files=/etc/secrets/tls.crt --files=/etc/secrets/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:60000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.35,ValueFrom:nil,},EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:cluster-image-registry-operator,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89,ValueFrom:nil,},EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:60637f6eed5e9adc3af1863d0ef311c74b9109f00f464f9ce6cdfd21d0ee4608,ValueFrom:nil,},EnvVar{Name:IMAGE_PRUNER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55,ValueFrom:nil,},EnvVar{Name:AZURE_ENVIRONMENT_FILEPATH,Value:/tmp/azurestackcloud.json,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.35,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:trusted-ca,ReadOnly:false,MountPath:/var/run/configmaps/trusted-ca/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:image-registry-operator-tls,ReadOnly:false,MountPath:/etc/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:bound-sa-token,ReadOnly:true,MountPath:/var/run/secrets/openshift/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwp24,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cluster-image-registry-operator-5549dc66cb-5twg4_openshift-image-registry(07804be1-d37c-474e-a179-01ac541d9705): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 11:54:37.076856 master-0 kubenswrapper[7642]: E0319 11:54:37.076804 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-image-registry-operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" podUID="07804be1-d37c-474e-a179-01ac541d9705" Mar 19 11:54:37.320308 master-0 kubenswrapper[7642]: I0319 11:54:37.317559 7642 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"] Mar 19 11:54:37.341357 master-0 kubenswrapper[7642]: W0319 11:54:37.339356 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03c3e2ba_8810_48f9_9c77_7597cc2ea6c5.slice/crio-0256e346ceb50747334fd7b0822c0dd8b7fa9973de10430918b7fc4f8df2e700 WatchSource:0}: Error finding container 0256e346ceb50747334fd7b0822c0dd8b7fa9973de10430918b7fc4f8df2e700: Status 404 returned error can't find the container with id 0256e346ceb50747334fd7b0822c0dd8b7fa9973de10430918b7fc4f8df2e700 Mar 19 11:54:37.376374 master-0 kubenswrapper[7642]: I0319 11:54:37.376301 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75"] Mar 19 11:54:37.390036 master-0 kubenswrapper[7642]: W0319 11:54:37.389319 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00d5b0b0_3f0c_4136_9801_631b282999a7.slice/crio-8223190fa0d9cc91deaaba666f0a19e363838137553ad3e78caf3ebe08515ebd WatchSource:0}: Error finding container 8223190fa0d9cc91deaaba666f0a19e363838137553ad3e78caf3ebe08515ebd: Status 404 returned error can't find the container with id 8223190fa0d9cc91deaaba666f0a19e363838137553ad3e78caf3ebe08515ebd Mar 19 11:54:37.393199 master-0 kubenswrapper[7642]: E0319 11:54:37.391532 7642 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:route-controller-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b,Command:[route-controller-manager start],Args:[--config=/var/run/configmaps/config/config.yaml 
-v=2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{104857600 0} {} 100Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:client-ca,ReadOnly:false,MountPath:/var/run/configmaps/client-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8fpfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000580000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod route-controller-manager-849695ddd4-dgp75_openshift-route-controller-manager(00d5b0b0-3f0c-4136-9801-631b282999a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 11:54:37.393199 master-0 kubenswrapper[7642]: E0319 11:54:37.392969 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"route-controller-manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" podUID="00d5b0b0-3f0c-4136-9801-631b282999a7" Mar 19 11:54:37.412362 master-0 kubenswrapper[7642]: I0319 11:54:37.412296 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-66dcf44d9d-46w4r"] Mar 19 11:54:37.426680 master-0 kubenswrapper[7642]: I0319 11:54:37.422302 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" event={"ID":"ac8349ad-5063-439f-9785-fa90df81836b","Type":"ContainerStarted","Data":"10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a"} Mar 19 11:54:37.426680 master-0 kubenswrapper[7642]: I0319 11:54:37.423707 7642 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:37.429867 master-0 kubenswrapper[7642]: I0319 11:54:37.429692 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nkd77" event={"ID":"3e1b71a7-c9fd-4f03-8d0c-f092dac80989","Type":"ContainerStarted","Data":"e5b8532c133a52bef4f3c34e53a76d2b97e7bbb39e5937c556c2dbe0cf940780"} Mar 19 11:54:37.429867 master-0 kubenswrapper[7642]: I0319 11:54:37.429801 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:37.431709 master-0 kubenswrapper[7642]: W0319 11:54:37.431672 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af28c70_3ee0_438b_8655_95bc57deaa05.slice/crio-76d682fd002c8935fad25bd3dbe4a20bd96e06027fc3726ac0709746cb6820f3 WatchSource:0}: Error finding container 76d682fd002c8935fad25bd3dbe4a20bd96e06027fc3726ac0709746cb6820f3: Status 404 returned error can't find the container with id 76d682fd002c8935fad25bd3dbe4a20bd96e06027fc3726ac0709746cb6820f3 Mar 19 11:54:37.432032 master-0 kubenswrapper[7642]: I0319 11:54:37.431978 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" event={"ID":"5ae2d717-506b-4d8e-803b-689c1cc3557c","Type":"ContainerStarted","Data":"6e6934bb1122fb83174516e208b7a3ad1b4497aff55eda2c80c1a4c886b31150"} Mar 19 11:54:37.433465 master-0 kubenswrapper[7642]: I0319 11:54:37.433421 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" event={"ID":"33f6e54f-a750-4c91-b4f5-049275d33cfb","Type":"ContainerStarted","Data":"768bbdfdb22a3faa73743433476fcca355077a35b64380dacae14d6ec58d72ab"} Mar 19 11:54:37.452886 master-0 kubenswrapper[7642]: I0319 
11:54:37.452767 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" event={"ID":"00d5b0b0-3f0c-4136-9801-631b282999a7","Type":"ContainerStarted","Data":"8223190fa0d9cc91deaaba666f0a19e363838137553ad3e78caf3ebe08515ebd"} Mar 19 11:54:37.454809 master-0 kubenswrapper[7642]: E0319 11:54:37.453957 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"route-controller-manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"\"" pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" podUID="00d5b0b0-3f0c-4136-9801-631b282999a7" Mar 19 11:54:37.456049 master-0 kubenswrapper[7642]: I0319 11:54:37.455854 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" event={"ID":"5b46c6b4-f701-4a7f-bf74-90afe14ad89a","Type":"ContainerStarted","Data":"fb94aad616f605f0c5b2b89174012856582d923dd7eb37448d29cf2ee0f920af"} Mar 19 11:54:37.460392 master-0 kubenswrapper[7642]: I0319 11:54:37.460319 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" podStartSLOduration=6.4655078249999995 podStartE2EDuration="10.46029811s" podCreationTimestamp="2026-03-19 11:54:27 +0000 UTC" firstStartedPulling="2026-03-19 11:54:33.115073419 +0000 UTC m=+31.232514236" lastFinishedPulling="2026-03-19 11:54:37.109863734 +0000 UTC m=+35.227304521" observedRunningTime="2026-03-19 11:54:37.456825693 +0000 UTC m=+35.574266480" watchObservedRunningTime="2026-03-19 11:54:37.46029811 +0000 UTC m=+35.577738897" Mar 19 11:54:37.466072 master-0 kubenswrapper[7642]: I0319 11:54:37.465955 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" event={"ID":"07804be1-d37c-474e-a179-01ac541d9705","Type":"ContainerStarted","Data":"8db1413c53086a5c6db2b83441d84f9f315ce4cbb909fde99b5104ffbf2df8f1"} Mar 19 11:54:37.495707 master-0 kubenswrapper[7642]: E0319 11:54:37.495628 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-image-registry-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"\"" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" podUID="07804be1-d37c-474e-a179-01ac541d9705" Mar 19 11:54:37.497623 master-0 kubenswrapper[7642]: I0319 11:54:37.497234 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerStarted","Data":"ea9e9d1680b5aa838fb75a46daa08b37fa0baf3eb1a7aa70313d288a46e0ae6c"} Mar 19 11:54:37.499888 master-0 kubenswrapper[7642]: I0319 11:54:37.499751 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerStarted","Data":"cd89b54332e25652976ac12f30441cd06118b9d3055fbb57b6418f135e33839a"} Mar 19 11:54:37.502383 master-0 kubenswrapper[7642]: I0319 11:54:37.502326 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" event={"ID":"6e8c4676-05af-4478-a44b-1bdcbefdea0f","Type":"ContainerStarted","Data":"c88934bb050c8e715da2068469a1a7b6e480c5b3698e644706a803df72d12f5f"} Mar 19 11:54:37.505095 master-0 kubenswrapper[7642]: I0319 11:54:37.504498 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" event={"ID":"78d3b2df-3d81-413e-b039-80dc20111eae","Type":"ContainerStarted","Data":"b8b47e7cae81e2e8fc5d1fdd94e1150be80c99269a023622d4d29934167c52d2"} Mar 19 11:54:37.514252 master-0 kubenswrapper[7642]: I0319 11:54:37.513689 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" event={"ID":"131e2573-ef74-4963-bdcf-901310e7a486","Type":"ContainerStarted","Data":"6f9d7a4e397f58614971bf45c37c0758f1dd839eda26eeb6acda239cb4547bd3"} Mar 19 11:54:37.520851 master-0 kubenswrapper[7642]: I0319 11:54:37.519364 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" event={"ID":"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5","Type":"ContainerStarted","Data":"0256e346ceb50747334fd7b0822c0dd8b7fa9973de10430918b7fc4f8df2e700"} Mar 19 11:54:37.539458 master-0 kubenswrapper[7642]: I0319 11:54:37.539392 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" event={"ID":"39f479b3-45ff-43f9-ae85-083554a54918","Type":"ContainerStarted","Data":"d405214a0101e6f5383ea0ec9dd8d0328f92f63d0275413db715a759bd43a1b8"} Mar 19 11:54:38.546873 master-0 kubenswrapper[7642]: I0319 11:54:38.546794 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" event={"ID":"7af28c70-3ee0-438b-8655-95bc57deaa05","Type":"ContainerStarted","Data":"76d682fd002c8935fad25bd3dbe4a20bd96e06027fc3726ac0709746cb6820f3"} Mar 19 11:54:38.548868 master-0 kubenswrapper[7642]: E0319 11:54:38.548830 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"route-controller-manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"\"" 
pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" podUID="00d5b0b0-3f0c-4136-9801-631b282999a7" Mar 19 11:54:38.549483 master-0 kubenswrapper[7642]: E0319 11:54:38.549425 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-image-registry-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"\"" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" podUID="07804be1-d37c-474e-a179-01ac541d9705" Mar 19 11:54:39.356443 master-0 kubenswrapper[7642]: I0319 11:54:39.356227 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:54:39.357902 master-0 kubenswrapper[7642]: I0319 11:54:39.356504 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="6660a89a-cb33-4e20-aa58-93c4838e44c3" containerName="installer" containerID="cri-o://5db8b9dc9974fa7fae69b5ed92758705aaeaec268deaa769d2d9d84f72d9569a" gracePeriod=30 Mar 19 11:54:39.933549 master-0 kubenswrapper[7642]: I0319 11:54:39.928406 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"] Mar 19 11:54:41.853926 master-0 kubenswrapper[7642]: I0319 11:54:41.853741 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:54:41.855606 master-0 kubenswrapper[7642]: I0319 11:54:41.855572 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:41.861783 master-0 kubenswrapper[7642]: I0319 11:54:41.861719 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:54:41.893970 master-0 kubenswrapper[7642]: I0319 11:54:41.893906 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/074a3b28-725c-4b11-b445-c5cf6511ffc8-kube-api-access\") pod \"installer-2-master-0\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:41.894107 master-0 kubenswrapper[7642]: I0319 11:54:41.893982 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:41.894107 master-0 kubenswrapper[7642]: I0319 11:54:41.894085 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-var-lock\") pod \"installer-2-master-0\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:41.994897 master-0 kubenswrapper[7642]: I0319 11:54:41.994842 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-var-lock\") pod \"installer-2-master-0\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:41.994897 master-0 kubenswrapper[7642]: I0319 11:54:41.994912 7642 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/074a3b28-725c-4b11-b445-c5cf6511ffc8-kube-api-access\") pod \"installer-2-master-0\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:41.995200 master-0 kubenswrapper[7642]: I0319 11:54:41.995047 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-var-lock\") pod \"installer-2-master-0\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:41.995246 master-0 kubenswrapper[7642]: I0319 11:54:41.995182 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:41.995301 master-0 kubenswrapper[7642]: I0319 11:54:41.995271 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:42.019523 master-0 kubenswrapper[7642]: I0319 11:54:42.019451 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/074a3b28-725c-4b11-b445-c5cf6511ffc8-kube-api-access\") pod \"installer-2-master-0\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:42.180301 master-0 kubenswrapper[7642]: I0319 11:54:42.180255 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:43.098903 master-0 kubenswrapper[7642]: I0319 11:54:43.097803 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 11:54:43.103340 master-0 kubenswrapper[7642]: I0319 11:54:43.103290 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.103955 master-0 kubenswrapper[7642]: I0319 11:54:43.103907 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 11:54:43.110092 master-0 kubenswrapper[7642]: I0319 11:54:43.107745 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 19 11:54:43.209475 master-0 kubenswrapper[7642]: I0319 11:54:43.209413 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.209758 master-0 kubenswrapper[7642]: I0319 11:54:43.209497 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kube-api-access\") pod \"installer-1-master-0\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.209758 master-0 kubenswrapper[7642]: I0319 11:54:43.209530 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-var-lock\") pod \"installer-1-master-0\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " 
pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.215141 master-0 kubenswrapper[7642]: I0319 11:54:43.215112 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 11:54:43.268051 master-0 kubenswrapper[7642]: I0319 11:54:43.267995 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 11:54:43.269019 master-0 kubenswrapper[7642]: I0319 11:54:43.268866 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:43.271236 master-0 kubenswrapper[7642]: I0319 11:54:43.271209 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 11:54:43.282823 master-0 kubenswrapper[7642]: I0319 11:54:43.281425 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 19 11:54:43.310576 master-0 kubenswrapper[7642]: I0319 11:54:43.310502 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kube-api-access\") pod \"installer-1-master-0\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.311125 master-0 kubenswrapper[7642]: I0319 11:54:43.310943 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-var-lock\") pod \"installer-1-master-0\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.311125 master-0 kubenswrapper[7642]: I0319 11:54:43.310996 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.311125 master-0 kubenswrapper[7642]: I0319 11:54:43.311077 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.311251 master-0 kubenswrapper[7642]: I0319 11:54:43.311215 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-var-lock\") pod \"installer-1-master-0\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.332520 master-0 kubenswrapper[7642]: I0319 11:54:43.332466 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kube-api-access\") pod \"installer-1-master-0\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.412671 master-0 kubenswrapper[7642]: I0319 11:54:43.412602 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kube-api-access\") pod \"installer-1-master-0\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:43.412960 master-0 kubenswrapper[7642]: I0319 11:54:43.412694 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-var-lock\") pod \"installer-1-master-0\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:43.412960 master-0 kubenswrapper[7642]: I0319 11:54:43.412731 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:43.427837 master-0 kubenswrapper[7642]: I0319 11:54:43.427790 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 11:54:43.514524 master-0 kubenswrapper[7642]: I0319 11:54:43.513990 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:43.514524 master-0 kubenswrapper[7642]: I0319 11:54:43.514062 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kube-api-access\") pod \"installer-1-master-0\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:43.514524 master-0 kubenswrapper[7642]: I0319 11:54:43.514104 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-var-lock\") pod \"installer-1-master-0\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " 
pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:43.514524 master-0 kubenswrapper[7642]: I0319 11:54:43.514228 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:43.514524 master-0 kubenswrapper[7642]: I0319 11:54:43.514371 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-var-lock\") pod \"installer-1-master-0\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:43.536810 master-0 kubenswrapper[7642]: I0319 11:54:43.536754 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kube-api-access\") pod \"installer-1-master-0\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:43.600099 master-0 kubenswrapper[7642]: I0319 11:54:43.599992 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:54:44.034711 master-0 kubenswrapper[7642]: I0319 11:54:44.032712 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g"] Mar 19 11:54:44.045934 master-0 kubenswrapper[7642]: I0319 11:54:44.042436 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.046207 master-0 kubenswrapper[7642]: I0319 11:54:44.046153 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 11:54:44.046545 master-0 kubenswrapper[7642]: I0319 11:54:44.046517 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 19 11:54:44.055736 master-0 kubenswrapper[7642]: I0319 11:54:44.050245 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 19 11:54:44.055736 master-0 kubenswrapper[7642]: I0319 11:54:44.053764 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g"] Mar 19 11:54:44.070372 master-0 kubenswrapper[7642]: I0319 11:54:44.059867 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 19 11:54:44.121872 master-0 kubenswrapper[7642]: I0319 11:54:44.121822 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4"] Mar 19 11:54:44.123370 master-0 kubenswrapper[7642]: I0319 11:54:44.123346 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.127721 master-0 kubenswrapper[7642]: I0319 11:54:44.126107 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 19 11:54:44.130747 master-0 kubenswrapper[7642]: I0319 11:54:44.130158 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4"] Mar 19 11:54:44.131549 master-0 kubenswrapper[7642]: I0319 11:54:44.131524 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 19 11:54:44.134332 master-0 kubenswrapper[7642]: I0319 11:54:44.134277 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctghm\" (UniqueName: \"kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-kube-api-access-ctghm\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.134449 master-0 kubenswrapper[7642]: I0319 11:54:44.134356 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/58a4e3af-c5e1-4487-92a2-81dfb696695e-cache\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.134449 master-0 kubenswrapper[7642]: I0319 11:54:44.134389 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-ca-certs\") pod 
\"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.134449 master-0 kubenswrapper[7642]: I0319 11:54:44.134406 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.134449 master-0 kubenswrapper[7642]: I0319 11:54:44.134406 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 19 11:54:44.134705 master-0 kubenswrapper[7642]: I0319 11:54:44.134480 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjj2t\" (UniqueName: \"kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-kube-api-access-gjj2t\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.134705 master-0 kubenswrapper[7642]: I0319 11:54:44.134518 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.134705 master-0 kubenswrapper[7642]: I0319 11:54:44.134618 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.134705 master-0 kubenswrapper[7642]: I0319 11:54:44.134692 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/60081e3b-4fe5-42a3-bfae-0b07853c0e36-cache\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.134886 master-0 kubenswrapper[7642]: I0319 11:54:44.134715 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.134886 master-0 kubenswrapper[7642]: I0319 11:54:44.134741 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/58a4e3af-c5e1-4487-92a2-81dfb696695e-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.134886 master-0 kubenswrapper[7642]: I0319 11:54:44.134790 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: 
\"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.235481 master-0 kubenswrapper[7642]: I0319 11:54:44.235427 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/60081e3b-4fe5-42a3-bfae-0b07853c0e36-cache\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.235481 master-0 kubenswrapper[7642]: I0319 11:54:44.235476 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.235811 master-0 kubenswrapper[7642]: I0319 11:54:44.235507 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/58a4e3af-c5e1-4487-92a2-81dfb696695e-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.235811 master-0 kubenswrapper[7642]: I0319 11:54:44.235529 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: 
\"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.235811 master-0 kubenswrapper[7642]: I0319 11:54:44.235552 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctghm\" (UniqueName: \"kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-kube-api-access-ctghm\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.235811 master-0 kubenswrapper[7642]: I0319 11:54:44.235589 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/58a4e3af-c5e1-4487-92a2-81dfb696695e-cache\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.235811 master-0 kubenswrapper[7642]: I0319 11:54:44.235613 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.235811 master-0 kubenswrapper[7642]: I0319 11:54:44.235648 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.235811 master-0 kubenswrapper[7642]: I0319 11:54:44.235679 7642 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gjj2t\" (UniqueName: \"kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-kube-api-access-gjj2t\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.235811 master-0 kubenswrapper[7642]: I0319 11:54:44.235701 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.235811 master-0 kubenswrapper[7642]: I0319 11:54:44.235736 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.237954 master-0 kubenswrapper[7642]: I0319 11:54:44.236915 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.237954 master-0 kubenswrapper[7642]: I0319 11:54:44.237159 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-containers\") pod 
\"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.237954 master-0 kubenswrapper[7642]: I0319 11:54:44.237199 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.237954 master-0 kubenswrapper[7642]: I0319 11:54:44.237231 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.237954 master-0 kubenswrapper[7642]: I0319 11:54:44.237331 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/60081e3b-4fe5-42a3-bfae-0b07853c0e36-cache\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.237954 master-0 kubenswrapper[7642]: I0319 11:54:44.237378 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/58a4e3af-c5e1-4487-92a2-81dfb696695e-cache\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.241086 master-0 
kubenswrapper[7642]: I0319 11:54:44.241017 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.241772 master-0 kubenswrapper[7642]: I0319 11:54:44.241749 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.251115 master-0 kubenswrapper[7642]: I0319 11:54:44.251073 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/58a4e3af-c5e1-4487-92a2-81dfb696695e-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.255349 master-0 kubenswrapper[7642]: I0319 11:54:44.255298 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjj2t\" (UniqueName: \"kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-kube-api-access-gjj2t\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.257593 master-0 kubenswrapper[7642]: I0319 11:54:44.257462 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctghm\" (UniqueName: 
\"kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-kube-api-access-ctghm\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:44.382774 master-0 kubenswrapper[7642]: I0319 11:54:44.381955 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:44.452724 master-0 kubenswrapper[7642]: I0319 11:54:44.448755 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:47.356701 master-0 kubenswrapper[7642]: I0319 11:54:47.356249 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69dc5c9779-942tz"] Mar 19 11:54:47.357497 master-0 kubenswrapper[7642]: I0319 11:54:47.356895 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" podUID="ac8349ad-5063-439f-9785-fa90df81836b" containerName="controller-manager" containerID="cri-o://10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a" gracePeriod=30 Mar 19 11:54:47.407476 master-0 kubenswrapper[7642]: I0319 11:54:47.407407 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75"] Mar 19 11:54:48.253116 master-0 kubenswrapper[7642]: I0319 11:54:48.253050 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:54:50.260018 master-0 kubenswrapper[7642]: I0319 11:54:50.259931 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 11:54:50.260995 master-0 kubenswrapper[7642]: I0319 
11:54:50.260795 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:50.268300 master-0 kubenswrapper[7642]: I0319 11:54:50.268247 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 11:54:50.326425 master-0 kubenswrapper[7642]: I0319 11:54:50.326365 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kube-api-access\") pod \"installer-3-master-0\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:50.326425 master-0 kubenswrapper[7642]: I0319 11:54:50.326422 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-var-lock\") pod \"installer-3-master-0\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:50.326581 master-0 kubenswrapper[7642]: I0319 11:54:50.326439 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:50.427965 master-0 kubenswrapper[7642]: I0319 11:54:50.427866 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kube-api-access\") pod \"installer-3-master-0\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:50.427965 
master-0 kubenswrapper[7642]: I0319 11:54:50.427937 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:50.427965 master-0 kubenswrapper[7642]: I0319 11:54:50.427959 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-var-lock\") pod \"installer-3-master-0\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:50.428408 master-0 kubenswrapper[7642]: I0319 11:54:50.428123 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-var-lock\") pod \"installer-3-master-0\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:50.428408 master-0 kubenswrapper[7642]: I0319 11:54:50.428368 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:50.450532 master-0 kubenswrapper[7642]: I0319 11:54:50.450135 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kube-api-access\") pod \"installer-3-master-0\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") " pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:50.586859 master-0 kubenswrapper[7642]: I0319 11:54:50.586776 7642 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:54:51.713701 master-0 kubenswrapper[7642]: I0319 11:54:51.713573 7642 patch_prober.go:28] interesting pod/controller-manager-69dc5c9779-942tz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.38:8443/healthz\": dial tcp 10.128.0.38:8443: connect: connection refused" start-of-body= Mar 19 11:54:51.715842 master-0 kubenswrapper[7642]: I0319 11:54:51.713763 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" podUID="ac8349ad-5063-439f-9785-fa90df81836b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.38:8443/healthz\": dial tcp 10.128.0.38:8443: connect: connection refused" Mar 19 11:54:51.748205 master-0 kubenswrapper[7642]: I0319 11:54:51.747700 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 11:54:51.748661 master-0 kubenswrapper[7642]: I0319 11:54:51.748616 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:51.751688 master-0 kubenswrapper[7642]: I0319 11:54:51.751620 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 11:54:51.765274 master-0 kubenswrapper[7642]: I0319 11:54:51.765036 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 19 11:54:51.856014 master-0 kubenswrapper[7642]: I0319 11:54:51.855835 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-var-lock\") pod \"installer-1-master-0\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:51.856014 master-0 kubenswrapper[7642]: I0319 11:54:51.855898 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:51.856469 master-0 kubenswrapper[7642]: I0319 11:54:51.856050 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kube-api-access\") pod \"installer-1-master-0\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:51.957564 master-0 kubenswrapper[7642]: I0319 11:54:51.957498 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kube-api-access\") pod \"installer-1-master-0\" (UID: 
\"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:51.957564 master-0 kubenswrapper[7642]: I0319 11:54:51.957564 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-var-lock\") pod \"installer-1-master-0\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:51.957857 master-0 kubenswrapper[7642]: I0319 11:54:51.957589 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:51.957857 master-0 kubenswrapper[7642]: I0319 11:54:51.957681 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:51.958013 master-0 kubenswrapper[7642]: I0319 11:54:51.957936 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-var-lock\") pod \"installer-1-master-0\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:51.977492 master-0 kubenswrapper[7642]: I0319 11:54:51.977339 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kube-api-access\") pod \"installer-1-master-0\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " 
pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:52.074832 master-0 kubenswrapper[7642]: I0319 11:54:52.074762 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:54:53.174224 master-0 kubenswrapper[7642]: I0319 11:54:53.173832 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" Mar 19 11:54:53.279466 master-0 kubenswrapper[7642]: I0319 11:54:53.277278 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-client-ca\") pod \"00d5b0b0-3f0c-4136-9801-631b282999a7\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " Mar 19 11:54:53.279466 master-0 kubenswrapper[7642]: I0319 11:54:53.277360 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fpfl\" (UniqueName: \"kubernetes.io/projected/00d5b0b0-3f0c-4136-9801-631b282999a7-kube-api-access-8fpfl\") pod \"00d5b0b0-3f0c-4136-9801-631b282999a7\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " Mar 19 11:54:53.279466 master-0 kubenswrapper[7642]: I0319 11:54:53.277398 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-config\") pod \"00d5b0b0-3f0c-4136-9801-631b282999a7\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " Mar 19 11:54:53.279466 master-0 kubenswrapper[7642]: I0319 11:54:53.277823 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert\") pod \"00d5b0b0-3f0c-4136-9801-631b282999a7\" (UID: \"00d5b0b0-3f0c-4136-9801-631b282999a7\") " Mar 19 11:54:53.279466 master-0 kubenswrapper[7642]: I0319 
11:54:53.278041 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-config" (OuterVolumeSpecName: "config") pod "00d5b0b0-3f0c-4136-9801-631b282999a7" (UID: "00d5b0b0-3f0c-4136-9801-631b282999a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:53.279466 master-0 kubenswrapper[7642]: I0319 11:54:53.278294 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:53.279466 master-0 kubenswrapper[7642]: I0319 11:54:53.279244 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-client-ca" (OuterVolumeSpecName: "client-ca") pod "00d5b0b0-3f0c-4136-9801-631b282999a7" (UID: "00d5b0b0-3f0c-4136-9801-631b282999a7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:53.292564 master-0 kubenswrapper[7642]: I0319 11:54:53.292395 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d5b0b0-3f0c-4136-9801-631b282999a7-kube-api-access-8fpfl" (OuterVolumeSpecName: "kube-api-access-8fpfl") pod "00d5b0b0-3f0c-4136-9801-631b282999a7" (UID: "00d5b0b0-3f0c-4136-9801-631b282999a7"). InnerVolumeSpecName "kube-api-access-8fpfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:53.296554 master-0 kubenswrapper[7642]: I0319 11:54:53.296500 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "00d5b0b0-3f0c-4136-9801-631b282999a7" (UID: "00d5b0b0-3f0c-4136-9801-631b282999a7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:54:53.379434 master-0 kubenswrapper[7642]: I0319 11:54:53.378967 7642 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/00d5b0b0-3f0c-4136-9801-631b282999a7-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:53.379434 master-0 kubenswrapper[7642]: I0319 11:54:53.379331 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fpfl\" (UniqueName: \"kubernetes.io/projected/00d5b0b0-3f0c-4136-9801-631b282999a7-kube-api-access-8fpfl\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:53.379434 master-0 kubenswrapper[7642]: I0319 11:54:53.379342 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00d5b0b0-3f0c-4136-9801-631b282999a7-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:53.474668 master-0 kubenswrapper[7642]: I0319 11:54:53.474598 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4"] Mar 19 11:54:53.485254 master-0 kubenswrapper[7642]: I0319 11:54:53.485167 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g"] Mar 19 11:54:53.516858 master-0 kubenswrapper[7642]: I0319 11:54:53.515961 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:54:53.529577 master-0 kubenswrapper[7642]: I0319 11:54:53.528414 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" Mar 19 11:54:53.541522 master-0 kubenswrapper[7642]: I0319 11:54:53.538800 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 19 11:54:53.566926 master-0 kubenswrapper[7642]: I0319 11:54:53.566475 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 11:54:53.579027 master-0 kubenswrapper[7642]: I0319 11:54:53.578971 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67c499876f-wcrpp"] Mar 19 11:54:53.583450 master-0 kubenswrapper[7642]: I0319 11:54:53.583195 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac8349ad-5063-439f-9785-fa90df81836b-serving-cert\") pod \"ac8349ad-5063-439f-9785-fa90df81836b\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " Mar 19 11:54:53.583450 master-0 kubenswrapper[7642]: I0319 11:54:53.583243 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-config\") pod \"ac8349ad-5063-439f-9785-fa90df81836b\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " Mar 19 11:54:53.583450 master-0 kubenswrapper[7642]: I0319 11:54:53.583287 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-client-ca\") pod \"ac8349ad-5063-439f-9785-fa90df81836b\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " Mar 19 11:54:53.583450 master-0 kubenswrapper[7642]: I0319 11:54:53.583346 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b48j\" (UniqueName: 
\"kubernetes.io/projected/ac8349ad-5063-439f-9785-fa90df81836b-kube-api-access-6b48j\") pod \"ac8349ad-5063-439f-9785-fa90df81836b\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") " Mar 19 11:54:53.584330 master-0 kubenswrapper[7642]: E0319 11:54:53.583562 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8349ad-5063-439f-9785-fa90df81836b" containerName="controller-manager" Mar 19 11:54:53.584330 master-0 kubenswrapper[7642]: I0319 11:54:53.583611 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8349ad-5063-439f-9785-fa90df81836b" containerName="controller-manager" Mar 19 11:54:53.584330 master-0 kubenswrapper[7642]: I0319 11:54:53.583750 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8349ad-5063-439f-9785-fa90df81836b" containerName="controller-manager" Mar 19 11:54:53.584330 master-0 kubenswrapper[7642]: I0319 11:54:53.584257 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" Mar 19 11:54:53.584980 master-0 kubenswrapper[7642]: I0319 11:54:53.584623 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-config" (OuterVolumeSpecName: "config") pod "ac8349ad-5063-439f-9785-fa90df81836b" (UID: "ac8349ad-5063-439f-9785-fa90df81836b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:53.584980 master-0 kubenswrapper[7642]: I0319 11:54:53.584611 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-client-ca" (OuterVolumeSpecName: "client-ca") pod "ac8349ad-5063-439f-9785-fa90df81836b" (UID: "ac8349ad-5063-439f-9785-fa90df81836b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:53.588234 master-0 kubenswrapper[7642]: I0319 11:54:53.587774 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-proxy-ca-bundles\") pod \"ac8349ad-5063-439f-9785-fa90df81836b\" (UID: \"ac8349ad-5063-439f-9785-fa90df81836b\") "
Mar 19 11:54:53.588234 master-0 kubenswrapper[7642]: I0319 11:54:53.588094 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:53.588234 master-0 kubenswrapper[7642]: I0319 11:54:53.588108 7642 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:53.588234 master-0 kubenswrapper[7642]: I0319 11:54:53.588117 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c499876f-wcrpp"]
Mar 19 11:54:53.606439 master-0 kubenswrapper[7642]: I0319 11:54:53.595345 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ac8349ad-5063-439f-9785-fa90df81836b" (UID: "ac8349ad-5063-439f-9785-fa90df81836b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:54:53.606439 master-0 kubenswrapper[7642]: I0319 11:54:53.598099 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8349ad-5063-439f-9785-fa90df81836b-kube-api-access-6b48j" (OuterVolumeSpecName: "kube-api-access-6b48j") pod "ac8349ad-5063-439f-9785-fa90df81836b" (UID: "ac8349ad-5063-439f-9785-fa90df81836b"). InnerVolumeSpecName "kube-api-access-6b48j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:54:53.608950 master-0 kubenswrapper[7642]: I0319 11:54:53.608474 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac8349ad-5063-439f-9785-fa90df81836b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ac8349ad-5063-439f-9785-fa90df81836b" (UID: "ac8349ad-5063-439f-9785-fa90df81836b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:54:53.618028 master-0 kubenswrapper[7642]: I0319 11:54:53.617960 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 19 11:54:53.625703 master-0 kubenswrapper[7642]: I0319 11:54:53.625657 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"]
Mar 19 11:54:53.686400 master-0 kubenswrapper[7642]: I0319 11:54:53.685136 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75" event={"ID":"00d5b0b0-3f0c-4136-9801-631b282999a7","Type":"ContainerDied","Data":"8223190fa0d9cc91deaaba666f0a19e363838137553ad3e78caf3ebe08515ebd"}
Mar 19 11:54:53.686400 master-0 kubenswrapper[7642]: I0319 11:54:53.685571 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75"
Mar 19 11:54:53.692083 master-0 kubenswrapper[7642]: I0319 11:54:53.690386 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kjhn\" (UniqueName: \"kubernetes.io/projected/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-kube-api-access-9kjhn\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.692083 master-0 kubenswrapper[7642]: I0319 11:54:53.690448 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-config\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.692083 master-0 kubenswrapper[7642]: I0319 11:54:53.690485 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-serving-cert\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.692083 master-0 kubenswrapper[7642]: I0319 11:54:53.690504 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-proxy-ca-bundles\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.692083 master-0 kubenswrapper[7642]: I0319 11:54:53.690558 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-client-ca\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.692083 master-0 kubenswrapper[7642]: I0319 11:54:53.690596 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b48j\" (UniqueName: \"kubernetes.io/projected/ac8349ad-5063-439f-9785-fa90df81836b-kube-api-access-6b48j\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:53.692083 master-0 kubenswrapper[7642]: I0319 11:54:53.690607 7642 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac8349ad-5063-439f-9785-fa90df81836b-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:53.692083 master-0 kubenswrapper[7642]: I0319 11:54:53.690617 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac8349ad-5063-439f-9785-fa90df81836b-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 11:54:53.700982 master-0 kubenswrapper[7642]: I0319 11:54:53.694580 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" event={"ID":"39f479b3-45ff-43f9-ae85-083554a54918","Type":"ContainerStarted","Data":"9d5a5757181aa7140885d0b2bdc37c14b8a849c385bb89cb00edfe5781ce2cdf"}
Mar 19 11:54:53.701275 master-0 kubenswrapper[7642]: I0319 11:54:53.701202 7642 patch_prober.go:28] interesting pod/marketplace-operator-89ccd998f-4h6xn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" start-of-body=
Mar 19 11:54:53.701337 master-0 kubenswrapper[7642]: I0319 11:54:53.701266 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" podUID="39f479b3-45ff-43f9-ae85-083554a54918" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused"
Mar 19 11:54:53.702051 master-0 kubenswrapper[7642]: I0319 11:54:53.701367 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:54:53.702051 master-0 kubenswrapper[7642]: I0319 11:54:53.701384 7642 generic.go:334] "Generic (PLEG): container finished" podID="ac8349ad-5063-439f-9785-fa90df81836b" containerID="10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a" exitCode=0
Mar 19 11:54:53.702051 master-0 kubenswrapper[7642]: I0319 11:54:53.701415 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" event={"ID":"ac8349ad-5063-439f-9785-fa90df81836b","Type":"ContainerDied","Data":"10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a"}
Mar 19 11:54:53.702051 master-0 kubenswrapper[7642]: I0319 11:54:53.701439 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz" event={"ID":"ac8349ad-5063-439f-9785-fa90df81836b","Type":"ContainerDied","Data":"a651f85f636acf31bd26274b37214b661d01e3f97dd8c21150a25f6945b01729"}
Mar 19 11:54:53.702051 master-0 kubenswrapper[7642]: I0319 11:54:53.701479 7642 scope.go:117] "RemoveContainer" containerID="10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a"
Mar 19 11:54:53.702051 master-0 kubenswrapper[7642]: I0319 11:54:53.701590 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69dc5c9779-942tz"
Mar 19 11:54:53.710026 master-0 kubenswrapper[7642]: I0319 11:54:53.709955 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"074a3b28-725c-4b11-b445-c5cf6511ffc8","Type":"ContainerStarted","Data":"10197f9a2b8f4239079134b0edca7725f9141453319505d4b7744cc322d1e253"}
Mar 19 11:54:53.724275 master-0 kubenswrapper[7642]: I0319 11:54:53.718866 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" event={"ID":"58a4e3af-c5e1-4487-92a2-81dfb696695e","Type":"ContainerStarted","Data":"5bea1d036c9faad13c1d14003487235e6d8ac60afd384e78842d83618960ce00"}
Mar 19 11:54:53.733797 master-0 kubenswrapper[7642]: I0319 11:54:53.729366 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerStarted","Data":"cf537f49e2a1b4ab25a2740d2ec9131fff4692e5f665c069887b93bcb77ec276"}
Mar 19 11:54:53.744283 master-0 kubenswrapper[7642]: I0319 11:54:53.744197 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" event={"ID":"60081e3b-4fe5-42a3-bfae-0b07853c0e36","Type":"ContainerStarted","Data":"514e629483d9ce3a471489bf8ebd7e39de0e8072f7a78d8b93a7ef5b4897dfbc"}
Mar 19 11:54:53.785383 master-0 kubenswrapper[7642]: I0319 11:54:53.778522 7642 scope.go:117] "RemoveContainer" containerID="10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a"
Mar 19 11:54:53.785383 master-0 kubenswrapper[7642]: E0319 11:54:53.781436 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a\": container with ID starting with 10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a not found: ID does not exist" containerID="10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a"
Mar 19 11:54:53.785383 master-0 kubenswrapper[7642]: I0319 11:54:53.781489 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a"} err="failed to get container status \"10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a\": rpc error: code = NotFound desc = could not find container \"10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a\": container with ID starting with 10547c553b1989ec9f9a1e366429143b7d07d7f064a1b64ba5a78d91c031913a not found: ID does not exist"
Mar 19 11:54:53.785383 master-0 kubenswrapper[7642]: I0319 11:54:53.781833 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" event={"ID":"131e2573-ef74-4963-bdcf-901310e7a486","Type":"ContainerStarted","Data":"5e5f7fc3ebf0c67acc7c96961d4c2909a3186127871fb38ba8defa7ad1c517d5"}
Mar 19 11:54:53.792028 master-0 kubenswrapper[7642]: I0319 11:54:53.791972 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kjhn\" (UniqueName: \"kubernetes.io/projected/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-kube-api-access-9kjhn\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.793011 master-0 kubenswrapper[7642]: I0319 11:54:53.792994 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-config\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.793225 master-0 kubenswrapper[7642]: I0319 11:54:53.793175 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69dc5c9779-942tz"]
Mar 19 11:54:53.793283 master-0 kubenswrapper[7642]: I0319 11:54:53.793185 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-serving-cert\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.793366 master-0 kubenswrapper[7642]: I0319 11:54:53.793339 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-proxy-ca-bundles\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.800671 master-0 kubenswrapper[7642]: I0319 11:54:53.800602 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"01529a13-e1b1-40a1-9fff-8a20f1cba68c","Type":"ContainerStarted","Data":"9c2f6ee2f4f5a8055122de400fd9243193e98b0769b1970bf4b2bfb820983158"}
Mar 19 11:54:53.802359 master-0 kubenswrapper[7642]: I0319 11:54:53.802313 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-69dc5c9779-942tz"]
Mar 19 11:54:53.811871 master-0 kubenswrapper[7642]: I0319 11:54:53.811562 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-config\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.812156 master-0 kubenswrapper[7642]: I0319 11:54:53.812081 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-client-ca\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.813704 master-0 kubenswrapper[7642]: I0319 11:54:53.813268 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-proxy-ca-bundles\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.814760 master-0 kubenswrapper[7642]: I0319 11:54:53.814647 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-client-ca\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.833110 master-0 kubenswrapper[7642]: I0319 11:54:53.832666 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-serving-cert\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.860471 master-0 kubenswrapper[7642]: I0319 11:54:53.853335 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75"]
Mar 19 11:54:53.862697 master-0 kubenswrapper[7642]: I0319 11:54:53.860713 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kjhn\" (UniqueName: \"kubernetes.io/projected/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-kube-api-access-9kjhn\") pod \"controller-manager-67c499876f-wcrpp\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") " pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:53.865069 master-0 kubenswrapper[7642]: I0319 11:54:53.865038 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-849695ddd4-dgp75"]
Mar 19 11:54:53.913071 master-0 kubenswrapper[7642]: I0319 11:54:53.912912 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:54:54.086691 master-0 kubenswrapper[7642]: I0319 11:54:54.085726 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d5b0b0-3f0c-4136-9801-631b282999a7" path="/var/lib/kubelet/pods/00d5b0b0-3f0c-4136-9801-631b282999a7/volumes"
Mar 19 11:54:54.086691 master-0 kubenswrapper[7642]: I0319 11:54:54.086281 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8349ad-5063-439f-9785-fa90df81836b" path="/var/lib/kubelet/pods/ac8349ad-5063-439f-9785-fa90df81836b/volumes"
Mar 19 11:54:54.702872 master-0 kubenswrapper[7642]: I0319 11:54:54.702048 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c499876f-wcrpp"]
Mar 19 11:54:54.853484 master-0 kubenswrapper[7642]: I0319 11:54:54.852840 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-9w65f"]
Mar 19 11:54:54.856735 master-0 kubenswrapper[7642]: I0319 11:54:54.854030 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:54.899117 master-0 kubenswrapper[7642]: I0319 11:54:54.899066 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" event={"ID":"6e8c4676-05af-4478-a44b-1bdcbefdea0f","Type":"ContainerStarted","Data":"457f9713239213c35b97b7cfbf842ed633e63353105428ab228603e6b8c79dd8"}
Mar 19 11:54:54.900171 master-0 kubenswrapper[7642]: I0319 11:54:54.900119 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:54.952307 master-0 kubenswrapper[7642]: I0319 11:54:54.944606 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerStarted","Data":"e566658e07da148fd8797db81c134475cfca0b856d369e98428ade2c9af55e1c"}
Mar 19 11:54:54.952307 master-0 kubenswrapper[7642]: I0319 11:54:54.945318 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 11:54:54.957339 master-0 kubenswrapper[7642]: I0319 11:54:54.956088 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" event={"ID":"5b46c6b4-f701-4a7f-bf74-90afe14ad89a","Type":"ContainerStarted","Data":"1e71a71430ecca74b1dd6f7eede647761f2774d3e5737d1b310c911f67fbd631"}
Mar 19 11:54:54.957339 master-0 kubenswrapper[7642]: I0319 11:54:54.957270 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:54.959881 master-0 kubenswrapper[7642]: I0319 11:54:54.958910 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nkd77" event={"ID":"3e1b71a7-c9fd-4f03-8d0c-f092dac80989","Type":"ContainerStarted","Data":"779dc5b5369bd95c8f8ad0170a9271f13ea8126f017abb36186ce64bcd027dfb"}
Mar 19 11:54:54.989532 master-0 kubenswrapper[7642]: I0319 11:54:54.988749 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pgl92"]
Mar 19 11:54:54.989948 master-0 kubenswrapper[7642]: I0319 11:54:54.989904 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pgl92"
Mar 19 11:54:54.994844 master-0 kubenswrapper[7642]: I0319 11:54:54.993243 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 11:54:55.004997 master-0 kubenswrapper[7642]: I0319 11:54:55.004612 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 11:54:55.004997 master-0 kubenswrapper[7642]: I0319 11:54:55.004691 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 11:54:55.005211 master-0 kubenswrapper[7642]: I0319 11:54:55.004987 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 11:54:55.026456 master-0 kubenswrapper[7642]: I0319 11:54:55.018153 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 11:54:55.069180 master-0 kubenswrapper[7642]: I0319 11:54:55.068911 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" event={"ID":"33f6e54f-a750-4c91-b4f5-049275d33cfb","Type":"ContainerStarted","Data":"0ca8349da9d248f4a62f6d3f31584f3be56f6e92cb3e5edd7a03d771500af4eb"}
Mar 19 11:54:55.069896 master-0 kubenswrapper[7642]: I0319 11:54:55.064523 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.070073 master-0 kubenswrapper[7642]: I0319 11:54:55.070041 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pgl92"]
Mar 19 11:54:55.075184 master-0 kubenswrapper[7642]: I0319 11:54:55.074617 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-modprobe-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.075337 master-0 kubenswrapper[7642]: I0319 11:54:55.075281 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysconfig\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.075384 master-0 kubenswrapper[7642]: I0319 11:54:55.075372 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-sys\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.076047 master-0 kubenswrapper[7642]: I0319 11:54:55.075462 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-systemd\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.076047 master-0 kubenswrapper[7642]: I0319 11:54:55.075555 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-tuned\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.076047 master-0 kubenswrapper[7642]: I0319 11:54:55.075621 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-run\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.079385 master-0 kubenswrapper[7642]: I0319 11:54:55.077361 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-conf\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.079385 master-0 kubenswrapper[7642]: I0319 11:54:55.077457 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6frf6\" (UniqueName: \"kubernetes.io/projected/07eef031-edcc-415c-91d7-f9f3bd0a45e3-kube-api-access-6frf6\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.079385 master-0 kubenswrapper[7642]: I0319 11:54:55.077549 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-var-lib-kubelet\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.082605 master-0 kubenswrapper[7642]: I0319 11:54:55.080151 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-lib-modules\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.082605 master-0 kubenswrapper[7642]: I0319 11:54:55.080285 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-host\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.082605 master-0 kubenswrapper[7642]: I0319 11:54:55.080324 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-tmp\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.082605 master-0 kubenswrapper[7642]: I0319 11:54:55.080433 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-kubernetes\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181287 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-host\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181342 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-tmp\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181361 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-kubernetes\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181385 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181406 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-modprobe-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181423 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysconfig\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181442 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-sys\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181459 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-systemd\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181519 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-tuned\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181536 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-run\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181553 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-conf\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181571 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frf6\" (UniqueName: \"kubernetes.io/projected/07eef031-edcc-415c-91d7-f9f3bd0a45e3-kube-api-access-6frf6\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181592 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-var-lib-kubelet\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181651 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b48q2\" (UniqueName: \"kubernetes.io/projected/a5cf6404-1640-43d3-8018-03c62c8338c5-kube-api-access-b48q2\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181678 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5cf6404-1640-43d3-8018-03c62c8338c5-config-volume\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181695 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-lib-modules\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.182514 master-0 kubenswrapper[7642]: I0319 11:54:55.181711 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5cf6404-1640-43d3-8018-03c62c8338c5-metrics-tls\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92"
Mar 19 11:54:55.183110 master-0 kubenswrapper[7642]: I0319 11:54:55.182991 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" event={"ID":"78d3b2df-3d81-413e-b039-80dc20111eae","Type":"ContainerStarted","Data":"7a11cd309bf0aadb171ec1091d9f6745f904977929ac03bf37c565cc447050f5"}
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.191320 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-lib-modules\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.191872 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-systemd\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.192224 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-host\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.192726 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-modprobe-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.192819 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-kubernetes\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.192852 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.193062 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-var-lib-kubelet\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.193114 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysconfig\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.193238 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-run\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.193272 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-sys\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.193355 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-conf\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.209559 master-0 kubenswrapper[7642]: I0319 11:54:55.209164 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 11:54:55.221072 master-0 kubenswrapper[7642]: I0319 11:54:55.220533 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-tuned\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 11:54:55.241271 master-0 kubenswrapper[7642]: I0319 11:54:55.236963 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-tmp\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 11:54:55.268949 master-0 kubenswrapper[7642]: I0319 11:54:55.262162 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32","Type":"ContainerStarted","Data":"81a2103b4ad6693138997ae75fcffbcb1e1a170a8ab8e624161ecf2a0c62868c"} Mar 19 11:54:55.274485 master-0 kubenswrapper[7642]: I0319 11:54:55.269904 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z7tj9"] Mar 19 11:54:55.274485 master-0 kubenswrapper[7642]: I0319 11:54:55.271206 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.274485 master-0 kubenswrapper[7642]: I0319 11:54:55.273164 7642 generic.go:334] "Generic (PLEG): container finished" podID="03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" containerID="4187d473d418438867fce7ac79a071d2e915c228c96963406bda8b76aedeeb6c" exitCode=0 Mar 19 11:54:55.274485 master-0 kubenswrapper[7642]: I0319 11:54:55.273251 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" event={"ID":"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5","Type":"ContainerDied","Data":"4187d473d418438867fce7ac79a071d2e915c228c96963406bda8b76aedeeb6c"} Mar 19 11:54:55.290720 master-0 kubenswrapper[7642]: I0319 11:54:55.284184 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b48q2\" (UniqueName: \"kubernetes.io/projected/a5cf6404-1640-43d3-8018-03c62c8338c5-kube-api-access-b48q2\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 11:54:55.290720 master-0 kubenswrapper[7642]: I0319 11:54:55.284240 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a5cf6404-1640-43d3-8018-03c62c8338c5-config-volume\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 11:54:55.290720 master-0 kubenswrapper[7642]: I0319 11:54:55.284264 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5cf6404-1640-43d3-8018-03c62c8338c5-metrics-tls\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 11:54:55.290720 master-0 kubenswrapper[7642]: I0319 11:54:55.288358 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5cf6404-1640-43d3-8018-03c62c8338c5-config-volume\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 11:54:55.300326 master-0 kubenswrapper[7642]: I0319 11:54:55.300266 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5cf6404-1640-43d3-8018-03c62c8338c5-metrics-tls\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 11:54:55.316049 master-0 kubenswrapper[7642]: I0319 11:54:55.303830 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"eb4c5b33-58c8-4b51-b71b-1befc165f2b1","Type":"ContainerStarted","Data":"4d0a460410b93716bcb9bd13e502416121bea8d7a0c8c686d890ff048f639230"} Mar 19 11:54:55.316049 master-0 kubenswrapper[7642]: I0319 11:54:55.307744 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7tj9"] Mar 19 11:54:55.323786 master-0 kubenswrapper[7642]: I0319 11:54:55.317857 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frf6\" 
(UniqueName: \"kubernetes.io/projected/07eef031-edcc-415c-91d7-f9f3bd0a45e3-kube-api-access-6frf6\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 11:54:55.348443 master-0 kubenswrapper[7642]: I0319 11:54:55.336137 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b48q2\" (UniqueName: \"kubernetes.io/projected/a5cf6404-1640-43d3-8018-03c62c8338c5-kube-api-access-b48q2\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 11:54:55.358111 master-0 kubenswrapper[7642]: I0319 11:54:55.358017 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" event={"ID":"60081e3b-4fe5-42a3-bfae-0b07853c0e36","Type":"ContainerStarted","Data":"fbe4d38a848a8f8c8ac70c6aaafe195a243cd98f3045f416707deb69b549d12e"} Mar 19 11:54:55.368185 master-0 kubenswrapper[7642]: I0319 11:54:55.368097 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pgl92" Mar 19 11:54:55.397896 master-0 kubenswrapper[7642]: I0319 11:54:55.396927 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkcbb\" (UniqueName: \"kubernetes.io/projected/6191b9b7-8e43-4006-ab25-734c23d34cf6-kube-api-access-jkcbb\") pod \"redhat-operators-z7tj9\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.400577 master-0 kubenswrapper[7642]: I0319 11:54:55.398398 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-utilities\") pod \"redhat-operators-z7tj9\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.400577 master-0 kubenswrapper[7642]: I0319 11:54:55.398503 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-catalog-content\") pod \"redhat-operators-z7tj9\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.417662 master-0 kubenswrapper[7642]: I0319 11:54:55.412287 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" event={"ID":"60f846d9-1b3d-4b9d-b3e0-90a238e02640","Type":"ContainerStarted","Data":"83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c"} Mar 19 11:54:55.425662 master-0 kubenswrapper[7642]: I0319 11:54:55.421334 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" 
event={"ID":"074a3b28-725c-4b11-b445-c5cf6511ffc8","Type":"ContainerStarted","Data":"d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d"} Mar 19 11:54:55.425662 master-0 kubenswrapper[7642]: I0319 11:54:55.421591 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="074a3b28-725c-4b11-b445-c5cf6511ffc8" containerName="installer" containerID="cri-o://d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d" gracePeriod=30 Mar 19 11:54:55.448659 master-0 kubenswrapper[7642]: I0319 11:54:55.445009 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerStarted","Data":"a33c34d1a46884ff2764e0114249d4b1bcbad80e05aa9b1072bd5b5574041486"} Mar 19 11:54:55.478678 master-0 kubenswrapper[7642]: I0319 11:54:55.476292 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=4.476263536 podStartE2EDuration="4.476263536s" podCreationTimestamp="2026-03-19 11:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:55.47605078 +0000 UTC m=+53.593491567" watchObservedRunningTime="2026-03-19 11:54:55.476263536 +0000 UTC m=+53.593704323" Mar 19 11:54:55.489586 master-0 kubenswrapper[7642]: I0319 11:54:55.489317 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 11:54:55.510955 master-0 kubenswrapper[7642]: I0319 11:54:55.509745 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-utilities\") pod \"redhat-operators-z7tj9\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.510955 master-0 kubenswrapper[7642]: I0319 11:54:55.509787 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-catalog-content\") pod \"redhat-operators-z7tj9\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.510955 master-0 kubenswrapper[7642]: I0319 11:54:55.509861 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkcbb\" (UniqueName: \"kubernetes.io/projected/6191b9b7-8e43-4006-ab25-734c23d34cf6-kube-api-access-jkcbb\") pod \"redhat-operators-z7tj9\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.515385 master-0 kubenswrapper[7642]: I0319 11:54:55.514221 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-catalog-content\") pod \"redhat-operators-z7tj9\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.516228 master-0 kubenswrapper[7642]: I0319 11:54:55.515663 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-utilities\") pod \"redhat-operators-z7tj9\" (UID: 
\"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.548095 master-0 kubenswrapper[7642]: I0319 11:54:55.544179 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"751316a7-cd5f-4787-b24f-f1cddeb6a29e","Type":"ContainerStarted","Data":"d19e569b3b31b135d6d3279e4bc831adfb2a118c30cf3280473097e65fd37705"} Mar 19 11:54:55.548095 master-0 kubenswrapper[7642]: I0319 11:54:55.544228 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"751316a7-cd5f-4787-b24f-f1cddeb6a29e","Type":"ContainerStarted","Data":"8dc3df7d917e038d3431655b50e062ea05b3b2442a7a61484be684526e4eb3a5"} Mar 19 11:54:55.560438 master-0 kubenswrapper[7642]: I0319 11:54:55.559738 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=14.559701766 podStartE2EDuration="14.559701766s" podCreationTimestamp="2026-03-19 11:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:55.543548915 +0000 UTC m=+53.660989702" watchObservedRunningTime="2026-03-19 11:54:55.559701766 +0000 UTC m=+53.677142553" Mar 19 11:54:55.564529 master-0 kubenswrapper[7642]: I0319 11:54:55.563352 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkcbb\" (UniqueName: \"kubernetes.io/projected/6191b9b7-8e43-4006-ab25-734c23d34cf6-kube-api-access-jkcbb\") pod \"redhat-operators-z7tj9\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.567799 master-0 kubenswrapper[7642]: I0319 11:54:55.567752 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" 
event={"ID":"58a4e3af-c5e1-4487-92a2-81dfb696695e","Type":"ContainerStarted","Data":"8085ddcc855855663d2f5770a76d3c79d6914a60f257f2223e1616226984fe66"} Mar 19 11:54:55.623202 master-0 kubenswrapper[7642]: I0319 11:54:55.619213 7642 generic.go:334] "Generic (PLEG): container finished" podID="7af28c70-3ee0-438b-8655-95bc57deaa05" containerID="49a4f50ee477671b0b5a448101a3e5ff8df04706fa3923595f4f02465bdb07e9" exitCode=0 Mar 19 11:54:55.623202 master-0 kubenswrapper[7642]: I0319 11:54:55.619335 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" event={"ID":"7af28c70-3ee0-438b-8655-95bc57deaa05","Type":"ContainerDied","Data":"49a4f50ee477671b0b5a448101a3e5ff8df04706fa3923595f4f02465bdb07e9"} Mar 19 11:54:55.628742 master-0 kubenswrapper[7642]: I0319 11:54:55.627312 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-j2kp7"] Mar 19 11:54:55.628742 master-0 kubenswrapper[7642]: I0319 11:54:55.628080 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-j2kp7" Mar 19 11:54:55.678484 master-0 kubenswrapper[7642]: I0319 11:54:55.674007 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" event={"ID":"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9","Type":"ContainerStarted","Data":"294c8e92aed8170867fe457688347ef2302a1dc291b0d68c0ca5a9868628adf8"} Mar 19 11:54:55.678484 master-0 kubenswrapper[7642]: I0319 11:54:55.674882 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" Mar 19 11:54:55.685849 master-0 kubenswrapper[7642]: W0319 11:54:55.685798 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07eef031_edcc_415c_91d7_f9f3bd0a45e3.slice/crio-b09ba92792d71b97e085101b3f5f0d26504a7ce27d6d2f696d8e3319844365b2 WatchSource:0}: Error finding container b09ba92792d71b97e085101b3f5f0d26504a7ce27d6d2f696d8e3319844365b2: Status 404 returned error can't find the container with id b09ba92792d71b97e085101b3f5f0d26504a7ce27d6d2f696d8e3319844365b2 Mar 19 11:54:55.693052 master-0 kubenswrapper[7642]: I0319 11:54:55.688022 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:55.695170 master-0 kubenswrapper[7642]: I0319 11:54:55.694024 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" Mar 19 11:54:55.701193 master-0 kubenswrapper[7642]: I0319 11:54:55.698715 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" event={"ID":"5ae2d717-506b-4d8e-803b-689c1cc3557c","Type":"ContainerStarted","Data":"7df327773d76aaacf805979906c8e4aa2f22ce6b1887fb5d42df0fd26e17a8b7"} Mar 19 11:54:55.701193 master-0 kubenswrapper[7642]: I0319 11:54:55.700446 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=5.700428846 podStartE2EDuration="5.700428846s" podCreationTimestamp="2026-03-19 11:54:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:55.648041993 +0000 UTC m=+53.765482770" watchObservedRunningTime="2026-03-19 11:54:55.700428846 +0000 UTC m=+53.817869633" Mar 19 11:54:55.764256 master-0 kubenswrapper[7642]: I0319 11:54:55.757401 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.821291 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-trusted-ca-bundle\") pod \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.821354 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-encryption-config\") pod \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.821389 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-policies\") pod \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.821417 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-serving-ca\") pod \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.821454 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-dir\") pod \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.821490 7642 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert\") pod \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.821538 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-client\") pod \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.821583 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5qnw\" (UniqueName: \"kubernetes.io/projected/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-kube-api-access-s5qnw\") pod \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\" (UID: \"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5\") " Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.822045 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" (UID: "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.822790 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" (UID: "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.823140 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9zjb\" (UniqueName: \"kubernetes.io/projected/f5148a50-4332-4ab4-af63-449ef7f16333-kube-api-access-p9zjb\") pod \"node-resolver-j2kp7\" (UID: \"f5148a50-4332-4ab4-af63-449ef7f16333\") " pod="openshift-dns/node-resolver-j2kp7" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.823227 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" (UID: "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.823703 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5148a50-4332-4ab4-af63-449ef7f16333-hosts-file\") pod \"node-resolver-j2kp7\" (UID: \"f5148a50-4332-4ab4-af63-449ef7f16333\") " pod="openshift-dns/node-resolver-j2kp7" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.824198 7642 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.824227 7642 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.824238 7642 
reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.824678 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" (UID: "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.833411 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-kube-api-access-s5qnw" (OuterVolumeSpecName: "kube-api-access-s5qnw") pod "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" (UID: "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5"). InnerVolumeSpecName "kube-api-access-s5qnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:55.839234 master-0 kubenswrapper[7642]: I0319 11:54:55.833848 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" (UID: "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:54:55.844766 master-0 kubenswrapper[7642]: I0319 11:54:55.842556 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" (UID: "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:54:55.862953 master-0 kubenswrapper[7642]: I0319 11:54:55.846987 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" (UID: "03c3e2ba-8810-48f9-9c77-7597cc2ea6c5"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:54:55.917429 master-0 kubenswrapper[7642]: I0319 11:54:55.916525 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r"] Mar 19 11:54:55.917429 master-0 kubenswrapper[7642]: E0319 11:54:55.916759 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" containerName="fix-audit-permissions" Mar 19 11:54:55.917429 master-0 kubenswrapper[7642]: I0319 11:54:55.916773 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" containerName="fix-audit-permissions" Mar 19 11:54:55.917429 master-0 kubenswrapper[7642]: I0319 11:54:55.916854 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" containerName="fix-audit-permissions" Mar 19 11:54:55.917429 master-0 kubenswrapper[7642]: I0319 11:54:55.917149 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r"] Mar 19 11:54:55.917429 master-0 kubenswrapper[7642]: I0319 11:54:55.917235 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:55.924579 master-0 kubenswrapper[7642]: I0319 11:54:55.921102 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 11:54:55.924579 master-0 kubenswrapper[7642]: I0319 11:54:55.921258 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 11:54:55.924579 master-0 kubenswrapper[7642]: I0319 11:54:55.921385 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 11:54:55.924579 master-0 kubenswrapper[7642]: I0319 11:54:55.921991 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 11:54:55.924579 master-0 kubenswrapper[7642]: I0319 11:54:55.922239 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 11:54:55.951781 master-0 kubenswrapper[7642]: I0319 11:54:55.927398 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5148a50-4332-4ab4-af63-449ef7f16333-hosts-file\") pod \"node-resolver-j2kp7\" (UID: \"f5148a50-4332-4ab4-af63-449ef7f16333\") " pod="openshift-dns/node-resolver-j2kp7" Mar 19 11:54:55.951781 master-0 kubenswrapper[7642]: I0319 11:54:55.927487 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9zjb\" (UniqueName: \"kubernetes.io/projected/f5148a50-4332-4ab4-af63-449ef7f16333-kube-api-access-p9zjb\") pod \"node-resolver-j2kp7\" (UID: \"f5148a50-4332-4ab4-af63-449ef7f16333\") " pod="openshift-dns/node-resolver-j2kp7" Mar 19 11:54:55.951781 master-0 kubenswrapper[7642]: I0319 11:54:55.927532 7642 reconciler_common.go:293] "Volume detached 
for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:55.951781 master-0 kubenswrapper[7642]: I0319 11:54:55.927545 7642 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:55.951781 master-0 kubenswrapper[7642]: I0319 11:54:55.927556 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:55.951781 master-0 kubenswrapper[7642]: I0319 11:54:55.927567 7642 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:55.951781 master-0 kubenswrapper[7642]: I0319 11:54:55.927579 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5qnw\" (UniqueName: \"kubernetes.io/projected/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5-kube-api-access-s5qnw\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:55.951781 master-0 kubenswrapper[7642]: I0319 11:54:55.928123 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5148a50-4332-4ab4-af63-449ef7f16333-hosts-file\") pod \"node-resolver-j2kp7\" (UID: \"f5148a50-4332-4ab4-af63-449ef7f16333\") " pod="openshift-dns/node-resolver-j2kp7" Mar 19 11:54:55.989731 master-0 kubenswrapper[7642]: I0319 11:54:55.971090 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9zjb\" (UniqueName: \"kubernetes.io/projected/f5148a50-4332-4ab4-af63-449ef7f16333-kube-api-access-p9zjb\") pod \"node-resolver-j2kp7\" (UID: 
\"f5148a50-4332-4ab4-af63-449ef7f16333\") " pod="openshift-dns/node-resolver-j2kp7" Mar 19 11:54:55.989731 master-0 kubenswrapper[7642]: I0319 11:54:55.985699 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pgl92"] Mar 19 11:54:56.037331 master-0 kubenswrapper[7642]: I0319 11:54:56.035694 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tp448"] Mar 19 11:54:56.037331 master-0 kubenswrapper[7642]: I0319 11:54:56.035849 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-config\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.037331 master-0 kubenswrapper[7642]: I0319 11:54:56.035894 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-client-ca\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.037331 master-0 kubenswrapper[7642]: I0319 11:54:56.035929 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtrk8\" (UniqueName: \"kubernetes.io/projected/7c54300b-f091-488f-b7e6-0fc27d0453e6-kube-api-access-qtrk8\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.037331 master-0 kubenswrapper[7642]: I0319 11:54:56.035984 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c54300b-f091-488f-b7e6-0fc27d0453e6-serving-cert\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.039681 master-0 kubenswrapper[7642]: I0319 11:54:56.037952 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.074106 master-0 kubenswrapper[7642]: I0319 11:54:56.069406 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" podStartSLOduration=9.069375429 podStartE2EDuration="9.069375429s" podCreationTimestamp="2026-03-19 11:54:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:56.056926682 +0000 UTC m=+54.174367489" watchObservedRunningTime="2026-03-19 11:54:56.069375429 +0000 UTC m=+54.186816216" Mar 19 11:54:56.091777 master-0 kubenswrapper[7642]: I0319 11:54:56.088242 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tp448"] Mar 19 11:54:56.106906 master-0 kubenswrapper[7642]: I0319 11:54:56.105977 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-j2kp7" Mar 19 11:54:56.151075 master-0 kubenswrapper[7642]: I0319 11:54:56.150192 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjb78\" (UniqueName: \"kubernetes.io/projected/42edeefc-c920-4cd7-88b0-8c1e660abf63-kube-api-access-vjb78\") pod \"certified-operators-tp448\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.151075 master-0 kubenswrapper[7642]: I0319 11:54:56.150260 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-config\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.151075 master-0 kubenswrapper[7642]: I0319 11:54:56.150289 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-client-ca\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.151075 master-0 kubenswrapper[7642]: I0319 11:54:56.150318 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtrk8\" (UniqueName: \"kubernetes.io/projected/7c54300b-f091-488f-b7e6-0fc27d0453e6-kube-api-access-qtrk8\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.151075 master-0 kubenswrapper[7642]: I0319 11:54:56.150354 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c54300b-f091-488f-b7e6-0fc27d0453e6-serving-cert\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.151075 master-0 kubenswrapper[7642]: I0319 11:54:56.150388 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-utilities\") pod \"certified-operators-tp448\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.151075 master-0 kubenswrapper[7642]: I0319 11:54:56.150411 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-catalog-content\") pod \"certified-operators-tp448\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.155929 master-0 kubenswrapper[7642]: I0319 11:54:56.154874 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-config\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.155929 master-0 kubenswrapper[7642]: I0319 11:54:56.155441 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_074a3b28-725c-4b11-b445-c5cf6511ffc8/installer/0.log" Mar 19 11:54:56.155929 master-0 kubenswrapper[7642]: I0319 11:54:56.155522 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:56.161156 master-0 kubenswrapper[7642]: I0319 11:54:56.161078 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-client-ca\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.168568 master-0 kubenswrapper[7642]: I0319 11:54:56.168381 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c54300b-f091-488f-b7e6-0fc27d0453e6-serving-cert\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.219867 master-0 kubenswrapper[7642]: I0319 11:54:56.219018 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtrk8\" (UniqueName: \"kubernetes.io/projected/7c54300b-f091-488f-b7e6-0fc27d0453e6-kube-api-access-qtrk8\") pod \"route-controller-manager-6f69766c59-b588r\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") " pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.263539 master-0 kubenswrapper[7642]: I0319 11:54:56.263312 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-var-lock\") pod \"074a3b28-725c-4b11-b445-c5cf6511ffc8\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " Mar 19 11:54:56.263539 master-0 kubenswrapper[7642]: I0319 11:54:56.263457 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/074a3b28-725c-4b11-b445-c5cf6511ffc8-kube-api-access\") pod \"074a3b28-725c-4b11-b445-c5cf6511ffc8\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " Mar 19 11:54:56.263539 master-0 kubenswrapper[7642]: I0319 11:54:56.263501 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-kubelet-dir\") pod \"074a3b28-725c-4b11-b445-c5cf6511ffc8\" (UID: \"074a3b28-725c-4b11-b445-c5cf6511ffc8\") " Mar 19 11:54:56.263843 master-0 kubenswrapper[7642]: I0319 11:54:56.263687 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-utilities\") pod \"certified-operators-tp448\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.263843 master-0 kubenswrapper[7642]: I0319 11:54:56.263718 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-catalog-content\") pod \"certified-operators-tp448\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.263843 master-0 kubenswrapper[7642]: I0319 11:54:56.263775 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjb78\" (UniqueName: \"kubernetes.io/projected/42edeefc-c920-4cd7-88b0-8c1e660abf63-kube-api-access-vjb78\") pod \"certified-operators-tp448\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.264819 master-0 kubenswrapper[7642]: I0319 11:54:56.264053 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-kubelet-dir" 
(OuterVolumeSpecName: "kubelet-dir") pod "074a3b28-725c-4b11-b445-c5cf6511ffc8" (UID: "074a3b28-725c-4b11-b445-c5cf6511ffc8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:54:56.264819 master-0 kubenswrapper[7642]: I0319 11:54:56.264131 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-var-lock" (OuterVolumeSpecName: "var-lock") pod "074a3b28-725c-4b11-b445-c5cf6511ffc8" (UID: "074a3b28-725c-4b11-b445-c5cf6511ffc8"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:54:56.266152 master-0 kubenswrapper[7642]: I0319 11:54:56.266103 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-catalog-content\") pod \"certified-operators-tp448\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.271697 master-0 kubenswrapper[7642]: I0319 11:54:56.271121 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-utilities\") pod \"certified-operators-tp448\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.278304 master-0 kubenswrapper[7642]: I0319 11:54:56.278201 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/074a3b28-725c-4b11-b445-c5cf6511ffc8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "074a3b28-725c-4b11-b445-c5cf6511ffc8" (UID: "074a3b28-725c-4b11-b445-c5cf6511ffc8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:54:56.336880 master-0 kubenswrapper[7642]: I0319 11:54:56.336047 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:54:56.354111 master-0 kubenswrapper[7642]: I0319 11:54:56.354046 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjb78\" (UniqueName: \"kubernetes.io/projected/42edeefc-c920-4cd7-88b0-8c1e660abf63-kube-api-access-vjb78\") pod \"certified-operators-tp448\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.362581 master-0 kubenswrapper[7642]: I0319 11:54:56.362507 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7tj9"] Mar 19 11:54:56.367084 master-0 kubenswrapper[7642]: I0319 11:54:56.365399 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/074a3b28-725c-4b11-b445-c5cf6511ffc8-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:56.367084 master-0 kubenswrapper[7642]: I0319 11:54:56.365425 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:56.367084 master-0 kubenswrapper[7642]: I0319 11:54:56.365436 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/074a3b28-725c-4b11-b445-c5cf6511ffc8-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:54:56.430908 master-0 kubenswrapper[7642]: I0319 11:54:56.429488 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:54:56.733207 master-0 kubenswrapper[7642]: I0319 11:54:56.731888 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r"] Mar 19 11:54:56.733430 master-0 kubenswrapper[7642]: I0319 11:54:56.733229 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" event={"ID":"7af28c70-3ee0-438b-8655-95bc57deaa05","Type":"ContainerStarted","Data":"5e0c0c445c254e95b1792fac91e1a62fba7b29c5a016186d5923ee9ea180aac4"} Mar 19 11:54:56.737157 master-0 kubenswrapper[7642]: I0319 11:54:56.737092 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" event={"ID":"60081e3b-4fe5-42a3-bfae-0b07853c0e36","Type":"ContainerStarted","Data":"cddea9ae351cdfdd1985ab5fdb11762d077c83cdb329eae761816bc9c824a2a1"} Mar 19 11:54:56.738017 master-0 kubenswrapper[7642]: I0319 11:54:56.737797 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 11:54:56.747201 master-0 kubenswrapper[7642]: I0319 11:54:56.747131 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"eb4c5b33-58c8-4b51-b71b-1befc165f2b1","Type":"ContainerStarted","Data":"bff053145b8a60a46c397e36d3fff2a19f60020e8f6b7b60fcc48ee33442df16"} Mar 19 11:54:56.757744 master-0 kubenswrapper[7642]: I0319 11:54:56.757609 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32","Type":"ContainerStarted","Data":"52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841"} Mar 19 11:54:56.763007 master-0 kubenswrapper[7642]: I0319 11:54:56.762935 7642 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9w65f" event={"ID":"07eef031-edcc-415c-91d7-f9f3bd0a45e3","Type":"ContainerStarted","Data":"4e8446e16d9a3735605f6f6ec51d20f491b3eb101b6c3fd3ca5dfd89335292d1"} Mar 19 11:54:56.763184 master-0 kubenswrapper[7642]: I0319 11:54:56.763018 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9w65f" event={"ID":"07eef031-edcc-415c-91d7-f9f3bd0a45e3","Type":"ContainerStarted","Data":"b09ba92792d71b97e085101b3f5f0d26504a7ce27d6d2f696d8e3319844365b2"} Mar 19 11:54:56.765217 master-0 kubenswrapper[7642]: I0319 11:54:56.764724 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j2kp7" event={"ID":"f5148a50-4332-4ab4-af63-449ef7f16333","Type":"ContainerStarted","Data":"27bfc2e7edbcc6c9da4d8e8c7defa74d634ff9325db83c99710558275c0105da"} Mar 19 11:54:56.771281 master-0 kubenswrapper[7642]: I0319 11:54:56.769399 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_074a3b28-725c-4b11-b445-c5cf6511ffc8/installer/0.log" Mar 19 11:54:56.771281 master-0 kubenswrapper[7642]: I0319 11:54:56.769452 7642 generic.go:334] "Generic (PLEG): container finished" podID="074a3b28-725c-4b11-b445-c5cf6511ffc8" containerID="d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d" exitCode=2 Mar 19 11:54:56.771281 master-0 kubenswrapper[7642]: I0319 11:54:56.769575 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 19 11:54:56.771281 master-0 kubenswrapper[7642]: I0319 11:54:56.770710 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"074a3b28-725c-4b11-b445-c5cf6511ffc8","Type":"ContainerDied","Data":"d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d"} Mar 19 11:54:56.771281 master-0 kubenswrapper[7642]: I0319 11:54:56.770754 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"074a3b28-725c-4b11-b445-c5cf6511ffc8","Type":"ContainerDied","Data":"10197f9a2b8f4239079134b0edca7725f9141453319505d4b7744cc322d1e253"} Mar 19 11:54:56.771281 master-0 kubenswrapper[7642]: I0319 11:54:56.770777 7642 scope.go:117] "RemoveContainer" containerID="d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d" Mar 19 11:54:56.791889 master-0 kubenswrapper[7642]: I0319 11:54:56.791835 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" event={"ID":"5ae2d717-506b-4d8e-803b-689c1cc3557c","Type":"ContainerStarted","Data":"a6e72fa780304dc2b33dbdfd17546cbc3d37632700728a439fd482435edf60d1"} Mar 19 11:54:56.793359 master-0 kubenswrapper[7642]: I0319 11:54:56.793332 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pgl92" event={"ID":"a5cf6404-1640-43d3-8018-03c62c8338c5","Type":"ContainerStarted","Data":"a77515f16c17046f407c9f07e5609964e9814a68e8f45a75bb351406b4dcfc20"} Mar 19 11:54:56.801379 master-0 kubenswrapper[7642]: I0319 11:54:56.798712 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" podStartSLOduration=12.798698926 podStartE2EDuration="12.798698926s" podCreationTimestamp="2026-03-19 11:54:44 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:56.798048898 +0000 UTC m=+54.915489685" watchObservedRunningTime="2026-03-19 11:54:56.798698926 +0000 UTC m=+54.916139713" Mar 19 11:54:56.806985 master-0 kubenswrapper[7642]: I0319 11:54:56.806717 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" event={"ID":"33f6e54f-a750-4c91-b4f5-049275d33cfb","Type":"ContainerStarted","Data":"a600bd8f0138af8f048f9cf6ac0b29d4aabccd1b0f117b61a0f3f451ef5ef8af"} Mar 19 11:54:56.814655 master-0 kubenswrapper[7642]: I0319 11:54:56.814036 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nkd77" event={"ID":"3e1b71a7-c9fd-4f03-8d0c-f092dac80989","Type":"ContainerStarted","Data":"a526f0ca6aa3cc2a180780d0441d60c91cb569e7abaf7f74b844db9d23815ca8"} Mar 19 11:54:56.872012 master-0 kubenswrapper[7642]: I0319 11:54:56.870233 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerStarted","Data":"0e52d93d005b0e420b3ffb2e9181ac41115554d0386bf658491e57c893340a8e"} Mar 19 11:54:56.884026 master-0 kubenswrapper[7642]: I0319 11:54:56.875848 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tp448"] Mar 19 11:54:56.884026 master-0 kubenswrapper[7642]: I0319 11:54:56.880285 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9w65f" podStartSLOduration=2.8802517119999997 podStartE2EDuration="2.880251712s" podCreationTimestamp="2026-03-19 11:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:56.853758082 +0000 UTC m=+54.971198889" watchObservedRunningTime="2026-03-19 
11:54:56.880251712 +0000 UTC m=+54.997692499" Mar 19 11:54:56.885118 master-0 kubenswrapper[7642]: I0319 11:54:56.885058 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" event={"ID":"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9","Type":"ContainerStarted","Data":"5d549fc56408d3e81bd303e1acc79906265312e8b58b030cd5bf088ae404228a"} Mar 19 11:54:56.901438 master-0 kubenswrapper[7642]: I0319 11:54:56.901293 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"01529a13-e1b1-40a1-9fff-8a20f1cba68c","Type":"ContainerStarted","Data":"825551a6cbd6e2d9a62ce681600e2897018550eae2ad6b2a4c579d2b878da6e9"} Mar 19 11:54:56.906973 master-0 kubenswrapper[7642]: I0319 11:54:56.906905 7642 scope.go:117] "RemoveContainer" containerID="d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d" Mar 19 11:54:56.908480 master-0 kubenswrapper[7642]: I0319 11:54:56.908384 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=13.908360747 podStartE2EDuration="13.908360747s" podCreationTimestamp="2026-03-19 11:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:56.902349989 +0000 UTC m=+55.019790796" watchObservedRunningTime="2026-03-19 11:54:56.908360747 +0000 UTC m=+55.025801534" Mar 19 11:54:56.944553 master-0 kubenswrapper[7642]: E0319 11:54:56.944489 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d\": container with ID starting with d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d not found: ID does not exist" containerID="d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d" Mar 19 
11:54:56.947745 master-0 kubenswrapper[7642]: I0319 11:54:56.947061 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d"} err="failed to get container status \"d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d\": rpc error: code = NotFound desc = could not find container \"d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d\": container with ID starting with d5891186b551c7429cd9d8b779c27587b341b41cbe56fa0c6944abfe9e50225d not found: ID does not exist" Mar 19 11:54:56.948147 master-0 kubenswrapper[7642]: I0319 11:54:56.948003 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7tj9" event={"ID":"6191b9b7-8e43-4006-ab25-734c23d34cf6","Type":"ContainerStarted","Data":"49e6cb3af36a0772637e2b879d634836b906c3780dc54ff6831288e9854b02fe"} Mar 19 11:54:56.957724 master-0 kubenswrapper[7642]: I0319 11:54:56.951485 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" event={"ID":"58a4e3af-c5e1-4487-92a2-81dfb696695e","Type":"ContainerStarted","Data":"3d22a1705e322b07bf63dc93baa238f057583ee120dd5b2d2d08b50e9d7a1b09"} Mar 19 11:54:56.957724 master-0 kubenswrapper[7642]: I0319 11:54:56.951698 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 11:54:56.958877 master-0 kubenswrapper[7642]: I0319 11:54:56.958442 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:54:56.959482 master-0 kubenswrapper[7642]: I0319 11:54:56.959434 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 19 11:54:56.962956 master-0 kubenswrapper[7642]: I0319 11:54:56.962898 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" event={"ID":"03c3e2ba-8810-48f9-9c77-7597cc2ea6c5","Type":"ContainerDied","Data":"0256e346ceb50747334fd7b0822c0dd8b7fa9973de10430918b7fc4f8df2e700"} Mar 19 11:54:56.963142 master-0 kubenswrapper[7642]: I0319 11:54:56.963126 7642 scope.go:117] "RemoveContainer" containerID="4187d473d418438867fce7ac79a071d2e915c228c96963406bda8b76aedeeb6c" Mar 19 11:54:56.963394 master-0 kubenswrapper[7642]: I0319 11:54:56.963325 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5547669f67-7t9wr" Mar 19 11:54:56.985574 master-0 kubenswrapper[7642]: I0319 11:54:56.985498 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=14.98546687 podStartE2EDuration="14.98546687s" podCreationTimestamp="2026-03-19 11:54:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:56.983731412 +0000 UTC m=+55.101172199" watchObservedRunningTime="2026-03-19 11:54:56.98546687 +0000 UTC m=+55.102907657" Mar 19 11:54:57.115812 master-0 kubenswrapper[7642]: I0319 11:54:57.115725 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" podStartSLOduration=13.115705317 podStartE2EDuration="13.115705317s" podCreationTimestamp="2026-03-19 11:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:57.114509694 +0000 UTC m=+55.231950491" watchObservedRunningTime="2026-03-19 11:54:57.115705317 +0000 UTC m=+55.233146104" Mar 19 11:54:57.190096 master-0 kubenswrapper[7642]: I0319 11:54:57.190002 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"] Mar 19 
11:54:57.218780 master-0 kubenswrapper[7642]: I0319 11:54:57.210284 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-5547669f67-7t9wr"] Mar 19 11:54:57.409609 master-0 kubenswrapper[7642]: I0319 11:54:57.409212 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vj6x4"] Mar 19 11:54:57.410182 master-0 kubenswrapper[7642]: E0319 11:54:57.410168 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074a3b28-725c-4b11-b445-c5cf6511ffc8" containerName="installer" Mar 19 11:54:57.410276 master-0 kubenswrapper[7642]: I0319 11:54:57.410263 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="074a3b28-725c-4b11-b445-c5cf6511ffc8" containerName="installer" Mar 19 11:54:57.410471 master-0 kubenswrapper[7642]: I0319 11:54:57.410459 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="074a3b28-725c-4b11-b445-c5cf6511ffc8" containerName="installer" Mar 19 11:54:57.411791 master-0 kubenswrapper[7642]: I0319 11:54:57.411773 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.426836 master-0 kubenswrapper[7642]: I0319 11:54:57.425752 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vj6x4"]
Mar 19 11:54:57.437210 master-0 kubenswrapper[7642]: I0319 11:54:57.436976 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-catalog-content\") pod \"community-operators-vj6x4\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") " pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.437210 master-0 kubenswrapper[7642]: I0319 11:54:57.437068 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdbzg\" (UniqueName: \"kubernetes.io/projected/aedae669-04fb-44e7-afac-03bd0bfae4e9-kube-api-access-mdbzg\") pod \"community-operators-vj6x4\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") " pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.437210 master-0 kubenswrapper[7642]: I0319 11:54:57.437102 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-utilities\") pod \"community-operators-vj6x4\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") " pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.538362 master-0 kubenswrapper[7642]: I0319 11:54:57.538208 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-catalog-content\") pod \"community-operators-vj6x4\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") " pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.538362 master-0 kubenswrapper[7642]: I0319 11:54:57.538366 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdbzg\" (UniqueName: \"kubernetes.io/projected/aedae669-04fb-44e7-afac-03bd0bfae4e9-kube-api-access-mdbzg\") pod \"community-operators-vj6x4\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") " pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.538816 master-0 kubenswrapper[7642]: I0319 11:54:57.538402 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-utilities\") pod \"community-operators-vj6x4\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") " pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.539066 master-0 kubenswrapper[7642]: I0319 11:54:57.539038 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-utilities\") pod \"community-operators-vj6x4\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") " pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.539615 master-0 kubenswrapper[7642]: I0319 11:54:57.539402 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-catalog-content\") pod \"community-operators-vj6x4\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") " pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.567692 master-0 kubenswrapper[7642]: I0319 11:54:57.566450 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdbzg\" (UniqueName: \"kubernetes.io/projected/aedae669-04fb-44e7-afac-03bd0bfae4e9-kube-api-access-mdbzg\") pod \"community-operators-vj6x4\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") " pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.741422 master-0 kubenswrapper[7642]: I0319 11:54:57.740855 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:54:57.970378 master-0 kubenswrapper[7642]: I0319 11:54:57.970305 7642 generic.go:334] "Generic (PLEG): container finished" podID="6191b9b7-8e43-4006-ab25-734c23d34cf6" containerID="bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac" exitCode=0
Mar 19 11:54:57.972499 master-0 kubenswrapper[7642]: I0319 11:54:57.970447 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7tj9" event={"ID":"6191b9b7-8e43-4006-ab25-734c23d34cf6","Type":"ContainerDied","Data":"bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac"}
Mar 19 11:54:57.974690 master-0 kubenswrapper[7642]: I0319 11:54:57.974357 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" event={"ID":"7c54300b-f091-488f-b7e6-0fc27d0453e6","Type":"ContainerStarted","Data":"dfc938e0e6a4a3e0f7d0726d217078fc3d8cea145d7766d9bcc12fedb3b9c430"}
Mar 19 11:54:57.976465 master-0 kubenswrapper[7642]: I0319 11:54:57.976436 7642 generic.go:334] "Generic (PLEG): container finished" podID="42edeefc-c920-4cd7-88b0-8c1e660abf63" containerID="069b17cffc835d0f5bf79176129e2a69147a9ed67461773630bacc9148088890" exitCode=0
Mar 19 11:54:57.976561 master-0 kubenswrapper[7642]: I0319 11:54:57.976482 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tp448" event={"ID":"42edeefc-c920-4cd7-88b0-8c1e660abf63","Type":"ContainerDied","Data":"069b17cffc835d0f5bf79176129e2a69147a9ed67461773630bacc9148088890"}
Mar 19 11:54:57.976561 master-0 kubenswrapper[7642]: I0319 11:54:57.976501 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tp448" event={"ID":"42edeefc-c920-4cd7-88b0-8c1e660abf63","Type":"ContainerStarted","Data":"2a26373cf55f1a392f387b21b50d4f4a77a1dbdf1a7c1014bb10e807d1209774"}
Mar 19 11:54:57.984065 master-0 kubenswrapper[7642]: I0319 11:54:57.984010 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j2kp7" event={"ID":"f5148a50-4332-4ab4-af63-449ef7f16333","Type":"ContainerStarted","Data":"d3da346d2a31113c42b29b7bafb0dcf087a712bb2d1d78530c61e5aaded3b8b1"}
Mar 19 11:54:57.989755 master-0 kubenswrapper[7642]: I0319 11:54:57.989701 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" event={"ID":"7af28c70-3ee0-438b-8655-95bc57deaa05","Type":"ContainerStarted","Data":"d60d501384cd053df7d86d6b632c3daf437c4d9f2ef724d350e9c0d1c254c4aa"}
Mar 19 11:54:58.034787 master-0 kubenswrapper[7642]: I0319 11:54:58.034508 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vj6x4"]
Mar 19 11:54:58.130664 master-0 kubenswrapper[7642]: W0319 11:54:58.128544 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaedae669_04fb_44e7_afac_03bd0bfae4e9.slice/crio-df5f164a29ed9b0b229f66f5108c1194120dcaed538131f0457ba31bf5557108 WatchSource:0}: Error finding container df5f164a29ed9b0b229f66f5108c1194120dcaed538131f0457ba31bf5557108: Status 404 returned error can't find the container with id df5f164a29ed9b0b229f66f5108c1194120dcaed538131f0457ba31bf5557108
Mar 19 11:54:58.137973 master-0 kubenswrapper[7642]: I0319 11:54:58.135988 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-j2kp7" podStartSLOduration=3.135955368 podStartE2EDuration="3.135955368s" podCreationTimestamp="2026-03-19 11:54:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:54:58.099798188 +0000 UTC m=+56.217238985" watchObservedRunningTime="2026-03-19 11:54:58.135955368 +0000 UTC m=+56.253396155"
Mar 19 11:54:58.157428 master-0 kubenswrapper[7642]: I0319 11:54:58.156240 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03c3e2ba-8810-48f9-9c77-7597cc2ea6c5" path="/var/lib/kubelet/pods/03c3e2ba-8810-48f9-9c77-7597cc2ea6c5/volumes"
Mar 19 11:54:58.157428 master-0 kubenswrapper[7642]: I0319 11:54:58.157278 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="074a3b28-725c-4b11-b445-c5cf6511ffc8" path="/var/lib/kubelet/pods/074a3b28-725c-4b11-b445-c5cf6511ffc8/volumes"
Mar 19 11:54:58.412528 master-0 kubenswrapper[7642]: I0319 11:54:58.412301 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" podStartSLOduration=7.639082757 podStartE2EDuration="23.412270434s" podCreationTimestamp="2026-03-19 11:54:35 +0000 UTC" firstStartedPulling="2026-03-19 11:54:37.435415915 +0000 UTC m=+35.552856702" lastFinishedPulling="2026-03-19 11:54:53.208603592 +0000 UTC m=+51.326044379" observedRunningTime="2026-03-19 11:54:58.164033312 +0000 UTC m=+56.281474109" watchObservedRunningTime="2026-03-19 11:54:58.412270434 +0000 UTC m=+56.529711221"
Mar 19 11:54:58.420657 master-0 kubenswrapper[7642]: I0319 11:54:58.418884 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cmnd7"]
Mar 19 11:54:58.420657 master-0 kubenswrapper[7642]: I0319 11:54:58.420068 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:58.431367 master-0 kubenswrapper[7642]: I0319 11:54:58.431308 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cmnd7"]
Mar 19 11:54:58.479837 master-0 kubenswrapper[7642]: I0319 11:54:58.479772 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9597m\" (UniqueName: \"kubernetes.io/projected/c5426417-d063-49e6-9fc9-f40ccf8296dc-kube-api-access-9597m\") pod \"redhat-marketplace-cmnd7\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:58.480102 master-0 kubenswrapper[7642]: I0319 11:54:58.479895 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-catalog-content\") pod \"redhat-marketplace-cmnd7\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:58.480138 master-0 kubenswrapper[7642]: I0319 11:54:58.480080 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-utilities\") pod \"redhat-marketplace-cmnd7\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:58.581682 master-0 kubenswrapper[7642]: I0319 11:54:58.581585 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-catalog-content\") pod \"redhat-marketplace-cmnd7\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:58.582155 master-0 kubenswrapper[7642]: I0319 11:54:58.582125 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-utilities\") pod \"redhat-marketplace-cmnd7\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:58.582248 master-0 kubenswrapper[7642]: I0319 11:54:58.582222 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9597m\" (UniqueName: \"kubernetes.io/projected/c5426417-d063-49e6-9fc9-f40ccf8296dc-kube-api-access-9597m\") pod \"redhat-marketplace-cmnd7\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:58.582298 master-0 kubenswrapper[7642]: I0319 11:54:58.582270 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-catalog-content\") pod \"redhat-marketplace-cmnd7\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:58.582715 master-0 kubenswrapper[7642]: I0319 11:54:58.582602 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-utilities\") pod \"redhat-marketplace-cmnd7\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:58.608746 master-0 kubenswrapper[7642]: I0319 11:54:58.608702 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9597m\" (UniqueName: \"kubernetes.io/projected/c5426417-d063-49e6-9fc9-f40ccf8296dc-kube-api-access-9597m\") pod \"redhat-marketplace-cmnd7\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:58.664357 master-0 kubenswrapper[7642]: I0319 11:54:58.663740 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 19 11:54:58.752239 master-0 kubenswrapper[7642]: I0319 11:54:58.752030 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cmnd7"
Mar 19 11:54:59.043067 master-0 kubenswrapper[7642]: I0319 11:54:59.039082 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_6660a89a-cb33-4e20-aa58-93c4838e44c3/installer/0.log"
Mar 19 11:54:59.043067 master-0 kubenswrapper[7642]: I0319 11:54:59.039156 7642 generic.go:334] "Generic (PLEG): container finished" podID="6660a89a-cb33-4e20-aa58-93c4838e44c3" containerID="5db8b9dc9974fa7fae69b5ed92758705aaeaec268deaa769d2d9d84f72d9569a" exitCode=1
Mar 19 11:54:59.043067 master-0 kubenswrapper[7642]: I0319 11:54:59.039241 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"6660a89a-cb33-4e20-aa58-93c4838e44c3","Type":"ContainerDied","Data":"5db8b9dc9974fa7fae69b5ed92758705aaeaec268deaa769d2d9d84f72d9569a"}
Mar 19 11:54:59.043983 master-0 kubenswrapper[7642]: I0319 11:54:59.043835 7642 generic.go:334] "Generic (PLEG): container finished" podID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerID="f2fccfaa5c9ec30ed365e658f9b75ddf0baabf11bac6bda0ab6dfed5d569f4f6" exitCode=0
Mar 19 11:54:59.044499 master-0 kubenswrapper[7642]: I0319 11:54:59.044117 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vj6x4" event={"ID":"aedae669-04fb-44e7-afac-03bd0bfae4e9","Type":"ContainerDied","Data":"f2fccfaa5c9ec30ed365e658f9b75ddf0baabf11bac6bda0ab6dfed5d569f4f6"}
Mar 19 11:54:59.044499 master-0 kubenswrapper[7642]: I0319 11:54:59.044171 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vj6x4" event={"ID":"aedae669-04fb-44e7-afac-03bd0bfae4e9","Type":"ContainerStarted","Data":"df5f164a29ed9b0b229f66f5108c1194120dcaed538131f0457ba31bf5557108"}
Mar 19 11:54:59.045532 master-0 kubenswrapper[7642]: I0319 11:54:59.045316 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32" containerName="installer" containerID="cri-o://52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841" gracePeriod=30
Mar 19 11:54:59.890393 master-0 kubenswrapper[7642]: I0319 11:54:59.884354 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"]
Mar 19 11:54:59.890393 master-0 kubenswrapper[7642]: I0319 11:54:59.885267 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:54:59.890393 master-0 kubenswrapper[7642]: I0319 11:54:59.888092 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 11:54:59.890393 master-0 kubenswrapper[7642]: I0319 11:54:59.888116 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 11:54:59.890393 master-0 kubenswrapper[7642]: I0319 11:54:59.888369 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 11:54:59.890393 master-0 kubenswrapper[7642]: I0319 11:54:59.888446 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 11:54:59.890393 master-0 kubenswrapper[7642]: I0319 11:54:59.888646 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 11:54:59.890393 master-0 kubenswrapper[7642]: I0319 11:54:59.888763 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 11:54:59.912028 master-0 kubenswrapper[7642]: I0319 11:54:59.897610 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 11:54:59.912028 master-0 kubenswrapper[7642]: I0319 11:54:59.897836 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 11:54:59.919415 master-0 kubenswrapper[7642]: I0319 11:54:59.918586 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"]
Mar 19 11:54:59.923355 master-0 kubenswrapper[7642]: I0319 11:54:59.923299 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-policies\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:54:59.923497 master-0 kubenswrapper[7642]: I0319 11:54:59.923398 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-serving-ca\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:54:59.923497 master-0 kubenswrapper[7642]: I0319 11:54:59.923459 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-client\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:54:59.923497 master-0 kubenswrapper[7642]: I0319 11:54:59.923495 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ntj9\" (UniqueName: \"kubernetes.io/projected/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-kube-api-access-8ntj9\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:54:59.923622 master-0 kubenswrapper[7642]: I0319 11:54:59.923542 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-trusted-ca-bundle\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:54:59.923622 master-0 kubenswrapper[7642]: I0319 11:54:59.923588 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-dir\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:54:59.926109 master-0 kubenswrapper[7642]: I0319 11:54:59.924350 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-serving-cert\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:54:59.926109 master-0 kubenswrapper[7642]: I0319 11:54:59.924434 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-encryption-config\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.026414 master-0 kubenswrapper[7642]: I0319 11:55:00.026238 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-trusted-ca-bundle\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.028126 master-0 kubenswrapper[7642]: I0319 11:55:00.028056 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-trusted-ca-bundle\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.028289 master-0 kubenswrapper[7642]: I0319 11:55:00.028237 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-serving-cert\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.028469 master-0 kubenswrapper[7642]: I0319 11:55:00.028424 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-dir\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.028534 master-0 kubenswrapper[7642]: I0319 11:55:00.028477 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-encryption-config\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.028583 master-0 kubenswrapper[7642]: I0319 11:55:00.028541 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-dir\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.028675 master-0 kubenswrapper[7642]: I0319 11:55:00.028623 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-policies\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.028758 master-0 kubenswrapper[7642]: I0319 11:55:00.028693 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-serving-ca\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.028839 master-0 kubenswrapper[7642]: I0319 11:55:00.028789 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-client\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.028913 master-0 kubenswrapper[7642]: I0319 11:55:00.028863 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ntj9\" (UniqueName: \"kubernetes.io/projected/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-kube-api-access-8ntj9\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.030902 master-0 kubenswrapper[7642]: I0319 11:55:00.030061 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-serving-ca\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.030902 master-0 kubenswrapper[7642]: I0319 11:55:00.030590 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-policies\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.046749 master-0 kubenswrapper[7642]: I0319 11:55:00.046663 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-serving-cert\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.049386 master-0 kubenswrapper[7642]: I0319 11:55:00.049345 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-client\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.049818 master-0 kubenswrapper[7642]: I0319 11:55:00.049756 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ntj9\" (UniqueName: \"kubernetes.io/projected/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-kube-api-access-8ntj9\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.065092 master-0 kubenswrapper[7642]: I0319 11:55:00.065008 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-encryption-config\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:00.223436 master-0 kubenswrapper[7642]: I0319 11:55:00.223367 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:55:01.267235 master-0 kubenswrapper[7642]: I0319 11:55:01.267150 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 19 11:55:01.268377 master-0 kubenswrapper[7642]: I0319 11:55:01.268351 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.277737 master-0 kubenswrapper[7642]: I0319 11:55:01.274541 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"]
Mar 19 11:55:01.467884 master-0 kubenswrapper[7642]: I0319 11:55:01.467822 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de6765b7-3576-4171-972b-20d62d81f25d-kube-api-access\") pod \"installer-2-master-0\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.468155 master-0 kubenswrapper[7642]: I0319 11:55:01.468051 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-var-lock\") pod \"installer-2-master-0\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.468402 master-0 kubenswrapper[7642]: I0319 11:55:01.468316 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.569847 master-0 kubenswrapper[7642]: I0319 11:55:01.569226 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-var-lock\") pod \"installer-2-master-0\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.569847 master-0 kubenswrapper[7642]: I0319 11:55:01.569307 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.569847 master-0 kubenswrapper[7642]: I0319 11:55:01.569431 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-var-lock\") pod \"installer-2-master-0\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.569847 master-0 kubenswrapper[7642]: I0319 11:55:01.569519 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de6765b7-3576-4171-972b-20d62d81f25d-kube-api-access\") pod \"installer-2-master-0\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.569847 master-0 kubenswrapper[7642]: I0319 11:55:01.569533 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.604257 master-0 kubenswrapper[7642]: I0319 11:55:01.604202 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de6765b7-3576-4171-972b-20d62d81f25d-kube-api-access\") pod \"installer-2-master-0\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.643747 master-0 kubenswrapper[7642]: I0319 11:55:01.643685 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 11:55:01.994076 master-0 kubenswrapper[7642]: I0319 11:55:01.994000 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 11:55:01.994319 master-0 kubenswrapper[7642]: I0319 11:55:01.994130 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 11:55:02.011392 master-0 kubenswrapper[7642]: I0319 11:55:02.011339 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 11:55:02.106196 master-0 kubenswrapper[7642]: I0319 11:55:02.106134 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 11:55:04.016575 master-0 kubenswrapper[7642]: I0319 11:55:04.010006 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7tj9"]
Mar 19 11:55:04.392509 master-0 kubenswrapper[7642]: I0319 11:55:04.392000 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g"
Mar 19 11:55:04.431158 master-0 kubenswrapper[7642]: I0319 11:55:04.431068 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6ql8d"]
Mar 19 11:55:04.433824 master-0 kubenswrapper[7642]: I0319 11:55:04.433220 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:04.436084 master-0 kubenswrapper[7642]: I0319 11:55:04.435992 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-d7x5g"
Mar 19 11:55:04.454712 master-0 kubenswrapper[7642]: I0319 11:55:04.451946 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ql8d"]
Mar 19 11:55:04.467559 master-0 kubenswrapper[7642]: I0319 11:55:04.467502 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4"
Mar 19 11:55:04.548838 master-0 kubenswrapper[7642]: I0319 11:55:04.548767 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-utilities\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:04.549125 master-0 kubenswrapper[7642]: I0319 11:55:04.548869 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-catalog-content\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:04.549125 master-0 kubenswrapper[7642]: I0319 11:55:04.548899 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7bl2\" (UniqueName: \"kubernetes.io/projected/d979be56-f948-4252-ab6b-9bfc48dc4187-kube-api-access-w7bl2\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:04.650959 master-0 kubenswrapper[7642]: I0319 11:55:04.650892 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-utilities\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:04.651263 master-0 kubenswrapper[7642]: I0319 11:55:04.650976 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7bl2\" (UniqueName: \"kubernetes.io/projected/d979be56-f948-4252-ab6b-9bfc48dc4187-kube-api-access-w7bl2\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:04.651263 master-0 kubenswrapper[7642]: I0319 11:55:04.651010 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-catalog-content\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:04.652174 master-0 kubenswrapper[7642]: I0319 11:55:04.652124 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-catalog-content\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:04.652320 master-0 kubenswrapper[7642]: I0319 11:55:04.652303 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-utilities\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:04.957031 master-0 kubenswrapper[7642]: I0319 11:55:04.956866 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7bl2\" (UniqueName: \"kubernetes.io/projected/d979be56-f948-4252-ab6b-9bfc48dc4187-kube-api-access-w7bl2\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:05.075828 master-0 kubenswrapper[7642]: I0319 11:55:05.075757 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 11:55:05.883749 master-0 kubenswrapper[7642]: I0319 11:55:05.882264 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tp448"]
Mar 19 11:55:05.919672 master-0 kubenswrapper[7642]: I0319 11:55:05.908837 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4h84v"]
Mar 19 11:55:05.919672 master-0 kubenswrapper[7642]: I0319 11:55:05.910649 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4h84v"
Mar 19 11:55:05.919672 master-0 kubenswrapper[7642]: I0319 11:55:05.916038 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4lzgv"
Mar 19 11:55:05.938168 master-0 kubenswrapper[7642]: I0319 11:55:05.938093 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_6660a89a-cb33-4e20-aa58-93c4838e44c3/installer/0.log"
Mar 19 11:55:05.938168 master-0 kubenswrapper[7642]: I0319 11:55:05.938183 7642 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:55:05.941218 master-0 kubenswrapper[7642]: I0319 11:55:05.940753 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4h84v"] Mar 19 11:55:05.985185 master-0 kubenswrapper[7642]: I0319 11:55:05.983605 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-catalog-content\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:05.985185 master-0 kubenswrapper[7642]: I0319 11:55:05.983898 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd74j\" (UniqueName: \"kubernetes.io/projected/24bd6a9b-9934-4cf9-b591-1ab892c8014c-kube-api-access-jd74j\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:05.985185 master-0 kubenswrapper[7642]: I0319 11:55:05.983964 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-utilities\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:06.087189 master-0 kubenswrapper[7642]: I0319 11:55:06.087127 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-var-lock\") pod \"6660a89a-cb33-4e20-aa58-93c4838e44c3\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " Mar 19 11:55:06.087189 master-0 kubenswrapper[7642]: I0319 11:55:06.087194 7642 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6660a89a-cb33-4e20-aa58-93c4838e44c3-kube-api-access\") pod \"6660a89a-cb33-4e20-aa58-93c4838e44c3\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " Mar 19 11:55:06.087832 master-0 kubenswrapper[7642]: I0319 11:55:06.087280 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-kubelet-dir\") pod \"6660a89a-cb33-4e20-aa58-93c4838e44c3\" (UID: \"6660a89a-cb33-4e20-aa58-93c4838e44c3\") " Mar 19 11:55:06.087832 master-0 kubenswrapper[7642]: I0319 11:55:06.087549 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-catalog-content\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:06.087832 master-0 kubenswrapper[7642]: I0319 11:55:06.087584 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd74j\" (UniqueName: \"kubernetes.io/projected/24bd6a9b-9934-4cf9-b591-1ab892c8014c-kube-api-access-jd74j\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:06.087832 master-0 kubenswrapper[7642]: I0319 11:55:06.087609 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-utilities\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:06.088273 master-0 kubenswrapper[7642]: I0319 11:55:06.088218 7642 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-utilities\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:06.088273 master-0 kubenswrapper[7642]: I0319 11:55:06.088267 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-var-lock" (OuterVolumeSpecName: "var-lock") pod "6660a89a-cb33-4e20-aa58-93c4838e44c3" (UID: "6660a89a-cb33-4e20-aa58-93c4838e44c3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:55:06.088991 master-0 kubenswrapper[7642]: I0319 11:55:06.088913 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6660a89a-cb33-4e20-aa58-93c4838e44c3" (UID: "6660a89a-cb33-4e20-aa58-93c4838e44c3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:55:06.089056 master-0 kubenswrapper[7642]: I0319 11:55:06.089033 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-catalog-content\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:06.099241 master-0 kubenswrapper[7642]: I0319 11:55:06.098375 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6660a89a-cb33-4e20-aa58-93c4838e44c3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6660a89a-cb33-4e20-aa58-93c4838e44c3" (UID: "6660a89a-cb33-4e20-aa58-93c4838e44c3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:55:06.122239 master-0 kubenswrapper[7642]: I0319 11:55:06.122043 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd74j\" (UniqueName: \"kubernetes.io/projected/24bd6a9b-9934-4cf9-b591-1ab892c8014c-kube-api-access-jd74j\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:06.168018 master-0 kubenswrapper[7642]: I0319 11:55:06.165799 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"6660a89a-cb33-4e20-aa58-93c4838e44c3","Type":"ContainerDied","Data":"34c8be87db4beba6862175ed9b1b7317ca6d98647db2c96dcbb21d3477fb6608"} Mar 19 11:55:06.168018 master-0 kubenswrapper[7642]: I0319 11:55:06.165880 7642 scope.go:117] "RemoveContainer" containerID="5db8b9dc9974fa7fae69b5ed92758705aaeaec268deaa769d2d9d84f72d9569a" Mar 19 11:55:06.168018 master-0 kubenswrapper[7642]: I0319 11:55:06.166196 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 19 11:55:06.192916 master-0 kubenswrapper[7642]: I0319 11:55:06.189996 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:06.192916 master-0 kubenswrapper[7642]: I0319 11:55:06.190033 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6660a89a-cb33-4e20-aa58-93c4838e44c3-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:06.192916 master-0 kubenswrapper[7642]: I0319 11:55:06.190043 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6660a89a-cb33-4e20-aa58-93c4838e44c3-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:06.272016 master-0 kubenswrapper[7642]: I0319 11:55:06.271958 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:06.281060 master-0 kubenswrapper[7642]: I0319 11:55:06.279244 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:55:06.287828 master-0 kubenswrapper[7642]: I0319 11:55:06.287768 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 19 11:55:06.559808 master-0 kubenswrapper[7642]: I0319 11:55:06.557666 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cmnd7"] Mar 19 11:55:06.630791 master-0 kubenswrapper[7642]: I0319 11:55:06.630742 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ql8d"] Mar 19 11:55:06.648876 master-0 kubenswrapper[7642]: I0319 11:55:06.648792 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"] Mar 19 11:55:06.686505 master-0 kubenswrapper[7642]: I0319 11:55:06.686458 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4h84v"] Mar 19 11:55:06.749400 master-0 kubenswrapper[7642]: I0319 11:55:06.748787 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 19 11:55:07.119583 master-0 kubenswrapper[7642]: I0319 11:55:07.118905 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:55:07.129590 master-0 kubenswrapper[7642]: I0319 11:55:07.129494 7642 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:55:07.211186 master-0 kubenswrapper[7642]: I0319 11:55:07.211109 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" event={"ID":"7c54300b-f091-488f-b7e6-0fc27d0453e6","Type":"ContainerStarted","Data":"0229db963c55f230573a0fd2e55e57d74aca18f0edc7b0cc212192d8656cd381"} Mar 19 11:55:07.212287 master-0 kubenswrapper[7642]: I0319 11:55:07.212229 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:55:07.218172 master-0 kubenswrapper[7642]: I0319 11:55:07.218013 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" event={"ID":"07804be1-d37c-474e-a179-01ac541d9705","Type":"ContainerStarted","Data":"e30aedaab1a7306da7baa3b2d889ab22c8768e0b6b637ef8386a681ef31c78eb"} Mar 19 11:55:07.221623 master-0 kubenswrapper[7642]: I0319 11:55:07.220340 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:55:07.222744 master-0 kubenswrapper[7642]: I0319 11:55:07.222695 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pgl92" event={"ID":"a5cf6404-1640-43d3-8018-03c62c8338c5","Type":"ContainerStarted","Data":"0d98adb678ff894fda362481253a39cc289070f56a3c0fa8ab4f7475629152b0"} Mar 19 11:55:07.239475 master-0 kubenswrapper[7642]: I0319 11:55:07.239412 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:55:07.288873 master-0 kubenswrapper[7642]: I0319 11:55:07.287570 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" podStartSLOduration=11.185139371 podStartE2EDuration="20.287540746s" podCreationTimestamp="2026-03-19 11:54:47 +0000 UTC" firstStartedPulling="2026-03-19 11:54:56.906580047 +0000 UTC m=+55.024020834" lastFinishedPulling="2026-03-19 11:55:06.008981422 +0000 UTC m=+64.126422209" observedRunningTime="2026-03-19 11:55:07.253822455 +0000 UTC m=+65.371263262" watchObservedRunningTime="2026-03-19 11:55:07.287540746 +0000 UTC m=+65.404981533" Mar 19 11:55:07.755325 master-0 kubenswrapper[7642]: I0319 11:55:07.747945 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c499876f-wcrpp"] Mar 19 11:55:07.755325 master-0 kubenswrapper[7642]: I0319 11:55:07.748210 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" podUID="3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" containerName="controller-manager" containerID="cri-o://5d549fc56408d3e81bd303e1acc79906265312e8b58b030cd5bf088ae404228a" gracePeriod=30 Mar 19 11:55:07.755325 master-0 kubenswrapper[7642]: I0319 11:55:07.754630 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r"] Mar 19 11:55:08.078861 master-0 kubenswrapper[7642]: I0319 11:55:08.078709 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6660a89a-cb33-4e20-aa58-93c4838e44c3" path="/var/lib/kubelet/pods/6660a89a-cb33-4e20-aa58-93c4838e44c3/volumes" Mar 19 11:55:09.239351 master-0 kubenswrapper[7642]: I0319 11:55:09.239255 7642 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" podUID="7c54300b-f091-488f-b7e6-0fc27d0453e6" containerName="route-controller-manager" containerID="cri-o://0229db963c55f230573a0fd2e55e57d74aca18f0edc7b0cc212192d8656cd381" gracePeriod=30 Mar 19 11:55:11.090385 master-0 kubenswrapper[7642]: I0319 11:55:11.090299 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 19 11:55:11.091077 master-0 kubenswrapper[7642]: I0319 11:55:11.090655 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="751316a7-cd5f-4787-b24f-f1cddeb6a29e" containerName="installer" containerID="cri-o://d19e569b3b31b135d6d3279e4bc831adfb2a118c30cf3280473097e65fd37705" gracePeriod=30 Mar 19 11:55:11.515334 master-0 kubenswrapper[7642]: I0319 11:55:11.514693 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"] Mar 19 11:55:11.515334 master-0 kubenswrapper[7642]: I0319 11:55:11.514961 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" podUID="60f846d9-1b3d-4b9d-b3e0-90a238e02640" containerName="cluster-version-operator" containerID="cri-o://83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c" gracePeriod=130 Mar 19 11:55:12.917373 master-0 kubenswrapper[7642]: I0319 11:55:12.917281 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 11:55:12.918209 master-0 kubenswrapper[7642]: E0319 11:55:12.917507 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6660a89a-cb33-4e20-aa58-93c4838e44c3" containerName="installer" Mar 19 11:55:12.918209 master-0 kubenswrapper[7642]: I0319 11:55:12.917564 7642 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6660a89a-cb33-4e20-aa58-93c4838e44c3" containerName="installer" Mar 19 11:55:12.918209 master-0 kubenswrapper[7642]: I0319 11:55:12.917699 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="6660a89a-cb33-4e20-aa58-93c4838e44c3" containerName="installer" Mar 19 11:55:12.918209 master-0 kubenswrapper[7642]: I0319 11:55:12.918169 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:12.924176 master-0 kubenswrapper[7642]: I0319 11:55:12.921077 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-fbfh9" Mar 19 11:55:13.109362 master-0 kubenswrapper[7642]: I0319 11:55:13.109190 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kube-api-access\") pod \"installer-4-master-0\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:13.109362 master-0 kubenswrapper[7642]: I0319 11:55:13.109262 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:13.109362 master-0 kubenswrapper[7642]: I0319 11:55:13.109310 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-var-lock\") pod \"installer-4-master-0\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:13.211253 master-0 kubenswrapper[7642]: I0319 11:55:13.211187 
7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kube-api-access\") pod \"installer-4-master-0\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:13.211625 master-0 kubenswrapper[7642]: I0319 11:55:13.211355 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:13.211625 master-0 kubenswrapper[7642]: I0319 11:55:13.211399 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-var-lock\") pod \"installer-4-master-0\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:13.211836 master-0 kubenswrapper[7642]: I0319 11:55:13.211690 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-var-lock\") pod \"installer-4-master-0\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:13.211836 master-0 kubenswrapper[7642]: I0319 11:55:13.211694 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:13.257519 master-0 kubenswrapper[7642]: I0319 11:55:13.257452 7642 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 19 11:55:13.265841 master-0 kubenswrapper[7642]: I0319 11:55:13.265800 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kube-api-access\") pod \"installer-4-master-0\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:13.566229 master-0 kubenswrapper[7642]: I0319 11:55:13.566152 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:55:13.914918 master-0 kubenswrapper[7642]: I0319 11:55:13.914849 7642 patch_prober.go:28] interesting pod/controller-manager-67c499876f-wcrpp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused" start-of-body= Mar 19 11:55:13.915197 master-0 kubenswrapper[7642]: I0319 11:55:13.914926 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" podUID="3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused" Mar 19 11:55:16.338412 master-0 kubenswrapper[7642]: I0319 11:55:16.338296 7642 patch_prober.go:28] interesting pod/route-controller-manager-6f69766c59-b588r container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Mar 19 11:55:16.339376 master-0 kubenswrapper[7642]: I0319 11:55:16.338430 7642 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" podUID="7c54300b-f091-488f-b7e6-0fc27d0453e6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Mar 19 11:55:20.307472 master-0 kubenswrapper[7642]: I0319 11:55:20.307180 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" event={"ID":"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e","Type":"ContainerStarted","Data":"da05a78204256ccd4edc262bb9b930e6804a73e9ac544e9244d9c5d71173e2a9"} Mar 19 11:55:20.309119 master-0 kubenswrapper[7642]: I0319 11:55:20.309080 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ql8d" event={"ID":"d979be56-f948-4252-ab6b-9bfc48dc4187","Type":"ContainerStarted","Data":"b1eb508b26c0e3a04c674bb8b166ab63a387c4e22fe8322ed85b0094f61f22e8"} Mar 19 11:55:20.310351 master-0 kubenswrapper[7642]: I0319 11:55:20.310307 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmnd7" event={"ID":"c5426417-d063-49e6-9fc9-f40ccf8296dc","Type":"ContainerStarted","Data":"542bfeeda502ccab4e25ae324c702e09b8962ca93e8f8367750a8b97ef0ab952"} Mar 19 11:55:20.311951 master-0 kubenswrapper[7642]: I0319 11:55:20.311919 7642 generic.go:334] "Generic (PLEG): container finished" podID="3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" containerID="5d549fc56408d3e81bd303e1acc79906265312e8b58b030cd5bf088ae404228a" exitCode=0 Mar 19 11:55:20.312030 master-0 kubenswrapper[7642]: I0319 11:55:20.311992 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" event={"ID":"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9","Type":"ContainerDied","Data":"5d549fc56408d3e81bd303e1acc79906265312e8b58b030cd5bf088ae404228a"} Mar 19 11:55:20.313157 master-0 kubenswrapper[7642]: I0319 11:55:20.313125 
7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h84v" event={"ID":"24bd6a9b-9934-4cf9-b591-1ab892c8014c","Type":"ContainerStarted","Data":"d318202a66572196e0681affd0ab554e8fa6ff748647c2d4a3aedba6fee2ec96"} Mar 19 11:55:20.314074 master-0 kubenswrapper[7642]: I0319 11:55:20.314053 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"de6765b7-3576-4171-972b-20d62d81f25d","Type":"ContainerStarted","Data":"503bc58279f464c4d0d4ae650c56c03affe2a73598df280a9e969caf566182ed"} Mar 19 11:55:23.914898 master-0 kubenswrapper[7642]: I0319 11:55:23.914822 7642 patch_prober.go:28] interesting pod/controller-manager-67c499876f-wcrpp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused" start-of-body= Mar 19 11:55:23.915510 master-0 kubenswrapper[7642]: I0319 11:55:23.914919 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" podUID="3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.49:8443/healthz\": dial tcp 10.128.0.49:8443: connect: connection refused" Mar 19 11:55:24.138325 master-0 kubenswrapper[7642]: I0319 11:55:24.138280 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:55:24.279789 master-0 kubenswrapper[7642]: I0319 11:55:24.279745 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60f846d9-1b3d-4b9d-b3e0-90a238e02640-service-ca\") pod \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") "
Mar 19 11:55:24.280103 master-0 kubenswrapper[7642]: I0319 11:55:24.280084 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-cvo-updatepayloads\") pod \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") "
Mar 19 11:55:24.280384 master-0 kubenswrapper[7642]: I0319 11:55:24.280367 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "60f846d9-1b3d-4b9d-b3e0-90a238e02640" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:55:24.280502 master-0 kubenswrapper[7642]: I0319 11:55:24.280487 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "60f846d9-1b3d-4b9d-b3e0-90a238e02640" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:55:24.280561 master-0 kubenswrapper[7642]: I0319 11:55:24.280300 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-ssl-certs\") pod \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") "
Mar 19 11:55:24.280672 master-0 kubenswrapper[7642]: I0319 11:55:24.280657 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") pod \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") "
Mar 19 11:55:24.280779 master-0 kubenswrapper[7642]: I0319 11:55:24.280536 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f846d9-1b3d-4b9d-b3e0-90a238e02640-service-ca" (OuterVolumeSpecName: "service-ca") pod "60f846d9-1b3d-4b9d-b3e0-90a238e02640" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:55:24.280869 master-0 kubenswrapper[7642]: I0319 11:55:24.280850 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f846d9-1b3d-4b9d-b3e0-90a238e02640-kube-api-access\") pod \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\" (UID: \"60f846d9-1b3d-4b9d-b3e0-90a238e02640\") "
Mar 19 11:55:24.281273 master-0 kubenswrapper[7642]: I0319 11:55:24.281258 7642 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/60f846d9-1b3d-4b9d-b3e0-90a238e02640-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:24.281394 master-0 kubenswrapper[7642]: I0319 11:55:24.281330 7642 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:24.281456 master-0 kubenswrapper[7642]: I0319 11:55:24.281445 7642 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/60f846d9-1b3d-4b9d-b3e0-90a238e02640-etc-ssl-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:24.292007 master-0 kubenswrapper[7642]: I0319 11:55:24.291780 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f846d9-1b3d-4b9d-b3e0-90a238e02640-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "60f846d9-1b3d-4b9d-b3e0-90a238e02640" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:55:24.308594 master-0 kubenswrapper[7642]: I0319 11:55:24.305960 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "60f846d9-1b3d-4b9d-b3e0-90a238e02640" (UID: "60f846d9-1b3d-4b9d-b3e0-90a238e02640"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:55:24.357466 master-0 kubenswrapper[7642]: I0319 11:55:24.357426 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-q46ht_d263ddac-1ede-474f-b3bb-1f1f4f7846a3/openshift-controller-manager-operator/0.log"
Mar 19 11:55:24.357574 master-0 kubenswrapper[7642]: I0319 11:55:24.357496 7642 generic.go:334] "Generic (PLEG): container finished" podID="d263ddac-1ede-474f-b3bb-1f1f4f7846a3" containerID="1bd7cb8f485e09b49dad795958f587384c11e7992bcea8c49f1cbb606f9399d9" exitCode=1
Mar 19 11:55:24.358780 master-0 kubenswrapper[7642]: I0319 11:55:24.357586 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" event={"ID":"d263ddac-1ede-474f-b3bb-1f1f4f7846a3","Type":"ContainerDied","Data":"1bd7cb8f485e09b49dad795958f587384c11e7992bcea8c49f1cbb606f9399d9"}
Mar 19 11:55:24.359480 master-0 kubenswrapper[7642]: I0319 11:55:24.359359 7642 scope.go:117] "RemoveContainer" containerID="1bd7cb8f485e09b49dad795958f587384c11e7992bcea8c49f1cbb606f9399d9"
Mar 19 11:55:24.359988 master-0 kubenswrapper[7642]: I0319 11:55:24.359954 7642 generic.go:334] "Generic (PLEG): container finished" podID="b0133cb5-e425-40af-a051-71b2362a89c3" containerID="daecc00cd0269edf3ea7180e1cd52281d3c5b230d8b9db4f7b6d733857fc8257" exitCode=0
Mar 19 11:55:24.361998 master-0 kubenswrapper[7642]: I0319 11:55:24.361952 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerDied","Data":"daecc00cd0269edf3ea7180e1cd52281d3c5b230d8b9db4f7b6d733857fc8257"}
Mar 19 11:55:24.362441 master-0 kubenswrapper[7642]: I0319 11:55:24.362356 7642 scope.go:117] "RemoveContainer" containerID="daecc00cd0269edf3ea7180e1cd52281d3c5b230d8b9db4f7b6d733857fc8257"
Mar 19 11:55:24.377094 master-0 kubenswrapper[7642]: I0319 11:55:24.376992 7642 generic.go:334] "Generic (PLEG): container finished" podID="84e5577e-df8b-46a2-aff7-0bf90b5010d9" containerID="dd2920bde9ceef973c55e7d085b8480289be3211f9caba169d896a74981242f4" exitCode=0
Mar 19 11:55:24.377246 master-0 kubenswrapper[7642]: I0319 11:55:24.377170 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerDied","Data":"dd2920bde9ceef973c55e7d085b8480289be3211f9caba169d896a74981242f4"}
Mar 19 11:55:24.384723 master-0 kubenswrapper[7642]: I0319 11:55:24.379763 7642 scope.go:117] "RemoveContainer" containerID="dd2920bde9ceef973c55e7d085b8480289be3211f9caba169d896a74981242f4"
Mar 19 11:55:24.384723 master-0 kubenswrapper[7642]: I0319 11:55:24.384041 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f846d9-1b3d-4b9d-b3e0-90a238e02640-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:24.384723 master-0 kubenswrapper[7642]: I0319 11:55:24.384070 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f846d9-1b3d-4b9d-b3e0-90a238e02640-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:24.433216 master-0 kubenswrapper[7642]: I0319 11:55:24.433168 7642 generic.go:334] "Generic (PLEG): container finished" podID="ac946049-4b93-4d8c-bfa6-eacf83160ed1" containerID="d611dd027fd146af69473e5540a6fda94c8c6de39cd053174307e17bd337e0ce" exitCode=0
Mar 19 11:55:24.433422 master-0 kubenswrapper[7642]: I0319 11:55:24.433278 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerDied","Data":"d611dd027fd146af69473e5540a6fda94c8c6de39cd053174307e17bd337e0ce"}
Mar 19 11:55:24.433772 master-0 kubenswrapper[7642]: I0319 11:55:24.433730 7642 scope.go:117] "RemoveContainer" containerID="d611dd027fd146af69473e5540a6fda94c8c6de39cd053174307e17bd337e0ce"
Mar 19 11:55:24.458484 master-0 kubenswrapper[7642]: I0319 11:55:24.458442 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_751316a7-cd5f-4787-b24f-f1cddeb6a29e/installer/0.log"
Mar 19 11:55:24.458774 master-0 kubenswrapper[7642]: I0319 11:55:24.458733 7642 generic.go:334] "Generic (PLEG): container finished" podID="751316a7-cd5f-4787-b24f-f1cddeb6a29e" containerID="d19e569b3b31b135d6d3279e4bc831adfb2a118c30cf3280473097e65fd37705" exitCode=1
Mar 19 11:55:24.458936 master-0 kubenswrapper[7642]: I0319 11:55:24.458916 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"751316a7-cd5f-4787-b24f-f1cddeb6a29e","Type":"ContainerDied","Data":"d19e569b3b31b135d6d3279e4bc831adfb2a118c30cf3280473097e65fd37705"}
Mar 19 11:55:24.471042 master-0 kubenswrapper[7642]: I0319 11:55:24.470945 7642 generic.go:334] "Generic (PLEG): container finished" podID="162b513f-2f67-4946-816b-48f6232dfca0" containerID="a82bb27ad062e4cf63ae115331a67a108d2b42bc563ab5c88f2c0b48ab45fdcd" exitCode=0
Mar 19 11:55:24.471262 master-0 kubenswrapper[7642]: I0319 11:55:24.471237 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" event={"ID":"162b513f-2f67-4946-816b-48f6232dfca0","Type":"ContainerDied","Data":"a82bb27ad062e4cf63ae115331a67a108d2b42bc563ab5c88f2c0b48ab45fdcd"}
Mar 19 11:55:24.472506 master-0 kubenswrapper[7642]: I0319 11:55:24.472489 7642 scope.go:117] "RemoveContainer" containerID="a82bb27ad062e4cf63ae115331a67a108d2b42bc563ab5c88f2c0b48ab45fdcd"
Mar 19 11:55:24.493459 master-0 kubenswrapper[7642]: I0319 11:55:24.492837 7642 generic.go:334] "Generic (PLEG): container finished" podID="60f846d9-1b3d-4b9d-b3e0-90a238e02640" containerID="83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c" exitCode=0
Mar 19 11:55:24.493459 master-0 kubenswrapper[7642]: I0319 11:55:24.492964 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" event={"ID":"60f846d9-1b3d-4b9d-b3e0-90a238e02640","Type":"ContainerDied","Data":"83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c"}
Mar 19 11:55:24.493459 master-0 kubenswrapper[7642]: I0319 11:55:24.493001 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc" event={"ID":"60f846d9-1b3d-4b9d-b3e0-90a238e02640","Type":"ContainerDied","Data":"8e4e3b487bb0051c9d1f17e8f20f2f36c1feaa2ba16b77ca2fa3242c79022839"}
Mar 19 11:55:24.493459 master-0 kubenswrapper[7642]: I0319 11:55:24.493026 7642 scope.go:117] "RemoveContainer" containerID="83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c"
Mar 19 11:55:24.493459 master-0 kubenswrapper[7642]: I0319 11:55:24.493182 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"
Mar 19 11:55:24.499983 master-0 kubenswrapper[7642]: I0319 11:55:24.498595 7642 generic.go:334] "Generic (PLEG): container finished" podID="7c54300b-f091-488f-b7e6-0fc27d0453e6" containerID="0229db963c55f230573a0fd2e55e57d74aca18f0edc7b0cc212192d8656cd381" exitCode=0
Mar 19 11:55:24.499983 master-0 kubenswrapper[7642]: I0319 11:55:24.498688 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" event={"ID":"7c54300b-f091-488f-b7e6-0fc27d0453e6","Type":"ContainerDied","Data":"0229db963c55f230573a0fd2e55e57d74aca18f0edc7b0cc212192d8656cd381"}
Mar 19 11:55:24.534566 master-0 kubenswrapper[7642]: I0319 11:55:24.534499 7642 generic.go:334] "Generic (PLEG): container finished" podID="3f3c0234-248a-4d6f-9aca-6582a481a911" containerID="b9318bc6665c43cc333e2980698d6c509e0e61367154c5acfc00fbbe7da6a9de" exitCode=0
Mar 19 11:55:24.534794 master-0 kubenswrapper[7642]: I0319 11:55:24.534569 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerDied","Data":"b9318bc6665c43cc333e2980698d6c509e0e61367154c5acfc00fbbe7da6a9de"}
Mar 19 11:55:24.535516 master-0 kubenswrapper[7642]: I0319 11:55:24.535432 7642 scope.go:117] "RemoveContainer" containerID="b9318bc6665c43cc333e2980698d6c509e0e61367154c5acfc00fbbe7da6a9de"
Mar 19 11:55:24.666505 master-0 kubenswrapper[7642]: I0319 11:55:24.666456 7642 scope.go:117] "RemoveContainer" containerID="83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c"
Mar 19 11:55:24.673116 master-0 kubenswrapper[7642]: E0319 11:55:24.673076 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c\": container with ID starting with 83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c not found: ID does not exist" containerID="83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c"
Mar 19 11:55:24.673191 master-0 kubenswrapper[7642]: I0319 11:55:24.673118 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c"} err="failed to get container status \"83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c\": rpc error: code = NotFound desc = could not find container \"83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c\": container with ID starting with 83fb187aad06bab0ba18f4f069fad0a0fa61878618ee6814d1c0de20e0b2e60c not found: ID does not exist"
Mar 19 11:55:24.935195 master-0 kubenswrapper[7642]: I0319 11:55:24.935026 7642 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 11:55:25.016001 master-0 kubenswrapper[7642]: I0319 11:55:25.010387 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r"
Mar 19 11:55:25.016001 master-0 kubenswrapper[7642]: I0319 11:55:25.014484 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_751316a7-cd5f-4787-b24f-f1cddeb6a29e/installer/0.log"
Mar 19 11:55:25.016001 master-0 kubenswrapper[7642]: I0319 11:55:25.014575 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 19 11:55:25.018232 master-0 kubenswrapper[7642]: I0319 11:55:25.018024 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"]
Mar 19 11:55:25.023568 master-0 kubenswrapper[7642]: I0319 11:55:25.023171 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-56d8475767-nvbgc"]
Mar 19 11:55:25.045219 master-0 kubenswrapper[7642]: I0319 11:55:25.045118 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"]
Mar 19 11:55:25.050611 master-0 kubenswrapper[7642]: I0319 11:55:25.050257 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"]
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.106941 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-config\") pod \"7c54300b-f091-488f-b7e6-0fc27d0453e6\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") "
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.107062 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-client-ca\") pod \"7c54300b-f091-488f-b7e6-0fc27d0453e6\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") "
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.107156 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c54300b-f091-488f-b7e6-0fc27d0453e6-serving-cert\") pod \"7c54300b-f091-488f-b7e6-0fc27d0453e6\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") "
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.107183 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kubelet-dir\") pod \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") "
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.107259 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kube-api-access\") pod \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") "
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.107327 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-var-lock\") pod \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\" (UID: \"751316a7-cd5f-4787-b24f-f1cddeb6a29e\") "
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.107355 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtrk8\" (UniqueName: \"kubernetes.io/projected/7c54300b-f091-488f-b7e6-0fc27d0453e6-kube-api-access-qtrk8\") pod \"7c54300b-f091-488f-b7e6-0fc27d0453e6\" (UID: \"7c54300b-f091-488f-b7e6-0fc27d0453e6\") "
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.108399 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-config" (OuterVolumeSpecName: "config") pod "7c54300b-f091-488f-b7e6-0fc27d0453e6" (UID: "7c54300b-f091-488f-b7e6-0fc27d0453e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.108749 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "751316a7-cd5f-4787-b24f-f1cddeb6a29e" (UID: "751316a7-cd5f-4787-b24f-f1cddeb6a29e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.108781 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-var-lock" (OuterVolumeSpecName: "var-lock") pod "751316a7-cd5f-4787-b24f-f1cddeb6a29e" (UID: "751316a7-cd5f-4787-b24f-f1cddeb6a29e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.111968 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-client-ca" (OuterVolumeSpecName: "client-ca") pod "7c54300b-f091-488f-b7e6-0fc27d0453e6" (UID: "7c54300b-f091-488f-b7e6-0fc27d0453e6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:55:25.113706 master-0 kubenswrapper[7642]: I0319 11:55:25.112849 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c54300b-f091-488f-b7e6-0fc27d0453e6-kube-api-access-qtrk8" (OuterVolumeSpecName: "kube-api-access-qtrk8") pod "7c54300b-f091-488f-b7e6-0fc27d0453e6" (UID: "7c54300b-f091-488f-b7e6-0fc27d0453e6"). InnerVolumeSpecName "kube-api-access-qtrk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:55:25.114522 master-0 kubenswrapper[7642]: I0319 11:55:25.114010 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c54300b-f091-488f-b7e6-0fc27d0453e6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7c54300b-f091-488f-b7e6-0fc27d0453e6" (UID: "7c54300b-f091-488f-b7e6-0fc27d0453e6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:55:25.117424 master-0 kubenswrapper[7642]: I0319 11:55:25.117231 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "751316a7-cd5f-4787-b24f-f1cddeb6a29e" (UID: "751316a7-cd5f-4787-b24f-f1cddeb6a29e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: I0319 11:55:25.123304 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"]
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: E0319 11:55:25.123596 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f846d9-1b3d-4b9d-b3e0-90a238e02640" containerName="cluster-version-operator"
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: I0319 11:55:25.123610 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f846d9-1b3d-4b9d-b3e0-90a238e02640" containerName="cluster-version-operator"
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: E0319 11:55:25.123622 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c54300b-f091-488f-b7e6-0fc27d0453e6" containerName="route-controller-manager"
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: I0319 11:55:25.123629 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c54300b-f091-488f-b7e6-0fc27d0453e6" containerName="route-controller-manager"
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: E0319 11:55:25.124746 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751316a7-cd5f-4787-b24f-f1cddeb6a29e" containerName="installer"
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: I0319 11:55:25.124756 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="751316a7-cd5f-4787-b24f-f1cddeb6a29e" containerName="installer"
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: I0319 11:55:25.124881 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c54300b-f091-488f-b7e6-0fc27d0453e6" containerName="route-controller-manager"
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: I0319 11:55:25.124894 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f846d9-1b3d-4b9d-b3e0-90a238e02640" containerName="cluster-version-operator"
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: I0319 11:55:25.124905 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="751316a7-cd5f-4787-b24f-f1cddeb6a29e" containerName="installer"
Mar 19 11:55:25.126453 master-0 kubenswrapper[7642]: I0319 11:55:25.125343 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.132788 master-0 kubenswrapper[7642]: I0319 11:55:25.131882 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 11:55:25.132788 master-0 kubenswrapper[7642]: I0319 11:55:25.132285 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 11:55:25.132788 master-0 kubenswrapper[7642]: I0319 11:55:25.132297 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 11:55:25.208373 master-0 kubenswrapper[7642]: I0319 11:55:25.208333 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f65be54c-d851-4daa-93a0-8a3947da5e26-service-ca\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208380 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208415 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65be54c-d851-4daa-93a0-8a3947da5e26-serving-cert\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208445 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65be54c-d851-4daa-93a0-8a3947da5e26-kube-api-access\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208478 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208702 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208793 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtrk8\" (UniqueName: \"kubernetes.io/projected/7c54300b-f091-488f-b7e6-0fc27d0453e6-kube-api-access-qtrk8\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208829 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208842 7642 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c54300b-f091-488f-b7e6-0fc27d0453e6-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208855 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c54300b-f091-488f-b7e6-0fc27d0453e6-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208869 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:25.209067 master-0 kubenswrapper[7642]: I0319 11:55:25.208979 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/751316a7-cd5f-4787-b24f-f1cddeb6a29e-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:25.255786 master-0 kubenswrapper[7642]: I0319 11:55:25.252924 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp"
Mar 19 11:55:25.312499 master-0 kubenswrapper[7642]: I0319 11:55:25.311302 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f65be54c-d851-4daa-93a0-8a3947da5e26-service-ca\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.312499 master-0 kubenswrapper[7642]: I0319 11:55:25.311378 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.312499 master-0 kubenswrapper[7642]: I0319 11:55:25.311439 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65be54c-d851-4daa-93a0-8a3947da5e26-serving-cert\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.312499 master-0 kubenswrapper[7642]: I0319 11:55:25.311471 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65be54c-d851-4daa-93a0-8a3947da5e26-kube-api-access\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.312499 master-0 kubenswrapper[7642]: I0319 11:55:25.311517 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.312499 master-0 kubenswrapper[7642]: I0319 11:55:25.311630 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.313239 master-0 kubenswrapper[7642]: I0319 11:55:25.312866 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.313239 master-0 kubenswrapper[7642]: I0319 11:55:25.313114 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f65be54c-d851-4daa-93a0-8a3947da5e26-service-ca\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.324178 master-0 kubenswrapper[7642]: I0319 11:55:25.324104 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65be54c-d851-4daa-93a0-8a3947da5e26-serving-cert\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.335097 master-0 kubenswrapper[7642]: I0319 11:55:25.335047 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65be54c-d851-4daa-93a0-8a3947da5e26-kube-api-access\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.410063 master-0 kubenswrapper[7642]: I0319 11:55:25.409627 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 11:55:25.418088 master-0 kubenswrapper[7642]: I0319 11:55:25.413628 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-serving-cert\") pod \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") "
Mar 19 11:55:25.418088 master-0 kubenswrapper[7642]: I0319 11:55:25.413723 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kjhn\" (UniqueName: \"kubernetes.io/projected/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-kube-api-access-9kjhn\") pod \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") "
Mar 19 11:55:25.418088 master-0 kubenswrapper[7642]: I0319 11:55:25.413755 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-client-ca\") pod \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") "
Mar 19 11:55:25.418088 master-0 kubenswrapper[7642]: I0319 11:55:25.413784 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-config\") pod \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") "
Mar 19 11:55:25.418088 master-0 kubenswrapper[7642]: I0319 11:55:25.414030 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-proxy-ca-bundles\") pod \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\" (UID: \"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9\") "
Mar 19 11:55:25.418088 master-0 kubenswrapper[7642]: I0319 11:55:25.414999 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" (UID: "3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:55:25.418088 master-0 kubenswrapper[7642]: I0319 11:55:25.415843 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-client-ca" (OuterVolumeSpecName: "client-ca") pod "3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" (UID: "3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:55:25.419334 master-0 kubenswrapper[7642]: I0319 11:55:25.419285 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-config" (OuterVolumeSpecName: "config") pod "3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" (UID: "3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:55:25.448675 master-0 kubenswrapper[7642]: I0319 11:55:25.439327 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" (UID: "3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:55:25.448675 master-0 kubenswrapper[7642]: I0319 11:55:25.443023 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-kube-api-access-9kjhn" (OuterVolumeSpecName: "kube-api-access-9kjhn") pod "3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" (UID: "3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9"). InnerVolumeSpecName "kube-api-access-9kjhn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:55:25.518453 master-0 kubenswrapper[7642]: I0319 11:55:25.515609 7642 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:25.518453 master-0 kubenswrapper[7642]: I0319 11:55:25.516901 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kjhn\" (UniqueName: \"kubernetes.io/projected/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-kube-api-access-9kjhn\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:25.518453 master-0 kubenswrapper[7642]: I0319 11:55:25.516924 7642 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 11:55:25.518453 master-0 kubenswrapper[7642]: I0319 11:55:25.516943 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:25.518453 master-0 kubenswrapper[7642]: I0319 11:55:25.516956 7642 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:25.564914 master-0 kubenswrapper[7642]: I0319 11:55:25.564828 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"de6765b7-3576-4171-972b-20d62d81f25d","Type":"ContainerStarted","Data":"f6792bda14390e5172a259445357feef34047b3b4035a2f72b67e0cd845a049a"} Mar 19 11:55:25.583384 master-0 kubenswrapper[7642]: I0319 11:55:25.581849 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerStarted","Data":"8599681efde4807c1ee51fb15ce18a51bf6eae443d3b45fdd6b7cd611d18283f"} Mar 19 11:55:25.597244 master-0 kubenswrapper[7642]: I0319 11:55:25.597066 7642 generic.go:334] "Generic (PLEG): container finished" podID="d979be56-f948-4252-ab6b-9bfc48dc4187" containerID="9340a8d37e83e7908c779e917b656b3529e86ab0084fd52b34de9d95fe7b99e9" exitCode=0 Mar 19 11:55:25.597244 master-0 kubenswrapper[7642]: I0319 11:55:25.597161 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ql8d" event={"ID":"d979be56-f948-4252-ab6b-9bfc48dc4187","Type":"ContainerDied","Data":"9340a8d37e83e7908c779e917b656b3529e86ab0084fd52b34de9d95fe7b99e9"} Mar 19 11:55:25.603366 master-0 kubenswrapper[7642]: I0319 11:55:25.603315 7642 generic.go:334] "Generic (PLEG): container finished" podID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerID="6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f" exitCode=0 Mar 19 
11:55:25.603456 master-0 kubenswrapper[7642]: I0319 11:55:25.603423 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmnd7" event={"ID":"c5426417-d063-49e6-9fc9-f40ccf8296dc","Type":"ContainerDied","Data":"6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f"} Mar 19 11:55:25.604971 master-0 kubenswrapper[7642]: I0319 11:55:25.604806 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=24.6047921 podStartE2EDuration="24.6047921s" podCreationTimestamp="2026-03-19 11:55:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:55:25.604022758 +0000 UTC m=+83.721463555" watchObservedRunningTime="2026-03-19 11:55:25.6047921 +0000 UTC m=+83.722232887" Mar 19 11:55:25.637748 master-0 kubenswrapper[7642]: I0319 11:55:25.634536 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerStarted","Data":"0cc3ae6879b287fcc4bae5ddccb1f2e7e024042e8efb6dfd62d69b2efa286229"} Mar 19 11:55:25.645689 master-0 kubenswrapper[7642]: I0319 11:55:25.643412 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" event={"ID":"3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9","Type":"ContainerDied","Data":"294c8e92aed8170867fe457688347ef2302a1dc291b0d68c0ca5a9868628adf8"} Mar 19 11:55:25.645689 master-0 kubenswrapper[7642]: I0319 11:55:25.643521 7642 scope.go:117] "RemoveContainer" containerID="5d549fc56408d3e81bd303e1acc79906265312e8b58b030cd5bf088ae404228a" Mar 19 11:55:25.645689 master-0 kubenswrapper[7642]: I0319 11:55:25.643709 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67c499876f-wcrpp" Mar 19 11:55:25.654907 master-0 kubenswrapper[7642]: I0319 11:55:25.654206 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerStarted","Data":"6ec837485683b9ef9f8903f02c33b71f9f384c302a364201ed3b85fde2544f12"} Mar 19 11:55:25.666113 master-0 kubenswrapper[7642]: I0319 11:55:25.664029 7642 generic.go:334] "Generic (PLEG): container finished" podID="24bd6a9b-9934-4cf9-b591-1ab892c8014c" containerID="237540f70e2c29f3039bdaf2115302712ba9278f06779f1d068dbc0f51f6e8cd" exitCode=0 Mar 19 11:55:25.666113 master-0 kubenswrapper[7642]: I0319 11:55:25.664115 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h84v" event={"ID":"24bd6a9b-9934-4cf9-b591-1ab892c8014c","Type":"ContainerDied","Data":"237540f70e2c29f3039bdaf2115302712ba9278f06779f1d068dbc0f51f6e8cd"} Mar 19 11:55:25.725105 master-0 kubenswrapper[7642]: I0319 11:55:25.722870 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7tj9" event={"ID":"6191b9b7-8e43-4006-ab25-734c23d34cf6","Type":"ContainerStarted","Data":"8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db"} Mar 19 11:55:25.725105 master-0 kubenswrapper[7642]: I0319 11:55:25.723349 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z7tj9" podUID="6191b9b7-8e43-4006-ab25-734c23d34cf6" containerName="extract-content" containerID="cri-o://8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db" gracePeriod=2 Mar 19 11:55:25.746671 master-0 kubenswrapper[7642]: I0319 11:55:25.743185 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" 
event={"ID":"7c54300b-f091-488f-b7e6-0fc27d0453e6","Type":"ContainerDied","Data":"dfc938e0e6a4a3e0f7d0726d217078fc3d8cea145d7766d9bcc12fedb3b9c430"} Mar 19 11:55:25.746671 master-0 kubenswrapper[7642]: I0319 11:55:25.743252 7642 scope.go:117] "RemoveContainer" containerID="0229db963c55f230573a0fd2e55e57d74aca18f0edc7b0cc212192d8656cd381" Mar 19 11:55:25.746671 master-0 kubenswrapper[7642]: I0319 11:55:25.743693 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r" Mar 19 11:55:25.769276 master-0 kubenswrapper[7642]: I0319 11:55:25.768273 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerStarted","Data":"a5cb97fd211a0ea441e72f2a15543fd5127539a45c24f3604c7d9c65dbb4100e"} Mar 19 11:55:25.775655 master-0 kubenswrapper[7642]: I0319 11:55:25.773189 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" event={"ID":"162b513f-2f67-4946-816b-48f6232dfca0","Type":"ContainerStarted","Data":"2ae789390c2dc2661b322d4ce95cf71fe928a1384e00118bd9c25a6d60f35352"} Mar 19 11:55:25.778876 master-0 kubenswrapper[7642]: I0319 11:55:25.776318 7642 generic.go:334] "Generic (PLEG): container finished" podID="42edeefc-c920-4cd7-88b0-8c1e660abf63" containerID="d283f45f04d7e8cff0f80d74c472dec27a65f0c7bc53da03218b4dd93a10eecd" exitCode=0 Mar 19 11:55:25.778876 master-0 kubenswrapper[7642]: I0319 11:55:25.776463 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tp448" event={"ID":"42edeefc-c920-4cd7-88b0-8c1e660abf63","Type":"ContainerDied","Data":"d283f45f04d7e8cff0f80d74c472dec27a65f0c7bc53da03218b4dd93a10eecd"} Mar 19 11:55:25.792337 master-0 kubenswrapper[7642]: I0319 11:55:25.792194 7642 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-q46ht_d263ddac-1ede-474f-b3bb-1f1f4f7846a3/openshift-controller-manager-operator/0.log" Mar 19 11:55:25.792595 master-0 kubenswrapper[7642]: I0319 11:55:25.792381 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" event={"ID":"d263ddac-1ede-474f-b3bb-1f1f4f7846a3","Type":"ContainerStarted","Data":"7fc3d4238bc0fa4082c65174aac64034c89fc65c891756149ed46eacde6bb87a"} Mar 19 11:55:25.803668 master-0 kubenswrapper[7642]: I0319 11:55:25.796188 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"c0d8ad30-feb0-453f-b652-36aafc7b24f9","Type":"ContainerStarted","Data":"ab9b7175a769b8cf6fa9915cafc7ebea7b9ffcfdeed5fdbb917a19708d7ff10e"} Mar 19 11:55:25.803668 master-0 kubenswrapper[7642]: I0319 11:55:25.798567 7642 generic.go:334] "Generic (PLEG): container finished" podID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerID="ceb2bdf6c11a6f05383bb7f4043b30c9b6ea152730f75477c91bd1a079edfb21" exitCode=0 Mar 19 11:55:25.803668 master-0 kubenswrapper[7642]: I0319 11:55:25.798621 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vj6x4" event={"ID":"aedae669-04fb-44e7-afac-03bd0bfae4e9","Type":"ContainerDied","Data":"ceb2bdf6c11a6f05383bb7f4043b30c9b6ea152730f75477c91bd1a079edfb21"} Mar 19 11:55:25.803668 master-0 kubenswrapper[7642]: I0319 11:55:25.800830 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" event={"ID":"f65be54c-d851-4daa-93a0-8a3947da5e26","Type":"ContainerStarted","Data":"294f59c88ee9d0ef672e140bfaeda4c298629a073fa82f34bad6f87890b02c82"} Mar 19 11:55:25.803668 master-0 kubenswrapper[7642]: I0319 11:55:25.802824 7642 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-dns/dns-default-pgl92" event={"ID":"a5cf6404-1640-43d3-8018-03c62c8338c5","Type":"ContainerStarted","Data":"d42385be491bfadd3c061b3f36ad26a198c0870f8b5d2bbe763f20ae41fee79a"} Mar 19 11:55:25.804629 master-0 kubenswrapper[7642]: I0319 11:55:25.803712 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pgl92" Mar 19 11:55:25.816779 master-0 kubenswrapper[7642]: I0319 11:55:25.813220 7642 generic.go:334] "Generic (PLEG): container finished" podID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerID="d8055526d32c6d4c79231e8f49c9d8851d0cddeac9c1e90af414749e5818720e" exitCode=0 Mar 19 11:55:25.816779 master-0 kubenswrapper[7642]: I0319 11:55:25.815607 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pgl92" Mar 19 11:55:25.816779 master-0 kubenswrapper[7642]: I0319 11:55:25.815688 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" event={"ID":"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e","Type":"ContainerDied","Data":"d8055526d32c6d4c79231e8f49c9d8851d0cddeac9c1e90af414749e5818720e"} Mar 19 11:55:25.838227 master-0 kubenswrapper[7642]: I0319 11:55:25.821570 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_751316a7-cd5f-4787-b24f-f1cddeb6a29e/installer/0.log" Mar 19 11:55:25.838227 master-0 kubenswrapper[7642]: I0319 11:55:25.822159 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 19 11:55:25.838227 master-0 kubenswrapper[7642]: I0319 11:55:25.823056 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"751316a7-cd5f-4787-b24f-f1cddeb6a29e","Type":"ContainerDied","Data":"8dc3df7d917e038d3431655b50e062ea05b3b2442a7a61484be684526e4eb3a5"} Mar 19 11:55:25.838227 master-0 kubenswrapper[7642]: I0319 11:55:25.825247 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" event={"ID":"94fa5656-0561-45ab-9ab1-a6b96b8a47d4","Type":"ContainerStarted","Data":"0e4b6062f9bb4931c27706fc37084b485dc72bc8ae6a89fcc0aec8ff5bbecaef"} Mar 19 11:55:25.838227 master-0 kubenswrapper[7642]: I0319 11:55:25.825292 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" event={"ID":"94fa5656-0561-45ab-9ab1-a6b96b8a47d4","Type":"ContainerStarted","Data":"9dd3eeed986a63782457294b98f428fdf5ac5bd98b40cab991676824cbd680b6"} Mar 19 11:55:25.838227 master-0 kubenswrapper[7642]: I0319 11:55:25.825835 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:55:25.905126 master-0 kubenswrapper[7642]: I0319 11:55:25.903805 7642 scope.go:117] "RemoveContainer" containerID="d19e569b3b31b135d6d3279e4bc831adfb2a118c30cf3280473097e65fd37705" Mar 19 11:55:25.948397 master-0 kubenswrapper[7642]: I0319 11:55:25.947283 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c499876f-wcrpp"] Mar 19 11:55:25.962141 master-0 kubenswrapper[7642]: I0319 11:55:25.962001 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67c499876f-wcrpp"] Mar 19 11:55:26.051425 master-0 
kubenswrapper[7642]: I0319 11:55:26.049683 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=14.049654316 podStartE2EDuration="14.049654316s" podCreationTimestamp="2026-03-19 11:55:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:55:26.038091979 +0000 UTC m=+84.155532766" watchObservedRunningTime="2026-03-19 11:55:26.049654316 +0000 UTC m=+84.167095103" Mar 19 11:55:26.063237 master-0 kubenswrapper[7642]: I0319 11:55:26.063145 7642 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 19 11:55:26.078847 master-0 kubenswrapper[7642]: I0319 11:55:26.078777 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" path="/var/lib/kubelet/pods/3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9/volumes" Mar 19 11:55:26.079843 master-0 kubenswrapper[7642]: I0319 11:55:26.079721 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f846d9-1b3d-4b9d-b3e0-90a238e02640" path="/var/lib/kubelet/pods/60f846d9-1b3d-4b9d-b3e0-90a238e02640/volumes" Mar 19 11:55:26.081274 master-0 kubenswrapper[7642]: I0319 11:55:26.080111 7642 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 19 11:55:26.081274 master-0 kubenswrapper[7642]: E0319 11:55:26.080291 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 19 11:55:26.081274 master-0 kubenswrapper[7642]: I0319 11:55:26.080302 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 19 11:55:26.081274 master-0 kubenswrapper[7642]: E0319 11:55:26.080329 7642 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" containerName="controller-manager" Mar 19 11:55:26.081274 master-0 kubenswrapper[7642]: I0319 11:55:26.080336 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" containerName="controller-manager" Mar 19 11:55:26.081274 master-0 kubenswrapper[7642]: E0319 11:55:26.080351 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 19 11:55:26.081274 master-0 kubenswrapper[7642]: I0319 11:55:26.080359 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 19 11:55:26.081274 master-0 kubenswrapper[7642]: I0319 11:55:26.080458 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ccfe3c2-cb10-47be-8cf2-d28beb24bdd9" containerName="controller-manager" Mar 19 11:55:26.081274 master-0 kubenswrapper[7642]: I0319 11:55:26.080468 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" Mar 19 11:55:26.081274 master-0 kubenswrapper[7642]: I0319 11:55:26.080483 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" Mar 19 11:55:26.085227 master-0 kubenswrapper[7642]: I0319 11:55:26.084749 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.101158 master-0 kubenswrapper[7642]: I0319 11:55:26.100063 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcdctl" containerID="cri-o://17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab" gracePeriod=30 Mar 19 11:55:26.101158 master-0 kubenswrapper[7642]: I0319 11:55:26.100335 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="d664a6d0d2a24360dee10612610f1b59" containerName="etcd" containerID="cri-o://feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e" gracePeriod=30 Mar 19 11:55:26.254591 master-0 kubenswrapper[7642]: I0319 11:55:26.254536 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.254591 master-0 kubenswrapper[7642]: I0319 11:55:26.254591 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.254734 master-0 kubenswrapper[7642]: I0319 11:55:26.254615 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.254734 master-0 kubenswrapper[7642]: I0319 11:55:26.254707 7642 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.254734 master-0 kubenswrapper[7642]: I0319 11:55:26.254732 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.254833 master-0 kubenswrapper[7642]: I0319 11:55:26.254755 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.313715 master-0 kubenswrapper[7642]: I0319 11:55:26.300891 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:55:26.357537 master-0 kubenswrapper[7642]: I0319 11:55:26.357393 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357725 master-0 kubenswrapper[7642]: I0319 11:55:26.357591 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357725 master-0 kubenswrapper[7642]: I0319 11:55:26.357707 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357849 master-0 kubenswrapper[7642]: I0319 11:55:26.357745 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357849 master-0 kubenswrapper[7642]: I0319 11:55:26.357716 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357849 master-0 kubenswrapper[7642]: I0319 11:55:26.357811 7642 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357849 master-0 kubenswrapper[7642]: I0319 11:55:26.357833 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357849 master-0 kubenswrapper[7642]: I0319 11:55:26.357846 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357997 master-0 kubenswrapper[7642]: I0319 11:55:26.357861 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357997 master-0 kubenswrapper[7642]: I0319 11:55:26.357878 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357997 master-0 kubenswrapper[7642]: I0319 11:55:26.357889 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " 
pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.357997 master-0 kubenswrapper[7642]: I0319 11:55:26.357917 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"etcd-master-0\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " pod="openshift-etcd/etcd-master-0" Mar 19 11:55:26.435570 master-0 kubenswrapper[7642]: I0319 11:55:26.432905 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7tj9_6191b9b7-8e43-4006-ab25-734c23d34cf6/extract-content/0.log" Mar 19 11:55:26.435570 master-0 kubenswrapper[7642]: I0319 11:55:26.434283 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:55:26.459356 master-0 kubenswrapper[7642]: I0319 11:55:26.459289 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-catalog-content\") pod \"42edeefc-c920-4cd7-88b0-8c1e660abf63\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " Mar 19 11:55:26.459356 master-0 kubenswrapper[7642]: I0319 11:55:26.459349 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-utilities\") pod \"42edeefc-c920-4cd7-88b0-8c1e660abf63\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " Mar 19 11:55:26.459579 master-0 kubenswrapper[7642]: I0319 11:55:26.459494 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjb78\" (UniqueName: \"kubernetes.io/projected/42edeefc-c920-4cd7-88b0-8c1e660abf63-kube-api-access-vjb78\") pod \"42edeefc-c920-4cd7-88b0-8c1e660abf63\" (UID: \"42edeefc-c920-4cd7-88b0-8c1e660abf63\") " Mar 19 11:55:26.460942 master-0 kubenswrapper[7642]: 
I0319 11:55:26.460900 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-utilities" (OuterVolumeSpecName: "utilities") pod "42edeefc-c920-4cd7-88b0-8c1e660abf63" (UID: "42edeefc-c920-4cd7-88b0-8c1e660abf63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:55:26.463617 master-0 kubenswrapper[7642]: I0319 11:55:26.463569 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42edeefc-c920-4cd7-88b0-8c1e660abf63-kube-api-access-vjb78" (OuterVolumeSpecName: "kube-api-access-vjb78") pod "42edeefc-c920-4cd7-88b0-8c1e660abf63" (UID: "42edeefc-c920-4cd7-88b0-8c1e660abf63"). InnerVolumeSpecName "kube-api-access-vjb78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:55:26.480122 master-0 kubenswrapper[7642]: I0319 11:55:26.479995 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32/installer/0.log" Mar 19 11:55:26.480122 master-0 kubenswrapper[7642]: I0319 11:55:26.480081 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:55:26.533373 master-0 kubenswrapper[7642]: I0319 11:55:26.531092 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42edeefc-c920-4cd7-88b0-8c1e660abf63" (UID: "42edeefc-c920-4cd7-88b0-8c1e660abf63"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:55:26.561218 master-0 kubenswrapper[7642]: I0319 11:55:26.561085 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-var-lock\") pod \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " Mar 19 11:55:26.561218 master-0 kubenswrapper[7642]: I0319 11:55:26.561172 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-utilities\") pod \"6191b9b7-8e43-4006-ab25-734c23d34cf6\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " Mar 19 11:55:26.561218 master-0 kubenswrapper[7642]: I0319 11:55:26.561193 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-catalog-content\") pod \"6191b9b7-8e43-4006-ab25-734c23d34cf6\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " Mar 19 11:55:26.561696 master-0 kubenswrapper[7642]: I0319 11:55:26.561252 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kube-api-access\") pod \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " Mar 19 11:55:26.561696 master-0 kubenswrapper[7642]: I0319 11:55:26.561294 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkcbb\" (UniqueName: \"kubernetes.io/projected/6191b9b7-8e43-4006-ab25-734c23d34cf6-kube-api-access-jkcbb\") pod \"6191b9b7-8e43-4006-ab25-734c23d34cf6\" (UID: \"6191b9b7-8e43-4006-ab25-734c23d34cf6\") " Mar 19 11:55:26.561696 master-0 kubenswrapper[7642]: I0319 11:55:26.561358 7642 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kubelet-dir\") pod \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\" (UID: \"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32\") " Mar 19 11:55:26.561696 master-0 kubenswrapper[7642]: I0319 11:55:26.561596 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjb78\" (UniqueName: \"kubernetes.io/projected/42edeefc-c920-4cd7-88b0-8c1e660abf63-kube-api-access-vjb78\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:26.561696 master-0 kubenswrapper[7642]: I0319 11:55:26.561647 7642 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:26.561696 master-0 kubenswrapper[7642]: I0319 11:55:26.561661 7642 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42edeefc-c920-4cd7-88b0-8c1e660abf63-utilities\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:26.561957 master-0 kubenswrapper[7642]: I0319 11:55:26.561730 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32" (UID: "a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:55:26.561957 master-0 kubenswrapper[7642]: I0319 11:55:26.561768 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-var-lock" (OuterVolumeSpecName: "var-lock") pod "a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32" (UID: "a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:55:26.562480 master-0 kubenswrapper[7642]: I0319 11:55:26.562443 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-utilities" (OuterVolumeSpecName: "utilities") pod "6191b9b7-8e43-4006-ab25-734c23d34cf6" (UID: "6191b9b7-8e43-4006-ab25-734c23d34cf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:55:26.577649 master-0 kubenswrapper[7642]: I0319 11:55:26.577577 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6191b9b7-8e43-4006-ab25-734c23d34cf6-kube-api-access-jkcbb" (OuterVolumeSpecName: "kube-api-access-jkcbb") pod "6191b9b7-8e43-4006-ab25-734c23d34cf6" (UID: "6191b9b7-8e43-4006-ab25-734c23d34cf6"). InnerVolumeSpecName "kube-api-access-jkcbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:55:26.577793 master-0 kubenswrapper[7642]: I0319 11:55:26.577684 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32" (UID: "a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:55:26.654090 master-0 kubenswrapper[7642]: I0319 11:55:26.654038 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6191b9b7-8e43-4006-ab25-734c23d34cf6" (UID: "6191b9b7-8e43-4006-ab25-734c23d34cf6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:55:26.663049 master-0 kubenswrapper[7642]: I0319 11:55:26.663015 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:26.663049 master-0 kubenswrapper[7642]: I0319 11:55:26.663047 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:26.663221 master-0 kubenswrapper[7642]: I0319 11:55:26.663059 7642 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-utilities\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:26.663221 master-0 kubenswrapper[7642]: I0319 11:55:26.663069 7642 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6191b9b7-8e43-4006-ab25-734c23d34cf6-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:26.663221 master-0 kubenswrapper[7642]: I0319 11:55:26.663079 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:26.663221 master-0 kubenswrapper[7642]: I0319 11:55:26.663090 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkcbb\" (UniqueName: \"kubernetes.io/projected/6191b9b7-8e43-4006-ab25-734c23d34cf6-kube-api-access-jkcbb\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:26.834627 master-0 kubenswrapper[7642]: I0319 11:55:26.833675 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-z7tj9_6191b9b7-8e43-4006-ab25-734c23d34cf6/extract-content/0.log" 
Mar 19 11:55:26.834627 master-0 kubenswrapper[7642]: I0319 11:55:26.834570 7642 generic.go:334] "Generic (PLEG): container finished" podID="6191b9b7-8e43-4006-ab25-734c23d34cf6" containerID="8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db" exitCode=2 Mar 19 11:55:26.835005 master-0 kubenswrapper[7642]: I0319 11:55:26.834687 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7tj9" event={"ID":"6191b9b7-8e43-4006-ab25-734c23d34cf6","Type":"ContainerDied","Data":"8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db"} Mar 19 11:55:26.835005 master-0 kubenswrapper[7642]: I0319 11:55:26.834738 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7tj9" event={"ID":"6191b9b7-8e43-4006-ab25-734c23d34cf6","Type":"ContainerDied","Data":"49e6cb3af36a0772637e2b879d634836b906c3780dc54ff6831288e9854b02fe"} Mar 19 11:55:26.835005 master-0 kubenswrapper[7642]: I0319 11:55:26.834770 7642 scope.go:117] "RemoveContainer" containerID="8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db" Mar 19 11:55:26.835005 master-0 kubenswrapper[7642]: I0319 11:55:26.834958 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7tj9" Mar 19 11:55:26.844149 master-0 kubenswrapper[7642]: I0319 11:55:26.844094 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vj6x4" event={"ID":"aedae669-04fb-44e7-afac-03bd0bfae4e9","Type":"ContainerStarted","Data":"ba302a2152520867d35d857dfa783d08a5a5e0bd8bec8b04277e2db3211d98c6"} Mar 19 11:55:26.851977 master-0 kubenswrapper[7642]: I0319 11:55:26.851941 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tp448" event={"ID":"42edeefc-c920-4cd7-88b0-8c1e660abf63","Type":"ContainerDied","Data":"2a26373cf55f1a392f387b21b50d4f4a77a1dbdf1a7c1014bb10e807d1209774"} Mar 19 11:55:26.852136 master-0 kubenswrapper[7642]: I0319 11:55:26.852089 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tp448" Mar 19 11:55:26.855280 master-0 kubenswrapper[7642]: I0319 11:55:26.854905 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ql8d" event={"ID":"d979be56-f948-4252-ab6b-9bfc48dc4187","Type":"ContainerStarted","Data":"ff2ebf3fa6730575b9447497330c1d66a64543dc66430125663573c61143b763"} Mar 19 11:55:26.856705 master-0 kubenswrapper[7642]: I0319 11:55:26.856669 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" event={"ID":"f65be54c-d851-4daa-93a0-8a3947da5e26","Type":"ContainerStarted","Data":"ea563cc47199ce68f6d63dcb8538eab7a3283fca0aa6f632680e63df560c6774"} Mar 19 11:55:26.859454 master-0 kubenswrapper[7642]: I0319 11:55:26.859405 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" event={"ID":"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e","Type":"ContainerStarted","Data":"f4990111c5770baec2a2ee082f291300cc7420fd0573bb028eff8f0a2e8424eb"} Mar 19 
11:55:26.861881 master-0 kubenswrapper[7642]: I0319 11:55:26.861858 7642 scope.go:117] "RemoveContainer" containerID="bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac" Mar 19 11:55:26.862504 master-0 kubenswrapper[7642]: I0319 11:55:26.862474 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32/installer/0.log" Mar 19 11:55:26.862555 master-0 kubenswrapper[7642]: I0319 11:55:26.862520 7642 generic.go:334] "Generic (PLEG): container finished" podID="a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32" containerID="52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841" exitCode=1 Mar 19 11:55:26.862603 master-0 kubenswrapper[7642]: I0319 11:55:26.862570 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32","Type":"ContainerDied","Data":"52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841"} Mar 19 11:55:26.862603 master-0 kubenswrapper[7642]: I0319 11:55:26.862591 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32","Type":"ContainerDied","Data":"81a2103b4ad6693138997ae75fcffbcb1e1a170a8ab8e624161ecf2a0c62868c"} Mar 19 11:55:26.862831 master-0 kubenswrapper[7642]: I0319 11:55:26.862801 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 19 11:55:26.871491 master-0 kubenswrapper[7642]: I0319 11:55:26.871434 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" event={"ID":"94fa5656-0561-45ab-9ab1-a6b96b8a47d4","Type":"ContainerStarted","Data":"344d526c0d5c832dab03c63929de32c376b56407a1d6deb8f5c0fba230eec603"} Mar 19 11:55:26.888810 master-0 kubenswrapper[7642]: I0319 11:55:26.888743 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"c0d8ad30-feb0-453f-b652-36aafc7b24f9","Type":"ContainerStarted","Data":"711f8121453bce45bc824e3d6883cc554f357db4612fdf30ce1a48bb144904ff"} Mar 19 11:55:26.894796 master-0 kubenswrapper[7642]: I0319 11:55:26.894761 7642 scope.go:117] "RemoveContainer" containerID="8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db" Mar 19 11:55:26.895211 master-0 kubenswrapper[7642]: E0319 11:55:26.895176 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db\": container with ID starting with 8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db not found: ID does not exist" containerID="8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db" Mar 19 11:55:26.895259 master-0 kubenswrapper[7642]: I0319 11:55:26.895212 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db"} err="failed to get container status \"8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db\": rpc error: code = NotFound desc = could not find container \"8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db\": container with ID starting with 
8f76c85922ba9fdfe7092a3fe9b61209d01e94f67ebdbb275d599ef0c751c1db not found: ID does not exist" Mar 19 11:55:26.895259 master-0 kubenswrapper[7642]: I0319 11:55:26.895238 7642 scope.go:117] "RemoveContainer" containerID="bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac" Mar 19 11:55:26.896381 master-0 kubenswrapper[7642]: E0319 11:55:26.896319 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac\": container with ID starting with bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac not found: ID does not exist" containerID="bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac" Mar 19 11:55:26.896381 master-0 kubenswrapper[7642]: I0319 11:55:26.896355 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac"} err="failed to get container status \"bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac\": rpc error: code = NotFound desc = could not find container \"bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac\": container with ID starting with bc60345eab537ffbd42b9e47c41142f8998bb47cdd70eaa5144f2cc3e1d029ac not found: ID does not exist" Mar 19 11:55:26.896501 master-0 kubenswrapper[7642]: I0319 11:55:26.896383 7642 scope.go:117] "RemoveContainer" containerID="d283f45f04d7e8cff0f80d74c472dec27a65f0c7bc53da03218b4dd93a10eecd" Mar 19 11:55:26.926137 master-0 kubenswrapper[7642]: I0319 11:55:26.924921 7642 scope.go:117] "RemoveContainer" containerID="069b17cffc835d0f5bf79176129e2a69147a9ed67461773630bacc9148088890" Mar 19 11:55:26.952925 master-0 kubenswrapper[7642]: I0319 11:55:26.944927 7642 scope.go:117] "RemoveContainer" containerID="52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841" Mar 19 11:55:26.962997 master-0 kubenswrapper[7642]: I0319 
11:55:26.962523 7642 scope.go:117] "RemoveContainer" containerID="52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841" Mar 19 11:55:26.963112 master-0 kubenswrapper[7642]: E0319 11:55:26.963090 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841\": container with ID starting with 52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841 not found: ID does not exist" containerID="52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841" Mar 19 11:55:26.963151 master-0 kubenswrapper[7642]: I0319 11:55:26.963124 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841"} err="failed to get container status \"52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841\": rpc error: code = NotFound desc = could not find container \"52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841\": container with ID starting with 52fa25c90f9b24dc767d1014b601b420ad5269f0e9f92b3f409823017bddd841 not found: ID does not exist" Mar 19 11:55:27.742660 master-0 kubenswrapper[7642]: I0319 11:55:27.742567 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vj6x4" Mar 19 11:55:27.742660 master-0 kubenswrapper[7642]: I0319 11:55:27.742669 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vj6x4" Mar 19 11:55:27.897266 master-0 kubenswrapper[7642]: I0319 11:55:27.897192 7642 generic.go:334] "Generic (PLEG): container finished" podID="24bd6a9b-9934-4cf9-b591-1ab892c8014c" containerID="7b4959df4babfe207d0b7474ab31deaecbea2e2b9d22f6c7f79a7c10a529fc14" exitCode=0 Mar 19 11:55:27.897547 master-0 kubenswrapper[7642]: I0319 11:55:27.897300 7642 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-4h84v" event={"ID":"24bd6a9b-9934-4cf9-b591-1ab892c8014c","Type":"ContainerDied","Data":"7b4959df4babfe207d0b7474ab31deaecbea2e2b9d22f6c7f79a7c10a529fc14"} Mar 19 11:55:27.902886 master-0 kubenswrapper[7642]: I0319 11:55:27.902767 7642 generic.go:334] "Generic (PLEG): container finished" podID="d979be56-f948-4252-ab6b-9bfc48dc4187" containerID="ff2ebf3fa6730575b9447497330c1d66a64543dc66430125663573c61143b763" exitCode=0 Mar 19 11:55:27.902886 master-0 kubenswrapper[7642]: I0319 11:55:27.902843 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ql8d" event={"ID":"d979be56-f948-4252-ab6b-9bfc48dc4187","Type":"ContainerDied","Data":"ff2ebf3fa6730575b9447497330c1d66a64543dc66430125663573c61143b763"} Mar 19 11:55:28.795739 master-0 kubenswrapper[7642]: I0319 11:55:28.795606 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vj6x4" podUID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerName="registry-server" probeResult="failure" output=< Mar 19 11:55:28.795739 master-0 kubenswrapper[7642]: timeout: failed to connect service ":50051" within 1s Mar 19 11:55:28.795739 master-0 kubenswrapper[7642]: > Mar 19 11:55:28.920669 master-0 kubenswrapper[7642]: I0319 11:55:28.920527 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ql8d" event={"ID":"d979be56-f948-4252-ab6b-9bfc48dc4187","Type":"ContainerStarted","Data":"95cccba8c9993fe9b1eac34b42b9d8d0ed1d42c6ac23c7711fd331a30c761a8d"} Mar 19 11:55:28.924834 master-0 kubenswrapper[7642]: I0319 11:55:28.924788 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h84v" event={"ID":"24bd6a9b-9934-4cf9-b591-1ab892c8014c","Type":"ContainerStarted","Data":"ffb9eefebe45bd6f6b90e5cfb48390a803f89852e6c65d7be29a92976dc945b7"} Mar 19 11:55:30.225823 master-0 kubenswrapper[7642]: I0319 
11:55:30.224713 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 11:55:30.225823 master-0 kubenswrapper[7642]: I0319 11:55:30.224807 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 11:55:31.944355 master-0 kubenswrapper[7642]: I0319 11:55:31.944280 7642 generic.go:334] "Generic (PLEG): container finished" podID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerID="4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c" exitCode=0 Mar 19 11:55:31.944355 master-0 kubenswrapper[7642]: I0319 11:55:31.944330 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmnd7" event={"ID":"c5426417-d063-49e6-9fc9-f40ccf8296dc","Type":"ContainerDied","Data":"4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c"} Mar 19 11:55:33.959601 master-0 kubenswrapper[7642]: I0319 11:55:33.959508 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmnd7" event={"ID":"c5426417-d063-49e6-9fc9-f40ccf8296dc","Type":"ContainerStarted","Data":"d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168"} Mar 19 11:55:35.077347 master-0 kubenswrapper[7642]: I0319 11:55:35.077267 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6ql8d" Mar 19 11:55:35.077969 master-0 kubenswrapper[7642]: I0319 11:55:35.077586 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6ql8d" Mar 19 11:55:35.116568 master-0 kubenswrapper[7642]: I0319 11:55:35.116510 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6ql8d" Mar 19 11:55:36.003937 master-0 kubenswrapper[7642]: I0319 11:55:36.003874 7642 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6ql8d" Mar 19 11:55:36.272840 master-0 kubenswrapper[7642]: I0319 11:55:36.272593 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:36.272840 master-0 kubenswrapper[7642]: I0319 11:55:36.272771 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:36.279753 master-0 kubenswrapper[7642]: E0319 11:55:36.279023 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:55:26Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:55:26Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:55:26Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:55:26Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\
\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f933312f49083e8746fc41ab5e46a9a757b448374f14971e256ebcb36f11dd97\\\"],\\\"sizeBytes\\\":470826739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ea5c8a93f30e0a4932da5697d22c0da7eda9a7035c0555eb006b6755e62bb2fc\\\"],\\\"sizeBytes\\\":4682
65024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\\\"],\\\"sizeBytes\\\":465090934},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24\\\"],\\\"sizeBytes\\\":463705930},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:632e80bba5077068ecca05fddb95aedebad4493af6f36152c01c6ae490975b62\\\"],\\\"sizeBytes\\\":458126937},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bcb08821551e9a5b9f82aa794bcea673279cefb93cb47492e19ccac5e2cf18fe\\\"],\\\"sizeBytes\\\":456576198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:759fb1d5353dbbadd443f38631d977ca3aed9787b873be05cc9660532a252739\\\"],\\\"sizeBytes\\\":448828620},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85\\\"],\\\"sizeBytes\\\":448042136},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:951ecfeba9b2da4b653034d09275f925396a79c2d8461b8a7c71c776fee67ba0\\\"],\\\"sizeBytes\\\":443272037},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:292560e2d80b460468bb19fe0ddf289767c655027b03a76ee6c40c91ffe4c483\\\"],\\\"sizeBytes\\\":438654374},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e66fd50be6f83ce321a566dfb76f3725b597374077d5af13813b928f6b1267e\\\"],\\\"sizeBytes\\\":411587146},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3a494212f1ba17f0f0980eef583218330eccb56eadf6b8cb0548c76d99b5014\\\"],\\\"sizeBytes\\\":407347125},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422\\\"],\\\"sizeBytes\\\":396521761}]}}\" for node \"master-0\": Patch 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:36.311090 master-0 kubenswrapper[7642]: I0319 11:55:36.311009 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:37.011143 master-0 kubenswrapper[7642]: I0319 11:55:37.011081 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4h84v" Mar 19 11:55:37.811295 master-0 kubenswrapper[7642]: I0319 11:55:37.811221 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vj6x4" Mar 19 11:55:37.850739 master-0 kubenswrapper[7642]: I0319 11:55:37.850653 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vj6x4" Mar 19 11:55:38.752842 master-0 kubenswrapper[7642]: I0319 11:55:38.752755 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cmnd7" Mar 19 11:55:38.753376 master-0 kubenswrapper[7642]: I0319 11:55:38.753300 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cmnd7" Mar 19 11:55:38.808334 master-0 kubenswrapper[7642]: I0319 11:55:38.808268 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cmnd7" Mar 19 11:55:39.028849 master-0 kubenswrapper[7642]: I0319 11:55:39.028672 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cmnd7" Mar 19 11:55:39.171074 master-0 kubenswrapper[7642]: E0319 11:55:39.171001 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete 
mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 11:55:39.171781 master-0 kubenswrapper[7642]: I0319 11:55:39.171757 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 11:55:39.993386 master-0 kubenswrapper[7642]: I0319 11:55:39.993297 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"62c4c366151fc81e1f60e282bf3fbb779142096d1ad01161d17a9c6c39754bbd"} Mar 19 11:55:40.225297 master-0 kubenswrapper[7642]: I0319 11:55:40.225123 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 11:55:40.225999 master-0 kubenswrapper[7642]: I0319 11:55:40.225339 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:41.008818 master-0 kubenswrapper[7642]: I0319 11:55:41.008661 7642 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac" exitCode=0 Mar 19 11:55:41.008818 master-0 kubenswrapper[7642]: I0319 11:55:41.008753 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac"} Mar 19 11:55:41.012393 
master-0 kubenswrapper[7642]: I0319 11:55:41.012332 7642 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="a751b53c76eb83162f3d88044f06fc209b871d360365f0332303ca5b041a66eb" exitCode=1 Mar 19 11:55:41.012529 master-0 kubenswrapper[7642]: I0319 11:55:41.012460 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"a751b53c76eb83162f3d88044f06fc209b871d360365f0332303ca5b041a66eb"} Mar 19 11:55:41.013217 master-0 kubenswrapper[7642]: I0319 11:55:41.013175 7642 scope.go:117] "RemoveContainer" containerID="a751b53c76eb83162f3d88044f06fc209b871d360365f0332303ca5b041a66eb" Mar 19 11:55:41.327375 master-0 kubenswrapper[7642]: I0319 11:55:41.327227 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:55:42.021273 master-0 kubenswrapper[7642]: I0319 11:55:42.021163 7642 generic.go:334] "Generic (PLEG): container finished" podID="01529a13-e1b1-40a1-9fff-8a20f1cba68c" containerID="825551a6cbd6e2d9a62ce681600e2897018550eae2ad6b2a4c579d2b878da6e9" exitCode=0 Mar 19 11:55:42.021273 master-0 kubenswrapper[7642]: I0319 11:55:42.021223 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"01529a13-e1b1-40a1-9fff-8a20f1cba68c","Type":"ContainerDied","Data":"825551a6cbd6e2d9a62ce681600e2897018550eae2ad6b2a4c579d2b878da6e9"} Mar 19 11:55:43.030140 master-0 kubenswrapper[7642]: I0319 11:55:43.030060 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"0e092e68e51dc8d746441712e9315d094b81fce2d278dfd7bbb413252f403787"} Mar 19 11:55:43.346408 master-0 kubenswrapper[7642]: I0319 11:55:43.346352 7642 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 11:55:43.523744 master-0 kubenswrapper[7642]: I0319 11:55:43.523628 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-var-lock\") pod \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " Mar 19 11:55:43.524289 master-0 kubenswrapper[7642]: I0319 11:55:43.524261 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kubelet-dir\") pod \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " Mar 19 11:55:43.524489 master-0 kubenswrapper[7642]: I0319 11:55:43.524464 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kube-api-access\") pod \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\" (UID: \"01529a13-e1b1-40a1-9fff-8a20f1cba68c\") " Mar 19 11:55:43.525708 master-0 kubenswrapper[7642]: I0319 11:55:43.525556 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-var-lock" (OuterVolumeSpecName: "var-lock") pod "01529a13-e1b1-40a1-9fff-8a20f1cba68c" (UID: "01529a13-e1b1-40a1-9fff-8a20f1cba68c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:55:43.525868 master-0 kubenswrapper[7642]: I0319 11:55:43.525730 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "01529a13-e1b1-40a1-9fff-8a20f1cba68c" (UID: "01529a13-e1b1-40a1-9fff-8a20f1cba68c"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:55:43.528354 master-0 kubenswrapper[7642]: I0319 11:55:43.528317 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "01529a13-e1b1-40a1-9fff-8a20f1cba68c" (UID: "01529a13-e1b1-40a1-9fff-8a20f1cba68c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:55:43.626703 master-0 kubenswrapper[7642]: I0319 11:55:43.626516 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:43.626703 master-0 kubenswrapper[7642]: I0319 11:55:43.626596 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:43.626703 master-0 kubenswrapper[7642]: I0319 11:55:43.626627 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/01529a13-e1b1-40a1-9fff-8a20f1cba68c-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:44.046205 master-0 kubenswrapper[7642]: I0319 11:55:44.046102 7642 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="79f33a58d544fe5d9dca133c52ef1b1ffeca07d3925205b6553e47537ece7b51" exitCode=1 Mar 19 11:55:44.046205 master-0 kubenswrapper[7642]: I0319 11:55:44.046181 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerDied","Data":"79f33a58d544fe5d9dca133c52ef1b1ffeca07d3925205b6553e47537ece7b51"} Mar 19 
11:55:44.047039 master-0 kubenswrapper[7642]: I0319 11:55:44.046658 7642 scope.go:117] "RemoveContainer" containerID="79f33a58d544fe5d9dca133c52ef1b1ffeca07d3925205b6553e47537ece7b51" Mar 19 11:55:44.047907 master-0 kubenswrapper[7642]: I0319 11:55:44.047874 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 19 11:55:44.048065 master-0 kubenswrapper[7642]: I0319 11:55:44.047875 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"01529a13-e1b1-40a1-9fff-8a20f1cba68c","Type":"ContainerDied","Data":"9c2f6ee2f4f5a8055122de400fd9243193e98b0769b1970bf4b2bfb820983158"} Mar 19 11:55:44.048065 master-0 kubenswrapper[7642]: I0319 11:55:44.048059 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c2f6ee2f4f5a8055122de400fd9243193e98b0769b1970bf4b2bfb820983158" Mar 19 11:55:45.056943 master-0 kubenswrapper[7642]: I0319 11:55:45.056871 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"c83737980b9ee109184b1d78e942cf36","Type":"ContainerStarted","Data":"9704f6068bbc3b7071c181549835a51a1db8ec178675b82fbcbeb90ed4c62c0b"} Mar 19 11:55:45.080226 master-0 kubenswrapper[7642]: E0319 11:55:45.080081 7642 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:46.280048 master-0 kubenswrapper[7642]: E0319 11:55:46.279845 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:49.561255 master-0 kubenswrapper[7642]: 
I0319 11:55:49.561132 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:55:50.226333 master-0 kubenswrapper[7642]: I0319 11:55:50.226255 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 11:55:50.226333 master-0 kubenswrapper[7642]: I0319 11:55:50.226336 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:51.326536 master-0 kubenswrapper[7642]: I0319 11:55:51.326486 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:55:52.562111 master-0 kubenswrapper[7642]: I0319 11:55:52.562014 7642 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:54.016159 master-0 kubenswrapper[7642]: E0319 11:55:54.016035 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 11:55:54.118859 master-0 kubenswrapper[7642]: I0319 
11:55:54.118790 7642 generic.go:334] "Generic (PLEG): container finished" podID="d664a6d0d2a24360dee10612610f1b59" containerID="feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e" exitCode=0 Mar 19 11:55:55.081749 master-0 kubenswrapper[7642]: E0319 11:55:55.081340 7642 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:55.127131 master-0 kubenswrapper[7642]: I0319 11:55:55.127065 7642 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6" exitCode=0 Mar 19 11:55:55.127131 master-0 kubenswrapper[7642]: I0319 11:55:55.127124 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6"} Mar 19 11:55:56.280324 master-0 kubenswrapper[7642]: E0319 11:55:56.280190 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:55:56.427246 master-0 kubenswrapper[7642]: I0319 11:55:56.427198 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_d664a6d0d2a24360dee10612610f1b59/etcdctl/0.log" Mar 19 11:55:56.427456 master-0 kubenswrapper[7642]: I0319 11:55:56.427312 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:55:56.607422 master-0 kubenswrapper[7642]: I0319 11:55:56.607304 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") pod \"d664a6d0d2a24360dee10612610f1b59\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " Mar 19 11:55:56.607422 master-0 kubenswrapper[7642]: I0319 11:55:56.607443 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") pod \"d664a6d0d2a24360dee10612610f1b59\" (UID: \"d664a6d0d2a24360dee10612610f1b59\") " Mar 19 11:55:56.607991 master-0 kubenswrapper[7642]: I0319 11:55:56.607469 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir" (OuterVolumeSpecName: "data-dir") pod "d664a6d0d2a24360dee10612610f1b59" (UID: "d664a6d0d2a24360dee10612610f1b59"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:55:56.607991 master-0 kubenswrapper[7642]: I0319 11:55:56.607504 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs" (OuterVolumeSpecName: "certs") pod "d664a6d0d2a24360dee10612610f1b59" (UID: "d664a6d0d2a24360dee10612610f1b59"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:55:56.607991 master-0 kubenswrapper[7642]: I0319 11:55:56.607810 7642 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:56.607991 master-0 kubenswrapper[7642]: I0319 11:55:56.607834 7642 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/d664a6d0d2a24360dee10612610f1b59-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 11:55:57.148255 master-0 kubenswrapper[7642]: I0319 11:55:57.148148 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-ptjr9_84e5577e-df8b-46a2-aff7-0bf90b5010d9/openshift-apiserver-operator/1.log" Mar 19 11:55:57.149159 master-0 kubenswrapper[7642]: I0319 11:55:57.149052 7642 generic.go:334] "Generic (PLEG): container finished" podID="84e5577e-df8b-46a2-aff7-0bf90b5010d9" containerID="6ec837485683b9ef9f8903f02c33b71f9f384c302a364201ed3b85fde2544f12" exitCode=255 Mar 19 11:55:57.149328 master-0 kubenswrapper[7642]: I0319 11:55:57.149163 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerDied","Data":"6ec837485683b9ef9f8903f02c33b71f9f384c302a364201ed3b85fde2544f12"} Mar 19 11:55:57.149328 master-0 kubenswrapper[7642]: I0319 11:55:57.149228 7642 scope.go:117] "RemoveContainer" containerID="dd2920bde9ceef973c55e7d085b8480289be3211f9caba169d896a74981242f4" Mar 19 11:55:57.150077 master-0 kubenswrapper[7642]: I0319 11:55:57.149978 7642 scope.go:117] "RemoveContainer" containerID="6ec837485683b9ef9f8903f02c33b71f9f384c302a364201ed3b85fde2544f12" Mar 19 11:55:57.150430 master-0 kubenswrapper[7642]: E0319 11:55:57.150369 7642 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-d65958b8-ptjr9_openshift-apiserver-operator(84e5577e-df8b-46a2-aff7-0bf90b5010d9)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" podUID="84e5577e-df8b-46a2-aff7-0bf90b5010d9" Mar 19 11:55:57.153464 master-0 kubenswrapper[7642]: I0319 11:55:57.153398 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_d664a6d0d2a24360dee10612610f1b59/etcdctl/0.log" Mar 19 11:55:57.153581 master-0 kubenswrapper[7642]: I0319 11:55:57.153473 7642 generic.go:334] "Generic (PLEG): container finished" podID="d664a6d0d2a24360dee10612610f1b59" containerID="17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab" exitCode=137 Mar 19 11:55:57.153726 master-0 kubenswrapper[7642]: I0319 11:55:57.153625 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 19 11:55:57.156025 master-0 kubenswrapper[7642]: I0319 11:55:57.155977 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-8xkv5_ac946049-4b93-4d8c-bfa6-eacf83160ed1/kube-controller-manager-operator/1.log" Mar 19 11:55:57.156456 master-0 kubenswrapper[7642]: I0319 11:55:57.156408 7642 generic.go:334] "Generic (PLEG): container finished" podID="ac946049-4b93-4d8c-bfa6-eacf83160ed1" containerID="8599681efde4807c1ee51fb15ce18a51bf6eae443d3b45fdd6b7cd611d18283f" exitCode=255 Mar 19 11:55:57.156456 master-0 kubenswrapper[7642]: I0319 11:55:57.156441 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerDied","Data":"8599681efde4807c1ee51fb15ce18a51bf6eae443d3b45fdd6b7cd611d18283f"} Mar 19 11:55:57.156887 master-0 kubenswrapper[7642]: I0319 11:55:57.156778 7642 scope.go:117] "RemoveContainer" containerID="8599681efde4807c1ee51fb15ce18a51bf6eae443d3b45fdd6b7cd611d18283f" Mar 19 11:55:57.157192 master-0 kubenswrapper[7642]: E0319 11:55:57.156981 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-ff989d6cc-8xkv5_openshift-kube-controller-manager-operator(ac946049-4b93-4d8c-bfa6-eacf83160ed1)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" podUID="ac946049-4b93-4d8c-bfa6-eacf83160ed1" Mar 19 11:55:57.178582 master-0 kubenswrapper[7642]: I0319 11:55:57.178509 7642 scope.go:117] "RemoveContainer" containerID="feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e" Mar 19 
11:55:57.200477 master-0 kubenswrapper[7642]: I0319 11:55:57.200421 7642 scope.go:117] "RemoveContainer" containerID="17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab" Mar 19 11:55:57.219291 master-0 kubenswrapper[7642]: I0319 11:55:57.219242 7642 scope.go:117] "RemoveContainer" containerID="feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e" Mar 19 11:55:57.221778 master-0 kubenswrapper[7642]: E0319 11:55:57.221747 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e\": container with ID starting with feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e not found: ID does not exist" containerID="feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e" Mar 19 11:55:57.222062 master-0 kubenswrapper[7642]: I0319 11:55:57.222017 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e"} err="failed to get container status \"feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e\": rpc error: code = NotFound desc = could not find container \"feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e\": container with ID starting with feab031ef06b1348400b66ce4e2405b92f2cf5f03b9f55cec3ed7cd96c282d9e not found: ID does not exist" Mar 19 11:55:57.222178 master-0 kubenswrapper[7642]: I0319 11:55:57.222162 7642 scope.go:117] "RemoveContainer" containerID="17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab" Mar 19 11:55:57.223340 master-0 kubenswrapper[7642]: E0319 11:55:57.223286 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab\": container with ID starting with 
17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab not found: ID does not exist" containerID="17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab" Mar 19 11:55:57.223430 master-0 kubenswrapper[7642]: I0319 11:55:57.223341 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab"} err="failed to get container status \"17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab\": rpc error: code = NotFound desc = could not find container \"17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab\": container with ID starting with 17bdfd8c32d2168a0f64a492cdad09468851d14d3ef7fda29cfbe113086490ab not found: ID does not exist" Mar 19 11:55:57.223430 master-0 kubenswrapper[7642]: I0319 11:55:57.223385 7642 scope.go:117] "RemoveContainer" containerID="d611dd027fd146af69473e5540a6fda94c8c6de39cd053174307e17bd337e0ce" Mar 19 11:55:57.243883 master-0 kubenswrapper[7642]: I0319 11:55:57.243808 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 11:55:58.077967 master-0 kubenswrapper[7642]: I0319 11:55:58.077898 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d664a6d0d2a24360dee10612610f1b59" path="/var/lib/kubelet/pods/d664a6d0d2a24360dee10612610f1b59/volumes" Mar 19 11:55:58.078676 master-0 kubenswrapper[7642]: I0319 11:55:58.078538 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 19 11:55:58.164337 master-0 kubenswrapper[7642]: I0319 11:55:58.164283 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-ptjr9_84e5577e-df8b-46a2-aff7-0bf90b5010d9/openshift-apiserver-operator/1.log" Mar 19 11:55:58.168865 master-0 kubenswrapper[7642]: I0319 11:55:58.168841 
7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-8xkv5_ac946049-4b93-4d8c-bfa6-eacf83160ed1/kube-controller-manager-operator/1.log" Mar 19 11:56:00.189341 master-0 kubenswrapper[7642]: E0319 11:56:00.189161 7642 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{apiserver-5c7d7c6bb8-gnz2z.189e3c0c2acd5580 openshift-oauth-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-oauth-apiserver,Name:apiserver-5c7d7c6bb8-gnz2z,UID:66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e,APIVersion:v1,ResourceVersion:7547,FieldPath:spec.containers{oauth-apiserver},},Reason:Created,Message:Created container: oauth-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:55:26.185928064 +0000 UTC m=+84.303368851,LastTimestamp:2026-03-19 11:55:26.185928064 +0000 UTC m=+84.303368851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 11:56:00.227068 master-0 kubenswrapper[7642]: I0319 11:56:00.226955 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.128.0.56:8443/livez\": context deadline exceeded" start-of-body= Mar 19 11:56:00.227448 master-0 kubenswrapper[7642]: I0319 11:56:00.227081 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.128.0.56:8443/livez\": context deadline exceeded" Mar 19 11:56:01.186105 master-0 kubenswrapper[7642]: I0319 
11:56:01.186002 7642 generic.go:334] "Generic (PLEG): container finished" podID="6d017409-a362-4e58-87af-d02b3834fdd4" containerID="57848ee43bd88e9ed1ccc7c15c3a6279821a813bcb2340426181dccb4cafcc2a" exitCode=0 Mar 19 11:56:02.561749 master-0 kubenswrapper[7642]: I0319 11:56:02.561618 7642 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:56:05.082102 master-0 kubenswrapper[7642]: E0319 11:56:05.082016 7642 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:56:06.217701 master-0 kubenswrapper[7642]: I0319 11:56:06.217613 7642 generic.go:334] "Generic (PLEG): container finished" podID="bea42722-d7af-4938-9f3a-da57bd056f96" containerID="15fc833de80b2cae87e3ef1229c5aae30a9051318a9e0831bf339fc664c1dd1c" exitCode=0 Mar 19 11:56:06.281607 master-0 kubenswrapper[7642]: E0319 11:56:06.281279 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:56:07.814325 master-0 kubenswrapper[7642]: I0319 11:56:07.814247 7642 patch_prober.go:28] interesting pod/etcd-operator-8544cbcf9c-bj75g container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.22:8443/healthz\": dial tcp 10.128.0.22:8443: connect: connection refused" start-of-body= Mar 19 11:56:07.815827 
master-0 kubenswrapper[7642]: I0319 11:56:07.815761 7642 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" podUID="aab47a39-a18b-4350-b141-4d5de2d8e96f" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.22:8443/healthz\": dial tcp 10.128.0.22:8443: connect: connection refused"
Mar 19 11:56:08.134615 master-0 kubenswrapper[7642]: E0319 11:56:08.134469 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 19 11:56:09.241165 master-0 kubenswrapper[7642]: I0319 11:56:09.241113 7642 generic.go:334] "Generic (PLEG): container finished" podID="367ea4e8-8172-4730-8f26-499ed7c167bb" containerID="ec872b5060e715523dcf7612ce03c05df430a828612dcdbf8f83f43994a2ae84" exitCode=0
Mar 19 11:56:09.245740 master-0 kubenswrapper[7642]: I0319 11:56:09.245679 7642 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e" exitCode=0
Mar 19 11:56:10.227874 master-0 kubenswrapper[7642]: I0319 11:56:10.227766 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 11:56:10.228166 master-0 kubenswrapper[7642]: I0319 11:56:10.227897 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:56:12.265787 master-0 kubenswrapper[7642]: I0319 11:56:12.265691 7642 generic.go:334] "Generic (PLEG): container finished" podID="aab47a39-a18b-4350-b141-4d5de2d8e96f" containerID="7ed370adba82a4b171cad234b3d380f3d393808daf0a0a8fd7c7e46ac33f3ee6" exitCode=0
Mar 19 11:56:12.562044 master-0 kubenswrapper[7642]: I0319 11:56:12.561941 7642 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:56:15.083356 master-0 kubenswrapper[7642]: E0319 11:56:15.083248 7642 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:56:16.282249 master-0 kubenswrapper[7642]: E0319 11:56:16.282131 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:56:16.282249 master-0 kubenswrapper[7642]: E0319 11:56:16.282203 7642 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 19 11:56:17.297665 master-0 kubenswrapper[7642]: I0319 11:56:17.297561 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-47j68_51e2c1d1-a50f-4a0d-8273-440e699b3907/approver/0.log"
Mar 19 11:56:17.298609 master-0 kubenswrapper[7642]: I0319 11:56:17.298549 7642 generic.go:334] "Generic (PLEG): container finished" podID="51e2c1d1-a50f-4a0d-8273-440e699b3907" containerID="52b485f2005783105cdf9d2569a5312d319a7c33f7fd721f2c4707505569003a" exitCode=1
Mar 19 11:56:20.229253 master-0 kubenswrapper[7642]: I0319 11:56:20.229135 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 11:56:20.229253 master-0 kubenswrapper[7642]: I0319 11:56:20.229244 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:56:22.252703 master-0 kubenswrapper[7642]: E0319 11:56:22.252606 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 19 11:56:25.084669 master-0 kubenswrapper[7642]: E0319 11:56:25.084552 7642 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 11:56:25.085720 master-0 kubenswrapper[7642]: I0319 11:56:25.085086 7642 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 19 11:56:26.154179 master-0 kubenswrapper[7642]: I0319 11:56:26.154099 7642 status_manager.go:851] "Failed to get status for pod" podUID="751316a7-cd5f-4787-b24f-f1cddeb6a29e" pod="openshift-kube-scheduler/installer-3-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-3-master-0)"
Mar 19 11:56:26.370979 master-0 kubenswrapper[7642]: I0319 11:56:26.370926 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-5qtx7_b0133cb5-e425-40af-a051-71b2362a89c3/network-operator/1.log"
Mar 19 11:56:26.371877 master-0 kubenswrapper[7642]: I0319 11:56:26.371809 7642 generic.go:334] "Generic (PLEG): container finished" podID="b0133cb5-e425-40af-a051-71b2362a89c3" containerID="0cc3ae6879b287fcc4bae5ddccb1f2e7e024042e8efb6dfd62d69b2efa286229" exitCode=255
Mar 19 11:56:26.734464 master-0 kubenswrapper[7642]: I0319 11:56:26.734367 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.128.0.56:8443/livez\": read tcp 10.128.0.2:58652->10.128.0.56:8443: read: connection reset by peer" start-of-body=
Mar 19 11:56:26.734760 master-0 kubenswrapper[7642]: I0319 11:56:26.734484 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.128.0.56:8443/livez\": read tcp 10.128.0.2:58652->10.128.0.56:8443: read: connection reset by peer"
Mar 19 11:56:26.735836 master-0 kubenswrapper[7642]: I0319 11:56:26.735290 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.128.0.56:8443/livez\": dial tcp 10.128.0.56:8443: connect: connection refused" start-of-body=
Mar 19 11:56:26.735836 master-0 kubenswrapper[7642]: I0319 11:56:26.735422 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.128.0.56:8443/livez\": dial tcp 10.128.0.56:8443: connect: connection refused"
Mar 19 11:56:27.379933 master-0 kubenswrapper[7642]: I0319 11:56:27.379691 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-5c7d7c6bb8-gnz2z_66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/oauth-apiserver/0.log"
Mar 19 11:56:27.381187 master-0 kubenswrapper[7642]: I0319 11:56:27.380120 7642 generic.go:334] "Generic (PLEG): container finished" podID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerID="f4990111c5770baec2a2ee082f291300cc7420fd0573bb028eff8f0a2e8424eb" exitCode=1
Mar 19 11:56:30.224411 master-0 kubenswrapper[7642]: I0319 11:56:30.224323 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.128.0.56:8443/livez\": dial tcp 10.128.0.56:8443: connect: connection refused" start-of-body=
Mar 19 11:56:30.224411 master-0 kubenswrapper[7642]: I0319 11:56:30.224400 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.128.0.56:8443/livez\": dial tcp 10.128.0.56:8443: connect: connection refused"
Mar 19 11:56:32.082743 master-0 kubenswrapper[7642]: E0319 11:56:32.082468 7642 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Mar 19 11:56:32.083561 master-0 kubenswrapper[7642]: E0319 11:56:32.082787 7642 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.016s"
Mar 19 11:56:32.083561 master-0 kubenswrapper[7642]: I0319 11:56:32.082861 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:56:32.083933 master-0 kubenswrapper[7642]: I0319 11:56:32.083881 7642 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"0e092e68e51dc8d746441712e9315d094b81fce2d278dfd7bbb413252f403787"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 19 11:56:32.084053 master-0 kubenswrapper[7642]: I0319 11:56:32.084014 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" containerID="cri-o://0e092e68e51dc8d746441712e9315d094b81fce2d278dfd7bbb413252f403787" gracePeriod=30
Mar 19 11:56:32.094936 master-0 kubenswrapper[7642]: I0319 11:56:32.092941 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 19 11:56:32.425922 master-0 kubenswrapper[7642]: I0319 11:56:32.425799 7642 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="0e092e68e51dc8d746441712e9315d094b81fce2d278dfd7bbb413252f403787" exitCode=2
Mar 19 11:56:34.192954 master-0 kubenswrapper[7642]: E0319 11:56:34.192364 7642 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{apiserver-5c7d7c6bb8-gnz2z.189e3c0c2baa2b4d openshift-oauth-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-oauth-apiserver,Name:apiserver-5c7d7c6bb8-gnz2z,UID:66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e,APIVersion:v1,ResourceVersion:7547,FieldPath:spec.containers{oauth-apiserver},},Reason:Started,Message:Started container oauth-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 11:55:26.200400717 +0000 UTC m=+84.317841504,LastTimestamp:2026-03-19 11:55:26.200400717 +0000 UTC m=+84.317841504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 11:56:34.731463 master-0 kubenswrapper[7642]: E0319 11:56:34.731366 7642 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.648s"
Mar 19 11:56:34.732722 master-0 kubenswrapper[7642]: I0319 11:56:34.732669 7642 scope.go:117] "RemoveContainer" containerID="8599681efde4807c1ee51fb15ce18a51bf6eae443d3b45fdd6b7cd611d18283f"
Mar 19 11:56:34.732841 master-0 kubenswrapper[7642]: I0319 11:56:34.732741 7642 scope.go:117] "RemoveContainer" containerID="6ec837485683b9ef9f8903f02c33b71f9f384c302a364201ed3b85fde2544f12"
Mar 19 11:56:34.734112 master-0 kubenswrapper[7642]: I0319 11:56:34.733595 7642 scope.go:117] "RemoveContainer" containerID="57848ee43bd88e9ed1ccc7c15c3a6279821a813bcb2340426181dccb4cafcc2a"
Mar 19 11:56:34.735784 master-0 kubenswrapper[7642]: I0319 11:56:34.735751 7642 scope.go:117] "RemoveContainer" containerID="52b485f2005783105cdf9d2569a5312d319a7c33f7fd721f2c4707505569003a"
Mar 19 11:56:34.757853 master-0 kubenswrapper[7642]: I0319 11:56:34.757786 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769170 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" event={"ID":"6d017409-a362-4e58-87af-d02b3834fdd4","Type":"ContainerDied","Data":"57848ee43bd88e9ed1ccc7c15c3a6279821a813bcb2340426181dccb4cafcc2a"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769233 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" event={"ID":"bea42722-d7af-4938-9f3a-da57bd056f96","Type":"ContainerDied","Data":"15fc833de80b2cae87e3ef1229c5aae30a9051318a9e0831bf339fc664c1dd1c"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769247 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" event={"ID":"367ea4e8-8172-4730-8f26-499ed7c167bb","Type":"ContainerDied","Data":"ec872b5060e715523dcf7612ce03c05df430a828612dcdbf8f83f43994a2ae84"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769258 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerDied","Data":"26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769272 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" event={"ID":"aab47a39-a18b-4350-b141-4d5de2d8e96f","Type":"ContainerDied","Data":"7ed370adba82a4b171cad234b3d380f3d393808daf0a0a8fd7c7e46ac33f3ee6"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769287 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerDied","Data":"52b485f2005783105cdf9d2569a5312d319a7c33f7fd721f2c4707505569003a"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769297 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769306 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769314 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769322 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769331 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"24b4ed170d527099878cb5fdd508a2fb","Type":"ContainerStarted","Data":"5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769340 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerDied","Data":"0cc3ae6879b287fcc4bae5ddccb1f2e7e024042e8efb6dfd62d69b2efa286229"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769354 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" event={"ID":"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e","Type":"ContainerDied","Data":"f4990111c5770baec2a2ee082f291300cc7420fd0573bb028eff8f0a2e8424eb"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769365 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerDied","Data":"0e092e68e51dc8d746441712e9315d094b81fce2d278dfd7bbb413252f403787"}
Mar 19 11:56:34.770680 master-0 kubenswrapper[7642]: I0319 11:56:34.769379 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"46f265536aba6292ead501bc9b49f327","Type":"ContainerStarted","Data":"78b8de87e8a5c898566cfde9569af5051ed935aae3421d721d98339159c91a61"}
Mar 19 11:56:34.771374 master-0 kubenswrapper[7642]: I0319 11:56:34.770903 7642 scope.go:117] "RemoveContainer" containerID="15fc833de80b2cae87e3ef1229c5aae30a9051318a9e0831bf339fc664c1dd1c"
Mar 19 11:56:34.771374 master-0 kubenswrapper[7642]: I0319 11:56:34.771349 7642 scope.go:117] "RemoveContainer" containerID="7ed370adba82a4b171cad234b3d380f3d393808daf0a0a8fd7c7e46ac33f3ee6"
Mar 19 11:56:34.771576 master-0 kubenswrapper[7642]: I0319 11:56:34.771551 7642 scope.go:117] "RemoveContainer" containerID="f4990111c5770baec2a2ee082f291300cc7420fd0573bb028eff8f0a2e8424eb"
Mar 19 11:56:34.771862 master-0 kubenswrapper[7642]: I0319 11:56:34.771792 7642 scope.go:117] "RemoveContainer" containerID="0cc3ae6879b287fcc4bae5ddccb1f2e7e024042e8efb6dfd62d69b2efa286229"
Mar 19 11:56:34.771980 master-0 kubenswrapper[7642]: E0319 11:56:34.771947 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=network-operator pod=network-operator-7bd846bfc4-5qtx7_openshift-network-operator(b0133cb5-e425-40af-a051-71b2362a89c3)\"" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" podUID="b0133cb5-e425-40af-a051-71b2362a89c3"
Mar 19 11:56:34.772031 master-0 kubenswrapper[7642]: I0319 11:56:34.771991 7642 scope.go:117] "RemoveContainer" containerID="daecc00cd0269edf3ea7180e1cd52281d3c5b230d8b9db4f7b6d733857fc8257"
Mar 19 11:56:34.772031 master-0 kubenswrapper[7642]: I0319 11:56:34.772024 7642 scope.go:117] "RemoveContainer" containerID="ec872b5060e715523dcf7612ce03c05df430a828612dcdbf8f83f43994a2ae84"
Mar 19 11:56:34.795048 master-0 kubenswrapper[7642]: I0319 11:56:34.793021 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 19 11:56:34.795048 master-0 kubenswrapper[7642]: I0319 11:56:34.793068 7642 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="187e2b2f-81ca-474f-a66d-ab3017134c21"
Mar 19 11:56:34.802682 master-0 kubenswrapper[7642]: I0319 11:56:34.799842 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 19 11:56:34.802682 master-0 kubenswrapper[7642]: I0319 11:56:34.799897 7642 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="187e2b2f-81ca-474f-a66d-ab3017134c21"
Mar 19 11:56:34.802682 master-0 kubenswrapper[7642]: I0319 11:56:34.800098 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" podStartSLOduration=69.800077734 podStartE2EDuration="1m9.800077734s" podCreationTimestamp="2026-03-19 11:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:56:34.756158076 +0000 UTC m=+152.873598873" watchObservedRunningTime="2026-03-19 11:56:34.800077734 +0000 UTC m=+152.917518521"
Mar 19 11:56:34.802682 master-0 kubenswrapper[7642]: I0319 11:56:34.802479 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pgl92" podStartSLOduration=90.872220914 podStartE2EDuration="1m40.802472355s" podCreationTimestamp="2026-03-19 11:54:54 +0000 UTC" firstStartedPulling="2026-03-19 11:54:56.071358575 +0000 UTC m=+54.188799372" lastFinishedPulling="2026-03-19 11:55:06.001610026 +0000 UTC m=+64.119050813" observedRunningTime="2026-03-19 11:56:34.793157221 +0000 UTC m=+152.910598018" watchObservedRunningTime="2026-03-19 11:56:34.802472355 +0000 UTC m=+152.919913142"
Mar 19 11:56:34.823017 master-0 kubenswrapper[7642]: I0319 11:56:34.822964 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r"]
Mar 19 11:56:34.837676 master-0 kubenswrapper[7642]: I0319 11:56:34.835225 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f69766c59-b588r"]
Mar 19 11:56:34.971975 master-0 kubenswrapper[7642]: I0319 11:56:34.968747 7642 scope.go:117] "RemoveContainer" containerID="a751b53c76eb83162f3d88044f06fc209b871d360365f0332303ca5b041a66eb"
Mar 19 11:56:35.064550 master-0 kubenswrapper[7642]: I0319 11:56:35.064371 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 19 11:56:35.070228 master-0 kubenswrapper[7642]: I0319 11:56:35.070182 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 19 11:56:35.087111 master-0 kubenswrapper[7642]: E0319 11:56:35.087061 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="the server was unable to return a response in the time allotted, but may still be processing the request (get leases.coordination.k8s.io master-0)" interval="200ms"
Mar 19 11:56:35.210606 master-0 kubenswrapper[7642]: I0319 11:56:35.210521 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cmnd7" podStartSLOduration=89.395822271 podStartE2EDuration="1m37.210495171s" podCreationTimestamp="2026-03-19 11:54:58 +0000 UTC" firstStartedPulling="2026-03-19 11:55:25.606048047 +0000 UTC m=+83.723488834" lastFinishedPulling="2026-03-19 11:55:33.420720947 +0000 UTC m=+91.538161734" observedRunningTime="2026-03-19 11:56:35.208610145 +0000 UTC m=+153.326050942" watchObservedRunningTime="2026-03-19 11:56:35.210495171 +0000 UTC m=+153.327935958"
Mar 19 11:56:35.223882 master-0 kubenswrapper[7642]: I0319 11:56:35.223836 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:56:35.237084 master-0 kubenswrapper[7642]: I0319 11:56:35.236927 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6ql8d" podStartSLOduration=88.397344364 podStartE2EDuration="1m31.236892795s" podCreationTimestamp="2026-03-19 11:55:04 +0000 UTC" firstStartedPulling="2026-03-19 11:55:25.606103318 +0000 UTC m=+83.723544105" lastFinishedPulling="2026-03-19 11:55:28.445651749 +0000 UTC m=+86.563092536" observedRunningTime="2026-03-19 11:56:35.235070401 +0000 UTC m=+153.352511188" watchObservedRunningTime="2026-03-19 11:56:35.236892795 +0000 UTC m=+153.354333582"
Mar 19 11:56:35.293300 master-0 kubenswrapper[7642]: I0319 11:56:35.293207 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vj6x4" podStartSLOduration=70.763851253 podStartE2EDuration="1m38.293181926s" podCreationTimestamp="2026-03-19 11:54:57 +0000 UTC" firstStartedPulling="2026-03-19 11:54:59.047611036 +0000 UTC m=+57.165051823" lastFinishedPulling="2026-03-19 11:55:26.576941689 +0000 UTC m=+84.694382496" observedRunningTime="2026-03-19 11:56:35.292112514 +0000 UTC m=+153.409553301" watchObservedRunningTime="2026-03-19 11:56:35.293181926 +0000 UTC m=+153.410622713"
Mar 19 11:56:35.427541 master-0 kubenswrapper[7642]: I0319 11:56:35.425832 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7tj9"]
Mar 19 11:56:35.430146 master-0 kubenswrapper[7642]: I0319 11:56:35.430086 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z7tj9"]
Mar 19 11:56:35.454364 master-0 kubenswrapper[7642]: I0319 11:56:35.454254 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-8xkv5_ac946049-4b93-4d8c-bfa6-eacf83160ed1/kube-controller-manager-operator/1.log"
Mar 19 11:56:35.454683 master-0 kubenswrapper[7642]: I0319 11:56:35.454414 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerStarted","Data":"c8521feb576750f4b4cf92dbf1c496a19af50a44e812267935ec3c28f71bb348"}
Mar 19 11:56:35.472584 master-0 kubenswrapper[7642]: I0319 11:56:35.464090 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-5qtx7_b0133cb5-e425-40af-a051-71b2362a89c3/network-operator/1.log"
Mar 19 11:56:35.472584 master-0 kubenswrapper[7642]: I0319 11:56:35.467844 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tp448"]
Mar 19 11:56:35.472584 master-0 kubenswrapper[7642]: I0319 11:56:35.467884 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" event={"ID":"6d017409-a362-4e58-87af-d02b3834fdd4","Type":"ContainerStarted","Data":"016ff43860287ef9ea9c7834d6813f132931ea72467cbcc206c5b8bf5a2a1e8b"}
Mar 19 11:56:35.472584 master-0 kubenswrapper[7642]: I0319 11:56:35.469908 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" event={"ID":"aab47a39-a18b-4350-b141-4d5de2d8e96f","Type":"ContainerStarted","Data":"3385d26311ce81e00023dfaf9b5057451bdc1320414436c96c2f92229792ca5f"}
Mar 19 11:56:35.472584 master-0 kubenswrapper[7642]: I0319 11:56:35.470667 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tp448"]
Mar 19 11:56:35.475383 master-0 kubenswrapper[7642]: I0319 11:56:35.475346 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-47j68_51e2c1d1-a50f-4a0d-8273-440e699b3907/approver/0.log"
Mar 19 11:56:35.476285 master-0 kubenswrapper[7642]: I0319 11:56:35.476013 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerStarted","Data":"6efb9a3f210b061f188c57eb6303722d321f0f70ab8c3b6975d1007d3830e3ef"}
Mar 19 11:56:35.492680 master-0 kubenswrapper[7642]: I0319 11:56:35.492577 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 19 11:56:35.494293 master-0 kubenswrapper[7642]: I0319 11:56:35.494240 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 19 11:56:35.496726 master-0 kubenswrapper[7642]: I0319 11:56:35.496694 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-5c7d7c6bb8-gnz2z_66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/oauth-apiserver/0.log"
Mar 19 11:56:35.497124 master-0 kubenswrapper[7642]: I0319 11:56:35.497095 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" event={"ID":"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e","Type":"ContainerStarted","Data":"5715f8e1d312ac24abd3853bc91bc1dc47d69fdf6e3f8c0fa99413c9a22da64c"}
Mar 19 11:56:35.507771 master-0 kubenswrapper[7642]: I0319 11:56:35.506243 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-ptjr9_84e5577e-df8b-46a2-aff7-0bf90b5010d9/openshift-apiserver-operator/1.log"
Mar 19 11:56:35.507771 master-0 kubenswrapper[7642]: I0319 11:56:35.506308 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerStarted","Data":"d743fbff3bc780b360a4927f7a272f0fb5047ed94867c7848a01cf51863441ae"}
Mar 19 11:56:35.510619 master-0 kubenswrapper[7642]: I0319 11:56:35.510593 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_de6765b7-3576-4171-972b-20d62d81f25d/installer/0.log"
Mar 19 11:56:35.510711 master-0 kubenswrapper[7642]: I0319 11:56:35.510630 7642 generic.go:334] "Generic (PLEG): container finished" podID="de6765b7-3576-4171-972b-20d62d81f25d" containerID="f6792bda14390e5172a259445357feef34047b3b4035a2f72b67e0cd845a049a" exitCode=1
Mar 19 11:56:35.510711 master-0 kubenswrapper[7642]: I0319 11:56:35.510699 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"de6765b7-3576-4171-972b-20d62d81f25d","Type":"ContainerDied","Data":"f6792bda14390e5172a259445357feef34047b3b4035a2f72b67e0cd845a049a"}
Mar 19 11:56:35.514597 master-0 kubenswrapper[7642]: I0319 11:56:35.514560 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" event={"ID":"bea42722-d7af-4938-9f3a-da57bd056f96","Type":"ContainerStarted","Data":"fc31578b7ae316f99c92805e4da52c101e78e491450534381d082992bf61e52a"}
Mar 19 11:56:35.516968 master-0 kubenswrapper[7642]: I0319 11:56:35.516912 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" event={"ID":"367ea4e8-8172-4730-8f26-499ed7c167bb","Type":"ContainerStarted","Data":"6c401036ffac2cfdbd3366c4981a259963d196c9de9faa702ed7fb448793228c"}
Mar 19 11:56:35.534680 master-0 kubenswrapper[7642]: I0319 11:56:35.534426 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4h84v" podStartSLOduration=87.898426741 podStartE2EDuration="1m30.534344248s" podCreationTimestamp="2026-03-19 11:55:05 +0000 UTC" firstStartedPulling="2026-03-19 11:55:25.692329735 +0000 UTC m=+83.809770522" lastFinishedPulling="2026-03-19 11:55:28.328247242 +0000 UTC m=+86.445688029" observedRunningTime="2026-03-19 11:56:35.513231229 +0000 UTC m=+153.630672016" watchObservedRunningTime="2026-03-19 11:56:35.534344248 +0000 UTC m=+153.651785355"
Mar 19 11:56:35.597888 master-0 kubenswrapper[7642]: I0319 11:56:35.597756 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podStartSLOduration=116.597735497 podStartE2EDuration="1m56.597735497s" podCreationTimestamp="2026-03-19 11:54:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:56:35.595230834 +0000 UTC m=+153.712671641" watchObservedRunningTime="2026-03-19 11:56:35.597735497 +0000 UTC m=+153.715176294"
Mar 19 11:56:36.079593 master-0 kubenswrapper[7642]: I0319 11:56:36.079535 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42edeefc-c920-4cd7-88b0-8c1e660abf63" path="/var/lib/kubelet/pods/42edeefc-c920-4cd7-88b0-8c1e660abf63/volumes"
Mar 19 11:56:36.081597 master-0 kubenswrapper[7642]: I0319 11:56:36.081553 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6191b9b7-8e43-4006-ab25-734c23d34cf6" path="/var/lib/kubelet/pods/6191b9b7-8e43-4006-ab25-734c23d34cf6/volumes"
Mar 19 11:56:36.083806 master-0 kubenswrapper[7642]: I0319 11:56:36.083747 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="751316a7-cd5f-4787-b24f-f1cddeb6a29e" path="/var/lib/kubelet/pods/751316a7-cd5f-4787-b24f-f1cddeb6a29e/volumes"
Mar 19 11:56:36.084400 master-0 kubenswrapper[7642]: I0319 11:56:36.084364 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c54300b-f091-488f-b7e6-0fc27d0453e6" path="/var/lib/kubelet/pods/7c54300b-f091-488f-b7e6-0fc27d0453e6/volumes"
Mar 19 11:56:36.084985 master-0 kubenswrapper[7642]: I0319 11:56:36.084947 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32" path="/var/lib/kubelet/pods/a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32/volumes"
Mar 19 11:56:36.332350 master-0 kubenswrapper[7642]: E0319 11:56:36.332065 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:56:26Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:56:26Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:56:26Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T11:56:26Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:3ea089ab116e164d89b46dc077f87d9af22f525bc2d69403214f77ee3fd30161\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d9cbffb5a2fd538c8f19b7174d2906286acdb37a574b9dce3f9da302074591ff\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746416849},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2abc1fd79e7781634ed5ed9e8f2b98b9094ea51f40ac3a773c5e5224607bf3d7\\\"],\\\"sizeBytes\\\":1637455533},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c9f7bbe4799eaacbfbb60eb906000d7a813a580d6a9740def7da774cbc4cf859\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cde1da53dadc54c24c10cab8fd3e67839ce68c33ec3b556c255a79167881966a\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252053726},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a55ec7ec64efd0f595d8084377b7e463a1807829b7617e5d4a9092dcd924c36\\\"],\\\"sizeBytes\\\":1238100502},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:aefc421cf2f5dba925f7c149d56ce14e910fbd969a4e22b5917fc912ca33a5b2\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:da1ee8c9ae2cb275833f329b3d793a9109915be16d938f208ec917b50d9dd66a\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223644894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:af0fe0ca926422a6471d5bf22fc0e682c36c24fba05496a3bdfac0b7d3733015\\\"],\\\"sizeBytes\\\":991832673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\\\"],\\\"sizeBytes\\\":943841779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b00c42562d477ef44d51f35950253a26d7debc7de86e53270831aafef5795c1\\\"],\\\"sizeBytes\\\":918289953},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0a09f5a3ba4f60cce0145769509bab92553c8075d210af4ac058965d2ae11efa\\\"],\\\"sizeBytes\\\":876160834},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6bba3f73c0066e42b24839e0d29f5dce2f36436f0a11f9f5e1029bccc5ed6578\\\"],\\\"sizeBytes\\\":862657321},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a9e8da5c6114f062b814936d4db7a47a04d248e160d6bb28ad4e4a081496ee4\\\"],\\\"sizeBytes\\\":772943435},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1e1faad2d9167d84e23585c1cea5962301845548043cf09578f943f79ca98016\\\"],\\\"sizeBytes\\\":687949580},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5e12e4dc52214d3ada5ba5106caebe079eac1d9292c2571a5fe83411ce8e900d\\\"],\\\"sizeBytes\\\":683195416},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa5e782406f71c048b1ac3a4bf5d1227ff4be81111114083ad4c7a209c6bfb5a\\\"],\\\"sizeBytes\\\":677942383},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ec8fd46dfb35ed10e8f98933166f69ce579c2f35b8db03d21e4c34fc544553e4\\\"],\\\"sizeBytes\\\":621648710},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae50e496bd6ae2d27298d997470b7cb0a426eeb8b7e2e9c7187a34cb03993998\\\"],\\\"sizeBytes\\\":589386806},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c6a4383333a1fd6d05c3f60ec793913f7937ee3d77f002d85e6c61e20507bf55\\\"],\\\"sizeBytes\\\":582154903},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2dd7a03348212e49876f5359f233d893a541ed9b934df390201a05133a06982\\\"],\\\"sizeBytes\\\":558211175},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7af9f5c5af9d529840233ef4b519120cc0e3f14c4fe28cc43b0823f2c11d8f89\\\"],\\\"sizeBytes\\\":548752816},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e29dc9f042f2d0471171a0611070886cb2f7c57338ab7f112613417bcd33b278\\\"],\\\"sizeBytes\\\":529326739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b4c9cf268bb7abef7af187cd775d3f74d0bd33626250095428d53b705ee946\\\"],\\\"sizeBytes\\\":528956487},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a971d5889f167cfe61a64c366424b87c17a6dc141ffcc43406cdcbb50cae2a\\\"],\\\"sizeBytes\\\":518384969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:59727c4b3fef19e5149675cf3350735bbfe2c6588a57654b2e4552dd719f58b1\\\"],\\\"sizeBytes\\\":517999161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\\\"],\\\"sizeBytes\\\":514984269},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bfe394b58ec6195de8b8420e781b7630d85a412b9112d892fea903f92b783427\\\"],\\\"sizeBytes\\\":513221333},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f23bac0a2a6cfd638e4af679dc787a8790d99c391f6e2
ade8087dc477ff765e\\\"],\\\"sizeBytes\\\":512274055},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:77fff570657d2fa0bfb709b2c8b6665bae0bf90a2be981d8dbca56c674715098\\\"],\\\"sizeBytes\\\":511227324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adb9f6f2fd701863c7caed747df43f83d3569ba9388cfa33ea7219ac6a606b11\\\"],\\\"sizeBytes\\\":511164375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c032f87ae61d6f757ff3ce52620a70a43516591987731f25da77aba152f17458\\\"],\\\"sizeBytes\\\":508888171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:812819a9d712b9e345ef5f1404b242c281e2518ad724baebc393ec0fd3b3d263\\\"],\\\"sizeBytes\\\":508544745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:313d1d8ca85e65236a59f058a3316c49436dde691b3a3930d5bc5e3b4b8c8a71\\\"],\\\"sizeBytes\\\":507972093},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c527b4e8239a1f4f4e0a851113e7dd633b7dcb9d75b0e7b21c23d26304abcb3\\\"],\\\"sizeBytes\\\":506480167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ef199844317b7b012879ed8d29f9b6bc37fad8a6fdb336103cbd5cabc74c4302\\\"],\\\"sizeBytes\\\":506395599},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7d4a034950346bcd4e36e9e2f1343e0cf7a10cf544963f33d09c7eb2a1bfc634\\\"],\\\"sizeBytes\\\":505345991},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1fbbcb390de2563a0177b92fba1b5a65777366e2dc80e2808b61d87c41b47a2d\\\"],\\\"sizeBytes\\\":505246690},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c983016b9ceed0fca1f51bd49c2653243c7e5af91cbf2f478b091db6e028252\\\"],\\\"sizeBytes\\\":504625081},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:712d334b7752d95580571059aae2c50e111d879af4fd8ea7cc3dbaf1a8e7dc69\\\"],\\\"sizeBytes\\\":495994673},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:4b5ea1ef4e09b673a0c68c8848ca162ab11d9ac373a377daa52dea702ffa3023\\\"],\\\"sizeBytes\\\":495065340},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:446bedea4916d3c1ee52be94137e484659e9561bd1de95c8189eee279aae984b\\\"],\\\"sizeBytes\\\":487096305},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a746a87b784ea1caa278fd0e012554f9df520b6fff665ea0bc4c83f487fed113\\\"],\\\"sizeBytes\\\":484450894},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c4d5a681595e428ff4b5083648c13615eed80be9084a3d3fc68a0295079cb12\\\"],\\\"sizeBytes\\\":484187929},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f933312f49083e8746fc41ab5e46a9a757b448374f14971e256ebcb36f11dd97\\\"],\\\"sizeBytes\\\":470826739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ea5c8a93f30e0a4932da5697d22c0da7eda9a7035c0555eb006b6755e62bb2fc\\\"],\\\"sizeBytes\\\":468265024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d12d0dc7eb86bbedf6b2d7689a28fd51f0d928f720e4a6783744304297c661ed\\\"],\\\"sizeBytes\\\":465090934},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9609c00207cc4db97f0fd6162eb429d7f81654137f020a677e30cba26a887a24\\\"],\\\"sizeBytes\\\":463705930},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:632e80bba5077068ecca05fddb95aedebad4493af6f36152c01c6ae490975b62\\\"],\\\"sizeBytes\\\":458126937},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bcb08821551e9a5b9f82aa794bcea673279cefb93cb47492e19ccac5e2cf18fe\\\"],\\\"sizeBytes\\\":456576198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:759fb1d5353dbbadd443f38631d977ca3aed9787b873be05cc9660532a252739\\\"],\\\"sizeBytes\\\":448828620},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3062f6485aec4770e60852b535c69a42527b305161fe856499c8658ead6d1e85\\\
"],\\\"sizeBytes\\\":448042136}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 11:56:36.527814 master-0 kubenswrapper[7642]: I0319 11:56:36.527745 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_eb4c5b33-58c8-4b51-b71b-1befc165f2b1/installer/0.log" Mar 19 11:56:36.527814 master-0 kubenswrapper[7642]: I0319 11:56:36.527806 7642 generic.go:334] "Generic (PLEG): container finished" podID="eb4c5b33-58c8-4b51-b71b-1befc165f2b1" containerID="bff053145b8a60a46c397e36d3fff2a19f60020e8f6b7b60fcc48ee33442df16" exitCode=1 Mar 19 11:56:36.528687 master-0 kubenswrapper[7642]: I0319 11:56:36.528489 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"eb4c5b33-58c8-4b51-b71b-1befc165f2b1","Type":"ContainerDied","Data":"bff053145b8a60a46c397e36d3fff2a19f60020e8f6b7b60fcc48ee33442df16"} Mar 19 11:56:36.820700 master-0 kubenswrapper[7642]: I0319 11:56:36.816279 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_de6765b7-3576-4171-972b-20d62d81f25d/installer/0.log" Mar 19 11:56:36.820700 master-0 kubenswrapper[7642]: I0319 11:56:36.816536 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 11:56:36.973269 master-0 kubenswrapper[7642]: I0319 11:56:36.973201 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de6765b7-3576-4171-972b-20d62d81f25d-kube-api-access\") pod \"de6765b7-3576-4171-972b-20d62d81f25d\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " Mar 19 11:56:36.973269 master-0 kubenswrapper[7642]: I0319 11:56:36.973269 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-kubelet-dir\") pod \"de6765b7-3576-4171-972b-20d62d81f25d\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " Mar 19 11:56:36.973557 master-0 kubenswrapper[7642]: I0319 11:56:36.973341 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-var-lock\") pod \"de6765b7-3576-4171-972b-20d62d81f25d\" (UID: \"de6765b7-3576-4171-972b-20d62d81f25d\") " Mar 19 11:56:36.973557 master-0 kubenswrapper[7642]: I0319 11:56:36.973436 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "de6765b7-3576-4171-972b-20d62d81f25d" (UID: "de6765b7-3576-4171-972b-20d62d81f25d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:56:36.973557 master-0 kubenswrapper[7642]: I0319 11:56:36.973493 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-var-lock" (OuterVolumeSpecName: "var-lock") pod "de6765b7-3576-4171-972b-20d62d81f25d" (UID: "de6765b7-3576-4171-972b-20d62d81f25d"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:56:36.973720 master-0 kubenswrapper[7642]: I0319 11:56:36.973697 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:56:36.973720 master-0 kubenswrapper[7642]: I0319 11:56:36.973717 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de6765b7-3576-4171-972b-20d62d81f25d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:56:36.976133 master-0 kubenswrapper[7642]: I0319 11:56:36.976099 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6765b7-3576-4171-972b-20d62d81f25d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "de6765b7-3576-4171-972b-20d62d81f25d" (UID: "de6765b7-3576-4171-972b-20d62d81f25d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:56:37.075592 master-0 kubenswrapper[7642]: I0319 11:56:37.074996 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de6765b7-3576-4171-972b-20d62d81f25d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:56:37.534061 master-0 kubenswrapper[7642]: I0319 11:56:37.533995 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_de6765b7-3576-4171-972b-20d62d81f25d/installer/0.log" Mar 19 11:56:37.534581 master-0 kubenswrapper[7642]: I0319 11:56:37.534163 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 19 11:56:37.534581 master-0 kubenswrapper[7642]: I0319 11:56:37.534211 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"de6765b7-3576-4171-972b-20d62d81f25d","Type":"ContainerDied","Data":"503bc58279f464c4d0d4ae650c56c03affe2a73598df280a9e969caf566182ed"} Mar 19 11:56:37.534581 master-0 kubenswrapper[7642]: I0319 11:56:37.534241 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="503bc58279f464c4d0d4ae650c56c03affe2a73598df280a9e969caf566182ed" Mar 19 11:56:37.796148 master-0 kubenswrapper[7642]: I0319 11:56:37.796039 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_eb4c5b33-58c8-4b51-b71b-1befc165f2b1/installer/0.log" Mar 19 11:56:37.796148 master-0 kubenswrapper[7642]: I0319 11:56:37.796145 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:56:37.887672 master-0 kubenswrapper[7642]: I0319 11:56:37.887336 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-var-lock\") pod \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " Mar 19 11:56:37.887672 master-0 kubenswrapper[7642]: I0319 11:56:37.887475 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kube-api-access\") pod \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " Mar 19 11:56:37.887672 master-0 kubenswrapper[7642]: I0319 11:56:37.887529 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kubelet-dir\") pod \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\" (UID: \"eb4c5b33-58c8-4b51-b71b-1befc165f2b1\") " Mar 19 11:56:37.888028 master-0 kubenswrapper[7642]: I0319 11:56:37.887796 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-var-lock" (OuterVolumeSpecName: "var-lock") pod "eb4c5b33-58c8-4b51-b71b-1befc165f2b1" (UID: "eb4c5b33-58c8-4b51-b71b-1befc165f2b1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:56:37.888028 master-0 kubenswrapper[7642]: I0319 11:56:37.887910 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eb4c5b33-58c8-4b51-b71b-1befc165f2b1" (UID: "eb4c5b33-58c8-4b51-b71b-1befc165f2b1"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:56:37.893662 master-0 kubenswrapper[7642]: I0319 11:56:37.891259 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eb4c5b33-58c8-4b51-b71b-1befc165f2b1" (UID: "eb4c5b33-58c8-4b51-b71b-1befc165f2b1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:56:37.988932 master-0 kubenswrapper[7642]: I0319 11:56:37.988866 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:56:37.988932 master-0 kubenswrapper[7642]: I0319 11:56:37.988907 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:56:37.988932 master-0 kubenswrapper[7642]: I0319 11:56:37.988935 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb4c5b33-58c8-4b51-b71b-1befc165f2b1-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:56:38.542237 master-0 kubenswrapper[7642]: I0319 11:56:38.542160 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_eb4c5b33-58c8-4b51-b71b-1befc165f2b1/installer/0.log" Mar 19 11:56:38.542237 master-0 kubenswrapper[7642]: I0319 11:56:38.542243 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"eb4c5b33-58c8-4b51-b71b-1befc165f2b1","Type":"ContainerDied","Data":"4d0a460410b93716bcb9bd13e502416121bea8d7a0c8c686d890ff048f639230"} Mar 19 11:56:38.543277 master-0 kubenswrapper[7642]: I0319 11:56:38.542292 
7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d0a460410b93716bcb9bd13e502416121bea8d7a0c8c686d890ff048f639230" Mar 19 11:56:38.543277 master-0 kubenswrapper[7642]: I0319 11:56:38.542367 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 19 11:56:39.172884 master-0 kubenswrapper[7642]: I0319 11:56:39.172793 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 19 11:56:39.172884 master-0 kubenswrapper[7642]: I0319 11:56:39.172893 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 19 11:56:39.200652 master-0 kubenswrapper[7642]: I0319 11:56:39.200576 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 19 11:56:39.561744 master-0 kubenswrapper[7642]: I0319 11:56:39.561540 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:56:40.224416 master-0 kubenswrapper[7642]: I0319 11:56:40.224318 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 11:56:40.224416 master-0 kubenswrapper[7642]: I0319 11:56:40.224431 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 11:56:41.326773 master-0 kubenswrapper[7642]: I0319 11:56:41.326696 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:56:42.562858 master-0 kubenswrapper[7642]: I0319 11:56:42.562557 7642 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" 
containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:56:44.193745 master-0 kubenswrapper[7642]: I0319 11:56:44.193685 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 19 11:56:46.068232 master-0 kubenswrapper[7642]: I0319 11:56:46.068154 7642 scope.go:117] "RemoveContainer" containerID="0cc3ae6879b287fcc4bae5ddccb1f2e7e024042e8efb6dfd62d69b2efa286229" Mar 19 11:56:46.607198 master-0 kubenswrapper[7642]: I0319 11:56:46.607151 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-5qtx7_b0133cb5-e425-40af-a051-71b2362a89c3/network-operator/1.log" Mar 19 11:56:46.607459 master-0 kubenswrapper[7642]: I0319 11:56:46.607243 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerStarted","Data":"a0078d1886890ad6f9b53500e856885a52b81c9832530c9463ebafbbef1dc099"} Mar 19 11:56:47.782242 master-0 kubenswrapper[7642]: E0319 11:56:47.782146 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 11:56:50.225237 master-0 kubenswrapper[7642]: I0319 11:56:50.225162 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 11:56:50.226104 master-0 kubenswrapper[7642]: I0319 11:56:50.225872 7642 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:56:52.561297 master-0 kubenswrapper[7642]: I0319 11:56:52.561166 7642 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:57:00.226696 master-0 kubenswrapper[7642]: I0319 11:57:00.226551 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 11:57:00.227788 master-0 kubenswrapper[7642]: I0319 11:57:00.227732 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.128.0.56:8443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 11:57:01.639387 master-0 kubenswrapper[7642]: E0319 11:57:01.639296 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 19 11:57:02.466991 master-0 kubenswrapper[7642]: I0319 11:57:02.466940 7642 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:57:02.474977 master-0 kubenswrapper[7642]: I0319 11:57:02.474912 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701006 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"] Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: E0319 11:57:02.701255 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6191b9b7-8e43-4006-ab25-734c23d34cf6" containerName="extract-content" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701269 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="6191b9b7-8e43-4006-ab25-734c23d34cf6" containerName="extract-content" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: E0319 11:57:02.701278 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01529a13-e1b1-40a1-9fff-8a20f1cba68c" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701284 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="01529a13-e1b1-40a1-9fff-8a20f1cba68c" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: E0319 11:57:02.701295 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701302 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: E0319 11:57:02.701312 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6765b7-3576-4171-972b-20d62d81f25d" containerName="installer" Mar 19 
11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701320 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6765b7-3576-4171-972b-20d62d81f25d" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: E0319 11:57:02.701332 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6191b9b7-8e43-4006-ab25-734c23d34cf6" containerName="extract-utilities" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701339 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="6191b9b7-8e43-4006-ab25-734c23d34cf6" containerName="extract-utilities" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: E0319 11:57:02.701353 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42edeefc-c920-4cd7-88b0-8c1e660abf63" containerName="extract-content" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701359 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="42edeefc-c920-4cd7-88b0-8c1e660abf63" containerName="extract-content" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: E0319 11:57:02.701368 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42edeefc-c920-4cd7-88b0-8c1e660abf63" containerName="extract-utilities" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701374 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="42edeefc-c920-4cd7-88b0-8c1e660abf63" containerName="extract-utilities" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: E0319 11:57:02.701385 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4c5b33-58c8-4b51-b71b-1befc165f2b1" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701393 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4c5b33-58c8-4b51-b71b-1befc165f2b1" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701481 7642 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="eb4c5b33-58c8-4b51-b71b-1befc165f2b1" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701493 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="01529a13-e1b1-40a1-9fff-8a20f1cba68c" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701503 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6765b7-3576-4171-972b-20d62d81f25d" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701517 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="6191b9b7-8e43-4006-ab25-734c23d34cf6" containerName="extract-content" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701525 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3cbb4c1-3ea2-41b1-847b-b886e2c4ae32" containerName="installer" Mar 19 11:57:02.701659 master-0 kubenswrapper[7642]: I0319 11:57:02.701531 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="42edeefc-c920-4cd7-88b0-8c1e660abf63" containerName="extract-content" Mar 19 11:57:02.702997 master-0 kubenswrapper[7642]: I0319 11:57:02.701989 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 11:57:02.705939 master-0 kubenswrapper[7642]: I0319 11:57:02.705882 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79589db566-gb7tq"] Mar 19 11:57:02.706707 master-0 kubenswrapper[7642]: I0319 11:57:02.706682 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.707090 master-0 kubenswrapper[7642]: I0319 11:57:02.707059 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-62ztt"
Mar 19 11:57:02.714871 master-0 kubenswrapper[7642]: I0319 11:57:02.714823 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 11:57:02.715310 master-0 kubenswrapper[7642]: I0319 11:57:02.715196 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 11:57:02.715726 master-0 kubenswrapper[7642]: I0319 11:57:02.715334 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 11:57:02.715726 master-0 kubenswrapper[7642]: I0319 11:57:02.715434 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 11:57:02.715726 master-0 kubenswrapper[7642]: I0319 11:57:02.715497 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 11:57:02.715726 master-0 kubenswrapper[7642]: I0319 11:57:02.715524 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 11:57:02.715726 master-0 kubenswrapper[7642]: I0319 11:57:02.715520 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 11:57:02.716327 master-0 kubenswrapper[7642]: I0319 11:57:02.716162 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 11:57:02.728245 master-0 kubenswrapper[7642]: I0319 11:57:02.726744 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 11:57:02.728245 master-0 kubenswrapper[7642]: I0319 11:57:02.726958 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 11:57:02.728559 master-0 kubenswrapper[7642]: I0319 11:57:02.728529 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 11:57:02.730604 master-0 kubenswrapper[7642]: I0319 11:57:02.730567 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"]
Mar 19 11:57:02.750342 master-0 kubenswrapper[7642]: I0319 11:57:02.750300 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:57:02.754748 master-0 kubenswrapper[7642]: I0319 11:57:02.754671 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/0.log"
Mar 19 11:57:02.754927 master-0 kubenswrapper[7642]: I0319 11:57:02.754752 7642 generic.go:334] "Generic (PLEG): container finished" podID="f38df702-a256-4f83-90bb-0c0e477a2ce6" containerID="a33c34d1a46884ff2764e0114249d4b1bcbad80e05aa9b1072bd5b5574041486" exitCode=1
Mar 19 11:57:02.755995 master-0 kubenswrapper[7642]: I0319 11:57:02.755011 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerDied","Data":"a33c34d1a46884ff2764e0114249d4b1bcbad80e05aa9b1072bd5b5574041486"}
Mar 19 11:57:02.755995 master-0 kubenswrapper[7642]: I0319 11:57:02.755550 7642 scope.go:117] "RemoveContainer" containerID="a33c34d1a46884ff2764e0114249d4b1bcbad80e05aa9b1072bd5b5574041486"
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: I0319 11:57:02.760918 7642 patch_prober.go:28] interesting pod/apiserver-5c7d7c6bb8-gnz2z container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]log ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]etcd excluded: ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]etcd-readiness excluded: ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [-]informer-sync failed: reason withheld
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]poststarthook/max-in-flight-filter ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]poststarthook/openshift.io-StartUserInformer ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]poststarthook/openshift.io-StartOAuthInformer ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: [+]shutdown ok
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: readyz check failed
Mar 19 11:57:02.763699 master-0 kubenswrapper[7642]: I0319 11:57:02.761046 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" podUID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:57:02.768929 master-0 kubenswrapper[7642]: I0319 11:57:02.766276 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openshift-controller-manager/controller-manager-79589db566-gb7tq"]
Mar 19 11:57:02.854728 master-0 kubenswrapper[7642]: I0319 11:57:02.854580 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.855037 master-0 kubenswrapper[7642]: I0319 11:57:02.855006 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwmkf\" (UniqueName: \"kubernetes.io/projected/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-kube-api-access-nwmkf\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.855157 master-0 kubenswrapper[7642]: I0319 11:57:02.855145 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.855252 master-0 kubenswrapper[7642]: I0319 11:57:02.855240 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.855350 master-0 kubenswrapper[7642]: I0319 11:57:02.855332 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64sr2\" (UniqueName: \"kubernetes.io/projected/819c8a76-019b-4124-a67f-fb5838cbec5d-kube-api-access-64sr2\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.855743 master-0 kubenswrapper[7642]: I0319 11:57:02.855723 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.856366 master-0 kubenswrapper[7642]: I0319 11:57:02.856341 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.856465 master-0 kubenswrapper[7642]: I0319 11:57:02.856421 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.856465 master-0 kubenswrapper[7642]: I0319 11:57:02.856454 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.958441 master-0 kubenswrapper[7642]: I0319 11:57:02.957339 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.958441 master-0 kubenswrapper[7642]: I0319 11:57:02.957412 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwmkf\" (UniqueName: \"kubernetes.io/projected/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-kube-api-access-nwmkf\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.958441 master-0 kubenswrapper[7642]: I0319 11:57:02.957446 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.958441 master-0 kubenswrapper[7642]: I0319 11:57:02.957475 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") "
pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.958441 master-0 kubenswrapper[7642]: I0319 11:57:02.957501 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64sr2\" (UniqueName: \"kubernetes.io/projected/819c8a76-019b-4124-a67f-fb5838cbec5d-kube-api-access-64sr2\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.958441 master-0 kubenswrapper[7642]: I0319 11:57:02.957521 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.958441 master-0 kubenswrapper[7642]: I0319 11:57:02.957547 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.958441 master-0 kubenswrapper[7642]: I0319 11:57:02.957566 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.958441 master-0 kubenswrapper[7642]: I0319 11:57:02.957585 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.959269 master-0 kubenswrapper[7642]: I0319 11:57:02.959231 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.959736 master-0 kubenswrapper[7642]: I0319 11:57:02.959683 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.960705 master-0 kubenswrapper[7642]: I0319 11:57:02.960655 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.962675 master-0 kubenswrapper[7642]: I0319 11:57:02.961214 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.962675 master-0 kubenswrapper[7642]: I0319 11:57:02.961502 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.972621 master-0 kubenswrapper[7642]: I0319 11:57:02.965282 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:02.972621 master-0 kubenswrapper[7642]: I0319 11:57:02.965516 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:02.982231 master-0 kubenswrapper[7642]: I0319 11:57:02.982116 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwmkf\" (UniqueName: \"kubernetes.io/projected/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-kube-api-access-nwmkf\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:03.002665 master-0 kubenswrapper[7642]: I0319 11:57:03.002596 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64sr2\" (UniqueName: \"kubernetes.io/projected/819c8a76-019b-4124-a67f-fb5838cbec5d-kube-api-access-64sr2\") pod
\"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:03.026217 master-0 kubenswrapper[7642]: I0319 11:57:03.026153 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:03.055850 master-0 kubenswrapper[7642]: I0319 11:57:03.053148 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:03.473616 master-0 kubenswrapper[7642]: I0319 11:57:03.473552 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"]
Mar 19 11:57:03.481589 master-0 kubenswrapper[7642]: W0319 11:57:03.481551 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12cbb2fe_3077_4b69_a6e1_ee5663fe7bbc.slice/crio-9bc0a922f5672d138d9b5c38908ff9271f1747dee23c3b328889da1f8a25634f WatchSource:0}: Error finding container 9bc0a922f5672d138d9b5c38908ff9271f1747dee23c3b328889da1f8a25634f: Status 404 returned error can't find the container with id 9bc0a922f5672d138d9b5c38908ff9271f1747dee23c3b328889da1f8a25634f
Mar 19 11:57:03.547327 master-0 kubenswrapper[7642]: I0319 11:57:03.547257 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79589db566-gb7tq"]
Mar 19 11:57:03.766193 master-0 kubenswrapper[7642]: I0319 11:57:03.766048 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" event={"ID":"819c8a76-019b-4124-a67f-fb5838cbec5d","Type":"ContainerStarted","Data":"171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464"}
Mar 19 11:57:03.766193 master-0 kubenswrapper[7642]: I0319 11:57:03.766119 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" event={"ID":"819c8a76-019b-4124-a67f-fb5838cbec5d","Type":"ContainerStarted","Data":"a297c2fb1d857d07bfeb6ac6ca57bb65e15ebd85864af6b5b40028961499f08b"}
Mar 19 11:57:03.767485 master-0 kubenswrapper[7642]: I0319 11:57:03.767458 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:03.770381 master-0 kubenswrapper[7642]: I0319 11:57:03.769914 7642 patch_prober.go:28] interesting pod/controller-manager-79589db566-gb7tq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.62:8443/healthz\": dial tcp 10.128.0.62:8443: connect: connection refused" start-of-body=
Mar 19 11:57:03.770381 master-0 kubenswrapper[7642]: I0319 11:57:03.770006 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" podUID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.62:8443/healthz\": dial tcp 10.128.0.62:8443: connect: connection refused"
Mar 19 11:57:03.770381 master-0 kubenswrapper[7642]: I0319 11:57:03.770290 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" event={"ID":"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc","Type":"ContainerStarted","Data":"da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b"}
Mar 19 11:57:03.770381 master-0 kubenswrapper[7642]: I0319 11:57:03.770347 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" event={"ID":"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc","Type":"ContainerStarted","Data":"9bc0a922f5672d138d9b5c38908ff9271f1747dee23c3b328889da1f8a25634f"}
Mar 19 11:57:03.770876 master-0 kubenswrapper[7642]: I0319 11:57:03.770849 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:03.773097 master-0 kubenswrapper[7642]: I0319 11:57:03.773069 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/0.log"
Mar 19 11:57:03.773179 master-0 kubenswrapper[7642]: I0319 11:57:03.773116 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerStarted","Data":"7316631a0f45a2c1ef1105021be26127f5494db22ff552f0eb4f6711db06819f"}
Mar 19 11:57:03.789749 master-0 kubenswrapper[7642]: I0319 11:57:03.789627 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" podStartSLOduration=116.788396107 podStartE2EDuration="1m56.788396107s" podCreationTimestamp="2026-03-19 11:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:03.785511522 +0000 UTC m=+181.902952309" watchObservedRunningTime="2026-03-19 11:57:03.788396107 +0000 UTC m=+181.905836894"
Mar 19 11:57:03.847009 master-0 kubenswrapper[7642]: I0319 11:57:03.846115 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" podStartSLOduration=116.846090509 podStartE2EDuration="1m56.846090509s" podCreationTimestamp="2026-03-19 11:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:03.844719309 +0000 UTC m=+181.962160126" watchObservedRunningTime="2026-03-19 11:57:03.846090509 +0000 UTC m=+181.963531296"
Mar 19 11:57:03.927407 master-0 kubenswrapper[7642]: I0319 11:57:03.927332 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 11:57:04.784097 master-0 kubenswrapper[7642]: I0319 11:57:04.784015 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 11:57:05.199254 master-0 kubenswrapper[7642]: I0319 11:57:05.199195 7642 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 19 11:57:05.199619 master-0 kubenswrapper[7642]: I0319 11:57:05.199564 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler" containerID="cri-o://9704f6068bbc3b7071c181549835a51a1db8ec178675b82fbcbeb90ed4c62c0b" gracePeriod=30
Mar 19 11:57:05.209666 master-0 kubenswrapper[7642]: I0319 11:57:05.207717 7642 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 19 11:57:05.209666 master-0 kubenswrapper[7642]: E0319 11:57:05.208231 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 11:57:05.209666 master-0 kubenswrapper[7642]: I0319 11:57:05.208247 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 11:57:05.209666 master-0 kubenswrapper[7642]: I0319 11:57:05.208682 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 11:57:05.209666 master-0 kubenswrapper[7642]: I0319 11:57:05.208695 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 11:57:05.209666 master-0 kubenswrapper[7642]: E0319 11:57:05.209442 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 11:57:05.209666 master-0 kubenswrapper[7642]: I0319 11:57:05.209463 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83737980b9ee109184b1d78e942cf36" containerName="kube-scheduler"
Mar 19 11:57:05.218143 master-0 kubenswrapper[7642]: I0319 11:57:05.215048 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 11:57:05.229040 master-0 kubenswrapper[7642]: I0319 11:57:05.228994 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 11:57:05.291237 master-0 kubenswrapper[7642]: I0319 11:57:05.291172 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 11:57:05.291472 master-0 kubenswrapper[7642]: I0319 11:57:05.291295 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 11:57:05.373755 master-0 kubenswrapper[7642]: I0319 11:57:05.373712 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 11:57:05.393674 master-0 kubenswrapper[7642]: I0319 11:57:05.391405 7642 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="2f089fd1-0eb5-4420-a0e5-a7c84b6b82eb"
Mar 19 11:57:05.393674 master-0 kubenswrapper[7642]: I0319 11:57:05.392581 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 11:57:05.393674 master-0 kubenswrapper[7642]: I0319 11:57:05.392731 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 11:57:05.393674 master-0 kubenswrapper[7642]: I0319 11:57:05.392789 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 11:57:05.393674 master-0 kubenswrapper[7642]: I0319 11:57:05.392755 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") "
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 11:57:05.494972 master-0 kubenswrapper[7642]: I0319 11:57:05.494834 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") pod \"c83737980b9ee109184b1d78e942cf36\" (UID: \"c83737980b9ee109184b1d78e942cf36\") "
Mar 19 11:57:05.495336 master-0 kubenswrapper[7642]: I0319 11:57:05.494973 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") pod \"c83737980b9ee109184b1d78e942cf36\" (UID: \"c83737980b9ee109184b1d78e942cf36\") "
Mar 19 11:57:05.495336 master-0 kubenswrapper[7642]: I0319 11:57:05.495038 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets" (OuterVolumeSpecName: "secrets") pod "c83737980b9ee109184b1d78e942cf36" (UID: "c83737980b9ee109184b1d78e942cf36"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:57:05.495336 master-0 kubenswrapper[7642]: I0319 11:57:05.495253 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs" (OuterVolumeSpecName: "logs") pod "c83737980b9ee109184b1d78e942cf36" (UID: "c83737980b9ee109184b1d78e942cf36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:57:05.500893 master-0 kubenswrapper[7642]: I0319 11:57:05.500848 7642 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:05.500893 master-0 kubenswrapper[7642]: I0319 11:57:05.500887 7642 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c83737980b9ee109184b1d78e942cf36-secrets\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:05.794291 master-0 kubenswrapper[7642]: I0319 11:57:05.794087 7642 generic.go:334] "Generic (PLEG): container finished" podID="c0d8ad30-feb0-453f-b652-36aafc7b24f9" containerID="711f8121453bce45bc824e3d6883cc554f357db4612fdf30ce1a48bb144904ff" exitCode=0
Mar 19 11:57:05.794949 master-0 kubenswrapper[7642]: I0319 11:57:05.794389 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"c0d8ad30-feb0-453f-b652-36aafc7b24f9","Type":"ContainerDied","Data":"711f8121453bce45bc824e3d6883cc554f357db4612fdf30ce1a48bb144904ff"}
Mar 19 11:57:05.798445 master-0 kubenswrapper[7642]: I0319 11:57:05.798421 7642 generic.go:334] "Generic (PLEG): container finished" podID="c83737980b9ee109184b1d78e942cf36" containerID="9704f6068bbc3b7071c181549835a51a1db8ec178675b82fbcbeb90ed4c62c0b" exitCode=0
Mar 19 11:57:05.798714 master-0 kubenswrapper[7642]: I0319 11:57:05.798679 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 19 11:57:05.798813 master-0 kubenswrapper[7642]: I0319 11:57:05.798791 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb1ab82ad9ee6b742e302a5269a9952a1cd4daf4667abd8007cfbd42d95d64c7"
Mar 19 11:57:05.798813 master-0 kubenswrapper[7642]: I0319 11:57:05.798814 7642 scope.go:117] "RemoveContainer" containerID="79f33a58d544fe5d9dca133c52ef1b1ffeca07d3925205b6553e47537ece7b51"
Mar 19 11:57:06.083459 master-0 kubenswrapper[7642]: I0319 11:57:06.083306 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83737980b9ee109184b1d78e942cf36" path="/var/lib/kubelet/pods/c83737980b9ee109184b1d78e942cf36/volumes"
Mar 19 11:57:06.083752 master-0 kubenswrapper[7642]: I0319 11:57:06.083651 7642 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID=""
Mar 19 11:57:06.100954 master-0 kubenswrapper[7642]: I0319 11:57:06.100886 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 19 11:57:06.100954 master-0 kubenswrapper[7642]: I0319 11:57:06.100941 7642 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="2f089fd1-0eb5-4420-a0e5-a7c84b6b82eb"
Mar 19 11:57:06.104259 master-0 kubenswrapper[7642]: I0319 11:57:06.104217 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 19 11:57:06.104406 master-0 kubenswrapper[7642]: I0319 11:57:06.104386 7642 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="2f089fd1-0eb5-4420-a0e5-a7c84b6b82eb"
Mar 19 11:57:06.662563 master-0 kubenswrapper[7642]: I0319 11:57:06.662484 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vj6x4"]
Mar 19 11:57:06.663311 master-0 kubenswrapper[7642]: I0319 11:57:06.663010 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vj6x4" podUID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerName="registry-server" containerID="cri-o://ba302a2152520867d35d857dfa783d08a5a5e0bd8bec8b04277e2db3211d98c6" gracePeriod=2
Mar 19 11:57:06.812583 master-0 kubenswrapper[7642]: I0319 11:57:06.812497 7642 generic.go:334] "Generic (PLEG): container finished" podID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerID="ba302a2152520867d35d857dfa783d08a5a5e0bd8bec8b04277e2db3211d98c6" exitCode=0
Mar 19 11:57:06.813273 master-0 kubenswrapper[7642]: I0319 11:57:06.812803 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vj6x4" event={"ID":"aedae669-04fb-44e7-afac-03bd0bfae4e9","Type":"ContainerDied","Data":"ba302a2152520867d35d857dfa783d08a5a5e0bd8bec8b04277e2db3211d98c6"}
Mar 19 11:57:06.879095 master-0 kubenswrapper[7642]: I0319 11:57:06.868974 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cmnd7"]
Mar 19 11:57:06.879497 master-0 kubenswrapper[7642]: I0319 11:57:06.879445 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cmnd7" podUID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerName="registry-server" containerID="cri-o://d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168" gracePeriod=2
Mar 19 11:57:07.280664 master-0 kubenswrapper[7642]: I0319 11:57:07.280522 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Mar 19 11:57:07.297655 master-0 kubenswrapper[7642]: I0319 11:57:07.288797 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vj6x4"
Mar 19 11:57:07.357090 master-0 kubenswrapper[7642]: I0319 11:57:07.356948 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-utilities\") pod \"aedae669-04fb-44e7-afac-03bd0bfae4e9\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") "
Mar 19 11:57:07.357090 master-0 kubenswrapper[7642]: I0319 11:57:07.357028 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kube-api-access\") pod \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") "
Mar 19 11:57:07.357090 master-0 kubenswrapper[7642]: I0319 11:57:07.357055 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdbzg\" (UniqueName: \"kubernetes.io/projected/aedae669-04fb-44e7-afac-03bd0bfae4e9-kube-api-access-mdbzg\") pod \"aedae669-04fb-44e7-afac-03bd0bfae4e9\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") "
Mar 19 11:57:07.357090 master-0 kubenswrapper[7642]: I0319 11:57:07.357079 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kubelet-dir\") pod \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") "
Mar 19 11:57:07.357090 master-0 kubenswrapper[7642]: I0319 11:57:07.357103 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-var-lock\") pod \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\" (UID: \"c0d8ad30-feb0-453f-b652-36aafc7b24f9\") "
Mar 19 11:57:07.357675 master-0 kubenswrapper[7642]: I0319 11:57:07.357367 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-var-lock" (OuterVolumeSpecName: "var-lock") pod "c0d8ad30-feb0-453f-b652-36aafc7b24f9" (UID: "c0d8ad30-feb0-453f-b652-36aafc7b24f9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:57:07.358796 master-0 kubenswrapper[7642]: I0319 11:57:07.358051 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c0d8ad30-feb0-453f-b652-36aafc7b24f9" (UID: "c0d8ad30-feb0-453f-b652-36aafc7b24f9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:57:07.359617 master-0 kubenswrapper[7642]: I0319 11:57:07.359304 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-utilities" (OuterVolumeSpecName: "utilities") pod "aedae669-04fb-44e7-afac-03bd0bfae4e9" (UID: "aedae669-04fb-44e7-afac-03bd0bfae4e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 11:57:07.362393 master-0 kubenswrapper[7642]: I0319 11:57:07.361780 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aedae669-04fb-44e7-afac-03bd0bfae4e9-kube-api-access-mdbzg" (OuterVolumeSpecName: "kube-api-access-mdbzg") pod "aedae669-04fb-44e7-afac-03bd0bfae4e9" (UID: "aedae669-04fb-44e7-afac-03bd0bfae4e9"). InnerVolumeSpecName "kube-api-access-mdbzg".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:57:07.363808 master-0 kubenswrapper[7642]: I0319 11:57:07.363778 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c0d8ad30-feb0-453f-b652-36aafc7b24f9" (UID: "c0d8ad30-feb0-453f-b652-36aafc7b24f9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:57:07.401484 master-0 kubenswrapper[7642]: I0319 11:57:07.401450 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cmnd7" Mar 19 11:57:07.459252 master-0 kubenswrapper[7642]: I0319 11:57:07.459157 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-catalog-content\") pod \"aedae669-04fb-44e7-afac-03bd0bfae4e9\" (UID: \"aedae669-04fb-44e7-afac-03bd0bfae4e9\") " Mar 19 11:57:07.460158 master-0 kubenswrapper[7642]: I0319 11:57:07.460131 7642 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-utilities\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:07.460214 master-0 kubenswrapper[7642]: I0319 11:57:07.460163 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:07.460214 master-0 kubenswrapper[7642]: I0319 11:57:07.460182 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdbzg\" (UniqueName: \"kubernetes.io/projected/aedae669-04fb-44e7-afac-03bd0bfae4e9-kube-api-access-mdbzg\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:07.460214 master-0 
kubenswrapper[7642]: I0319 11:57:07.460197 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:07.460214 master-0 kubenswrapper[7642]: I0319 11:57:07.460210 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0d8ad30-feb0-453f-b652-36aafc7b24f9-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:07.524280 master-0 kubenswrapper[7642]: I0319 11:57:07.524211 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aedae669-04fb-44e7-afac-03bd0bfae4e9" (UID: "aedae669-04fb-44e7-afac-03bd0bfae4e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:57:07.561421 master-0 kubenswrapper[7642]: I0319 11:57:07.561363 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-utilities\") pod \"c5426417-d063-49e6-9fc9-f40ccf8296dc\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " Mar 19 11:57:07.561617 master-0 kubenswrapper[7642]: I0319 11:57:07.561593 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-catalog-content\") pod \"c5426417-d063-49e6-9fc9-f40ccf8296dc\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " Mar 19 11:57:07.561742 master-0 kubenswrapper[7642]: I0319 11:57:07.561717 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9597m\" (UniqueName: \"kubernetes.io/projected/c5426417-d063-49e6-9fc9-f40ccf8296dc-kube-api-access-9597m\") pod 
\"c5426417-d063-49e6-9fc9-f40ccf8296dc\" (UID: \"c5426417-d063-49e6-9fc9-f40ccf8296dc\") " Mar 19 11:57:07.561997 master-0 kubenswrapper[7642]: I0319 11:57:07.561971 7642 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aedae669-04fb-44e7-afac-03bd0bfae4e9-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:07.562447 master-0 kubenswrapper[7642]: I0319 11:57:07.562392 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-utilities" (OuterVolumeSpecName: "utilities") pod "c5426417-d063-49e6-9fc9-f40ccf8296dc" (UID: "c5426417-d063-49e6-9fc9-f40ccf8296dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:57:07.568606 master-0 kubenswrapper[7642]: I0319 11:57:07.568552 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5426417-d063-49e6-9fc9-f40ccf8296dc-kube-api-access-9597m" (OuterVolumeSpecName: "kube-api-access-9597m") pod "c5426417-d063-49e6-9fc9-f40ccf8296dc" (UID: "c5426417-d063-49e6-9fc9-f40ccf8296dc"). InnerVolumeSpecName "kube-api-access-9597m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:57:07.590134 master-0 kubenswrapper[7642]: I0319 11:57:07.588518 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5426417-d063-49e6-9fc9-f40ccf8296dc" (UID: "c5426417-d063-49e6-9fc9-f40ccf8296dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:57:07.663439 master-0 kubenswrapper[7642]: I0319 11:57:07.663364 7642 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:07.663439 master-0 kubenswrapper[7642]: I0319 11:57:07.663422 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9597m\" (UniqueName: \"kubernetes.io/projected/c5426417-d063-49e6-9fc9-f40ccf8296dc-kube-api-access-9597m\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:07.663439 master-0 kubenswrapper[7642]: I0319 11:57:07.663445 7642 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5426417-d063-49e6-9fc9-f40ccf8296dc-utilities\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:07.821540 master-0 kubenswrapper[7642]: I0319 11:57:07.821482 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"c0d8ad30-feb0-453f-b652-36aafc7b24f9","Type":"ContainerDied","Data":"ab9b7175a769b8cf6fa9915cafc7ebea7b9ffcfdeed5fdbb917a19708d7ff10e"} Mar 19 11:57:07.821540 master-0 kubenswrapper[7642]: I0319 11:57:07.821534 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9b7175a769b8cf6fa9915cafc7ebea7b9ffcfdeed5fdbb917a19708d7ff10e" Mar 19 11:57:07.822148 master-0 kubenswrapper[7642]: I0319 11:57:07.821850 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 19 11:57:07.824765 master-0 kubenswrapper[7642]: I0319 11:57:07.824718 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vj6x4" event={"ID":"aedae669-04fb-44e7-afac-03bd0bfae4e9","Type":"ContainerDied","Data":"df5f164a29ed9b0b229f66f5108c1194120dcaed538131f0457ba31bf5557108"} Mar 19 11:57:07.824829 master-0 kubenswrapper[7642]: I0319 11:57:07.824767 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vj6x4" Mar 19 11:57:07.824829 master-0 kubenswrapper[7642]: I0319 11:57:07.824789 7642 scope.go:117] "RemoveContainer" containerID="ba302a2152520867d35d857dfa783d08a5a5e0bd8bec8b04277e2db3211d98c6" Mar 19 11:57:07.828486 master-0 kubenswrapper[7642]: I0319 11:57:07.828440 7642 generic.go:334] "Generic (PLEG): container finished" podID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerID="d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168" exitCode=0 Mar 19 11:57:07.828554 master-0 kubenswrapper[7642]: I0319 11:57:07.828494 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmnd7" event={"ID":"c5426417-d063-49e6-9fc9-f40ccf8296dc","Type":"ContainerDied","Data":"d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168"} Mar 19 11:57:07.828554 master-0 kubenswrapper[7642]: I0319 11:57:07.828528 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cmnd7" event={"ID":"c5426417-d063-49e6-9fc9-f40ccf8296dc","Type":"ContainerDied","Data":"542bfeeda502ccab4e25ae324c702e09b8962ca93e8f8367750a8b97ef0ab952"} Mar 19 11:57:07.828554 master-0 kubenswrapper[7642]: I0319 11:57:07.828532 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cmnd7" Mar 19 11:57:07.843143 master-0 kubenswrapper[7642]: I0319 11:57:07.843116 7642 scope.go:117] "RemoveContainer" containerID="ceb2bdf6c11a6f05383bb7f4043b30c9b6ea152730f75477c91bd1a079edfb21" Mar 19 11:57:07.875137 master-0 kubenswrapper[7642]: I0319 11:57:07.875098 7642 scope.go:117] "RemoveContainer" containerID="f2fccfaa5c9ec30ed365e658f9b75ddf0baabf11bac6bda0ab6dfed5d569f4f6" Mar 19 11:57:07.882713 master-0 kubenswrapper[7642]: I0319 11:57:07.882669 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vj6x4"] Mar 19 11:57:07.888180 master-0 kubenswrapper[7642]: I0319 11:57:07.888141 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vj6x4"] Mar 19 11:57:07.909693 master-0 kubenswrapper[7642]: I0319 11:57:07.905954 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cmnd7"] Mar 19 11:57:07.909693 master-0 kubenswrapper[7642]: I0319 11:57:07.906560 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cmnd7"] Mar 19 11:57:07.909693 master-0 kubenswrapper[7642]: I0319 11:57:07.908378 7642 scope.go:117] "RemoveContainer" containerID="d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168" Mar 19 11:57:07.923768 master-0 kubenswrapper[7642]: I0319 11:57:07.923722 7642 scope.go:117] "RemoveContainer" containerID="4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c" Mar 19 11:57:07.947424 master-0 kubenswrapper[7642]: I0319 11:57:07.947331 7642 scope.go:117] "RemoveContainer" containerID="6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f" Mar 19 11:57:07.965296 master-0 kubenswrapper[7642]: I0319 11:57:07.965173 7642 scope.go:117] "RemoveContainer" containerID="d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168" Mar 19 11:57:07.967382 master-0 
kubenswrapper[7642]: E0319 11:57:07.966570 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168\": container with ID starting with d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168 not found: ID does not exist" containerID="d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168" Mar 19 11:57:07.967581 master-0 kubenswrapper[7642]: I0319 11:57:07.967382 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168"} err="failed to get container status \"d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168\": rpc error: code = NotFound desc = could not find container \"d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168\": container with ID starting with d4fcef24e331598b6586531b62321915611d9a5fa6454860d4586eb68d325168 not found: ID does not exist" Mar 19 11:57:07.967581 master-0 kubenswrapper[7642]: I0319 11:57:07.967428 7642 scope.go:117] "RemoveContainer" containerID="4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c" Mar 19 11:57:07.970756 master-0 kubenswrapper[7642]: E0319 11:57:07.970681 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c\": container with ID starting with 4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c not found: ID does not exist" containerID="4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c" Mar 19 11:57:07.970889 master-0 kubenswrapper[7642]: I0319 11:57:07.970765 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c"} err="failed to get container status 
\"4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c\": rpc error: code = NotFound desc = could not find container \"4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c\": container with ID starting with 4ae9f732611854a72f48be816433bc29f4a30e699052bdca3bbb642928d5945c not found: ID does not exist" Mar 19 11:57:07.970889 master-0 kubenswrapper[7642]: I0319 11:57:07.970808 7642 scope.go:117] "RemoveContainer" containerID="6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f" Mar 19 11:57:07.971449 master-0 kubenswrapper[7642]: E0319 11:57:07.971367 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f\": container with ID starting with 6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f not found: ID does not exist" containerID="6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f" Mar 19 11:57:07.971529 master-0 kubenswrapper[7642]: I0319 11:57:07.971476 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f"} err="failed to get container status \"6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f\": rpc error: code = NotFound desc = could not find container \"6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f\": container with ID starting with 6cf86f08f9d9089904c70716cb75faf402b41a2de3d87a43c0203428de52639f not found: ID does not exist" Mar 19 11:57:08.075468 master-0 kubenswrapper[7642]: I0319 11:57:08.075400 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aedae669-04fb-44e7-afac-03bd0bfae4e9" path="/var/lib/kubelet/pods/aedae669-04fb-44e7-afac-03bd0bfae4e9/volumes" Mar 19 11:57:08.076168 master-0 kubenswrapper[7642]: I0319 11:57:08.076142 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="c5426417-d063-49e6-9fc9-f40ccf8296dc" path="/var/lib/kubelet/pods/c5426417-d063-49e6-9fc9-f40ccf8296dc/volumes" Mar 19 11:57:14.245296 master-0 kubenswrapper[7642]: I0319 11:57:14.244316 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 11:57:14.248090 master-0 kubenswrapper[7642]: I0319 11:57:14.246068 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 11:57:14.878313 master-0 kubenswrapper[7642]: I0319 11:57:14.878167 7642 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75" exitCode=0 Mar 19 11:57:14.878313 master-0 kubenswrapper[7642]: I0319 11:57:14.878242 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerDied","Data":"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75"} Mar 19 11:57:14.878574 master-0 kubenswrapper[7642]: I0319 11:57:14.878322 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"5de845d57b7ab43259a85cce4b6bab69ddfd8124623507e8e42c97f024c04741"} Mar 19 11:57:15.912396 master-0 kubenswrapper[7642]: I0319 11:57:15.911724 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4"} Mar 19 11:57:15.912396 master-0 kubenswrapper[7642]: I0319 11:57:15.912335 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 11:57:15.912396 master-0 kubenswrapper[7642]: I0319 11:57:15.912377 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1"} Mar 19 11:57:15.912396 master-0 kubenswrapper[7642]: I0319 11:57:15.912393 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8413125cf444e5c95f023c5dd9c6151e","Type":"ContainerStarted","Data":"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836"} Mar 19 11:57:15.913172 master-0 kubenswrapper[7642]: I0319 11:57:15.912799 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tzckk"] Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: E0319 11:57:15.913414 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerName="extract-content" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: I0319 11:57:15.913450 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerName="extract-content" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: E0319 11:57:15.913467 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerName="registry-server" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: I0319 11:57:15.913475 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerName="registry-server" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: E0319 11:57:15.913493 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerName="extract-utilities" Mar 
19 11:57:15.913933 master-0 kubenswrapper[7642]: I0319 11:57:15.913502 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerName="extract-utilities" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: E0319 11:57:15.913520 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d8ad30-feb0-453f-b652-36aafc7b24f9" containerName="installer" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: I0319 11:57:15.913529 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d8ad30-feb0-453f-b652-36aafc7b24f9" containerName="installer" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: E0319 11:57:15.913546 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerName="registry-server" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: I0319 11:57:15.913554 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerName="registry-server" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: E0319 11:57:15.913577 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerName="extract-utilities" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: I0319 11:57:15.913588 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerName="extract-utilities" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: E0319 11:57:15.913608 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerName="extract-content" Mar 19 11:57:15.913933 master-0 kubenswrapper[7642]: I0319 11:57:15.913615 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerName="extract-content" Mar 19 11:57:15.916511 master-0 kubenswrapper[7642]: I0319 11:57:15.916427 7642 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c0d8ad30-feb0-453f-b652-36aafc7b24f9" containerName="installer" Mar 19 11:57:15.916511 master-0 kubenswrapper[7642]: I0319 11:57:15.916495 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5426417-d063-49e6-9fc9-f40ccf8296dc" containerName="registry-server" Mar 19 11:57:15.916678 master-0 kubenswrapper[7642]: I0319 11:57:15.916664 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="aedae669-04fb-44e7-afac-03bd0bfae4e9" containerName="registry-server" Mar 19 11:57:15.922608 master-0 kubenswrapper[7642]: I0319 11:57:15.922543 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6zxlv"] Mar 19 11:57:15.925713 master-0 kubenswrapper[7642]: I0319 11:57:15.923588 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tzckk" Mar 19 11:57:15.925713 master-0 kubenswrapper[7642]: I0319 11:57:15.923929 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zxlv" Mar 19 11:57:15.927205 master-0 kubenswrapper[7642]: I0319 11:57:15.926482 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tzckk"] Mar 19 11:57:15.929891 master-0 kubenswrapper[7642]: I0319 11:57:15.928370 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-7v2tf" Mar 19 11:57:15.929891 master-0 kubenswrapper[7642]: I0319 11:57:15.928674 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-bgrvv" Mar 19 11:57:15.929891 master-0 kubenswrapper[7642]: I0319 11:57:15.928827 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zxlv"] Mar 19 11:57:15.946359 master-0 kubenswrapper[7642]: I0319 11:57:15.946280 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=1.9462570380000002 podStartE2EDuration="1.946257038s" podCreationTimestamp="2026-03-19 11:57:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:15.945822475 +0000 UTC m=+194.063263262" watchObservedRunningTime="2026-03-19 11:57:15.946257038 +0000 UTC m=+194.063697825" Mar 19 11:57:16.078459 master-0 kubenswrapper[7642]: I0319 11:57:16.078377 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-catalog-content\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv" Mar 19 11:57:16.078754 master-0 kubenswrapper[7642]: I0319 11:57:16.078472 7642 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gczvq\" (UniqueName: \"kubernetes.io/projected/4d625e92-793e-4942-a826-d30d21e3ca0c-kube-api-access-gczvq\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv" Mar 19 11:57:16.078754 master-0 kubenswrapper[7642]: I0319 11:57:16.078511 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-catalog-content\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk" Mar 19 11:57:16.078754 master-0 kubenswrapper[7642]: I0319 11:57:16.078545 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-utilities\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk" Mar 19 11:57:16.078754 master-0 kubenswrapper[7642]: I0319 11:57:16.078573 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmbbm\" (UniqueName: \"kubernetes.io/projected/003e840c-1b07-4624-8225-e4ffdab74213-kube-api-access-hmbbm\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk" Mar 19 11:57:16.078754 master-0 kubenswrapper[7642]: I0319 11:57:16.078623 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-utilities\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " 
pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 11:57:16.179793 master-0 kubenswrapper[7642]: I0319 11:57:16.179513 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gczvq\" (UniqueName: \"kubernetes.io/projected/4d625e92-793e-4942-a826-d30d21e3ca0c-kube-api-access-gczvq\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 11:57:16.179793 master-0 kubenswrapper[7642]: I0319 11:57:16.179592 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-catalog-content\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk"
Mar 19 11:57:16.180224 master-0 kubenswrapper[7642]: I0319 11:57:16.179947 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-utilities\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk"
Mar 19 11:57:16.180224 master-0 kubenswrapper[7642]: I0319 11:57:16.180139 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmbbm\" (UniqueName: \"kubernetes.io/projected/003e840c-1b07-4624-8225-e4ffdab74213-kube-api-access-hmbbm\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk"
Mar 19 11:57:16.180224 master-0 kubenswrapper[7642]: I0319 11:57:16.180186 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-catalog-content\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk"
Mar 19 11:57:16.180529 master-0 kubenswrapper[7642]: I0319 11:57:16.180225 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-utilities\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 11:57:16.180529 master-0 kubenswrapper[7642]: I0319 11:57:16.180453 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-catalog-content\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 11:57:16.180785 master-0 kubenswrapper[7642]: I0319 11:57:16.180621 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-utilities\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 11:57:16.180891 master-0 kubenswrapper[7642]: I0319 11:57:16.180872 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-utilities\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk"
Mar 19 11:57:16.181685 master-0 kubenswrapper[7642]: I0319 11:57:16.181449 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-catalog-content\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 11:57:16.203855 master-0 kubenswrapper[7642]: I0319 11:57:16.203796 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gczvq\" (UniqueName: \"kubernetes.io/projected/4d625e92-793e-4942-a826-d30d21e3ca0c-kube-api-access-gczvq\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 11:57:16.209985 master-0 kubenswrapper[7642]: I0319 11:57:16.209940 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmbbm\" (UniqueName: \"kubernetes.io/projected/003e840c-1b07-4624-8225-e4ffdab74213-kube-api-access-hmbbm\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk"
Mar 19 11:57:16.248340 master-0 kubenswrapper[7642]: I0319 11:57:16.248083 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tzckk"
Mar 19 11:57:16.266346 master-0 kubenswrapper[7642]: I0319 11:57:16.266033 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 11:57:16.677616 master-0 kubenswrapper[7642]: I0319 11:57:16.677557 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tzckk"]
Mar 19 11:57:16.681262 master-0 kubenswrapper[7642]: W0319 11:57:16.681185 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003e840c_1b07_4624_8225_e4ffdab74213.slice/crio-cad025d780875a4ff5ddd5b35a2f470eaade918be2cd73b0ab7727feccc93241 WatchSource:0}: Error finding container cad025d780875a4ff5ddd5b35a2f470eaade918be2cd73b0ab7727feccc93241: Status 404 returned error can't find the container with id cad025d780875a4ff5ddd5b35a2f470eaade918be2cd73b0ab7727feccc93241
Mar 19 11:57:16.739151 master-0 kubenswrapper[7642]: I0319 11:57:16.739089 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zxlv"]
Mar 19 11:57:16.764095 master-0 kubenswrapper[7642]: I0319 11:57:16.763994 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Mar 19 11:57:16.764778 master-0 kubenswrapper[7642]: I0319 11:57:16.764748 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:16.770305 master-0 kubenswrapper[7642]: I0319 11:57:16.770268 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 19 11:57:16.772840 master-0 kubenswrapper[7642]: I0319 11:57:16.771548 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-w4vjc"
Mar 19 11:57:16.786678 master-0 kubenswrapper[7642]: I0319 11:57:16.782089 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Mar 19 11:57:16.788044 master-0 kubenswrapper[7642]: I0319 11:57:16.787997 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:16.788335 master-0 kubenswrapper[7642]: I0319 11:57:16.788081 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kube-api-access\") pod \"installer-3-master-0\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:16.788335 master-0 kubenswrapper[7642]: I0319 11:57:16.788119 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-var-lock\") pod \"installer-3-master-0\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:16.889217 master-0 kubenswrapper[7642]: I0319 11:57:16.889138 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kube-api-access\") pod \"installer-3-master-0\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:16.889551 master-0 kubenswrapper[7642]: I0319 11:57:16.889242 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-var-lock\") pod \"installer-3-master-0\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:16.889551 master-0 kubenswrapper[7642]: I0319 11:57:16.889347 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-var-lock\") pod \"installer-3-master-0\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:16.889551 master-0 kubenswrapper[7642]: I0319 11:57:16.889414 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:16.889551 master-0 kubenswrapper[7642]: I0319 11:57:16.889513 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:16.911472 master-0 kubenswrapper[7642]: I0319 11:57:16.911413 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kube-api-access\") pod \"installer-3-master-0\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:16.920098 master-0 kubenswrapper[7642]: I0319 11:57:16.920057 7642 generic.go:334] "Generic (PLEG): container finished" podID="4d625e92-793e-4942-a826-d30d21e3ca0c" containerID="67dee3634bf8afc5cf33505c6686845e34de0d09fbd040277b4cc0507631dc5b" exitCode=0
Mar 19 11:57:16.920660 master-0 kubenswrapper[7642]: I0319 11:57:16.920130 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zxlv" event={"ID":"4d625e92-793e-4942-a826-d30d21e3ca0c","Type":"ContainerDied","Data":"67dee3634bf8afc5cf33505c6686845e34de0d09fbd040277b4cc0507631dc5b"}
Mar 19 11:57:16.920660 master-0 kubenswrapper[7642]: I0319 11:57:16.920165 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zxlv" event={"ID":"4d625e92-793e-4942-a826-d30d21e3ca0c","Type":"ContainerStarted","Data":"fed06e56f643611f5a7965ed4f8766bb3618f49ec311e6c29d1c5641de5af6fe"}
Mar 19 11:57:16.923493 master-0 kubenswrapper[7642]: I0319 11:57:16.923437 7642 generic.go:334] "Generic (PLEG): container finished" podID="003e840c-1b07-4624-8225-e4ffdab74213" containerID="0d16f994259c86f779a870061760bfaa564ec639686f9c38a22c5ca129c5726f" exitCode=0
Mar 19 11:57:16.923598 master-0 kubenswrapper[7642]: I0319 11:57:16.923543 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzckk" event={"ID":"003e840c-1b07-4624-8225-e4ffdab74213","Type":"ContainerDied","Data":"0d16f994259c86f779a870061760bfaa564ec639686f9c38a22c5ca129c5726f"}
Mar 19 11:57:16.923677 master-0 kubenswrapper[7642]: I0319 11:57:16.923606 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzckk" event={"ID":"003e840c-1b07-4624-8225-e4ffdab74213","Type":"ContainerStarted","Data":"cad025d780875a4ff5ddd5b35a2f470eaade918be2cd73b0ab7727feccc93241"}
Mar 19 11:57:17.092514 master-0 kubenswrapper[7642]: I0319 11:57:17.092336 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:17.536439 master-0 kubenswrapper[7642]: I0319 11:57:17.536389 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Mar 19 11:57:17.541434 master-0 kubenswrapper[7642]: W0319 11:57:17.539741 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod18c0e2ac_b5c3_447d_96b4_37d7d0d733ec.slice/crio-23e0fe7ade4ac6399005059c86578e590b8718c5b7857c0295aaa5d158549561 WatchSource:0}: Error finding container 23e0fe7ade4ac6399005059c86578e590b8718c5b7857c0295aaa5d158549561: Status 404 returned error can't find the container with id 23e0fe7ade4ac6399005059c86578e590b8718c5b7857c0295aaa5d158549561
Mar 19 11:57:17.931777 master-0 kubenswrapper[7642]: I0319 11:57:17.931646 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzckk" event={"ID":"003e840c-1b07-4624-8225-e4ffdab74213","Type":"ContainerStarted","Data":"e2e764dd4d611abac99b99d586297942d72b2e3f1434058cd7e0a11729f21ba8"}
Mar 19 11:57:17.933724 master-0 kubenswrapper[7642]: I0319 11:57:17.932895 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec","Type":"ContainerStarted","Data":"882b540d45f96873ea08eb024547ab884cf7ba3eaac3163e6d0d3525a76f6dd5"}
Mar 19 11:57:17.933724 master-0 kubenswrapper[7642]: I0319 11:57:17.932919 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec","Type":"ContainerStarted","Data":"23e0fe7ade4ac6399005059c86578e590b8718c5b7857c0295aaa5d158549561"}
Mar 19 11:57:17.973008 master-0 kubenswrapper[7642]: I0319 11:57:17.972906 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=1.972877392 podStartE2EDuration="1.972877392s" podCreationTimestamp="2026-03-19 11:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:17.968957667 +0000 UTC m=+196.086398464" watchObservedRunningTime="2026-03-19 11:57:17.972877392 +0000 UTC m=+196.090318199"
Mar 19 11:57:18.945119 master-0 kubenswrapper[7642]: I0319 11:57:18.944729 7642 generic.go:334] "Generic (PLEG): container finished" podID="4d625e92-793e-4942-a826-d30d21e3ca0c" containerID="023b8459d54078f874c794a42aed9a6fd6fbb4ddfa6be97808a092ec4ffe3187" exitCode=0
Mar 19 11:57:18.945119 master-0 kubenswrapper[7642]: I0319 11:57:18.945087 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zxlv" event={"ID":"4d625e92-793e-4942-a826-d30d21e3ca0c","Type":"ContainerDied","Data":"023b8459d54078f874c794a42aed9a6fd6fbb4ddfa6be97808a092ec4ffe3187"}
Mar 19 11:57:18.979706 master-0 kubenswrapper[7642]: I0319 11:57:18.973942 7642 generic.go:334] "Generic (PLEG): container finished" podID="003e840c-1b07-4624-8225-e4ffdab74213" containerID="e2e764dd4d611abac99b99d586297942d72b2e3f1434058cd7e0a11729f21ba8" exitCode=0
Mar 19 11:57:18.979706 master-0 kubenswrapper[7642]: I0319 11:57:18.975523 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzckk" event={"ID":"003e840c-1b07-4624-8225-e4ffdab74213","Type":"ContainerDied","Data":"e2e764dd4d611abac99b99d586297942d72b2e3f1434058cd7e0a11729f21ba8"}
Mar 19 11:57:19.985350 master-0 kubenswrapper[7642]: I0319 11:57:19.985225 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzckk" event={"ID":"003e840c-1b07-4624-8225-e4ffdab74213","Type":"ContainerStarted","Data":"531e1af91168e7c40c25cd45e3b2a8c46d00c5c08a0f607a449f4726fdf3817b"}
Mar 19 11:57:19.988572 master-0 kubenswrapper[7642]: I0319 11:57:19.988535 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zxlv" event={"ID":"4d625e92-793e-4942-a826-d30d21e3ca0c","Type":"ContainerStarted","Data":"f7ff1ad801ba3c97ab9cdbd502c5a44b3579c7a5686bfae47cda73c6ece4216d"}
Mar 19 11:57:20.017878 master-0 kubenswrapper[7642]: I0319 11:57:20.017678 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tzckk" podStartSLOduration=3.519004591 podStartE2EDuration="6.017655149s" podCreationTimestamp="2026-03-19 11:57:14 +0000 UTC" firstStartedPulling="2026-03-19 11:57:16.925480315 +0000 UTC m=+195.042921102" lastFinishedPulling="2026-03-19 11:57:19.424130873 +0000 UTC m=+197.541571660" observedRunningTime="2026-03-19 11:57:20.015455985 +0000 UTC m=+198.132896782" watchObservedRunningTime="2026-03-19 11:57:20.017655149 +0000 UTC m=+198.135095956"
Mar 19 11:57:20.036014 master-0 kubenswrapper[7642]: I0319 11:57:20.035887 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6zxlv" podStartSLOduration=3.543170069 podStartE2EDuration="6.035868432s" podCreationTimestamp="2026-03-19 11:57:14 +0000 UTC" firstStartedPulling="2026-03-19 11:57:16.921931131 +0000 UTC m=+195.039371958" lastFinishedPulling="2026-03-19 11:57:19.414629534 +0000 UTC m=+197.532070321" observedRunningTime="2026-03-19 11:57:20.034118561 +0000 UTC m=+198.151559358" watchObservedRunningTime="2026-03-19 11:57:20.035868432 +0000 UTC m=+198.153309219"
Mar 19 11:57:20.375709 master-0 kubenswrapper[7642]: I0319 11:57:20.368729 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"]
Mar 19 11:57:20.375709 master-0 kubenswrapper[7642]: I0319 11:57:20.370107 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.375709 master-0 kubenswrapper[7642]: I0319 11:57:20.374929 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x"]
Mar 19 11:57:20.376698 master-0 kubenswrapper[7642]: I0319 11:57:20.376201 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x"
Mar 19 11:57:20.377179 master-0 kubenswrapper[7642]: I0319 11:57:20.377142 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp"]
Mar 19 11:57:20.378247 master-0 kubenswrapper[7642]: I0319 11:57:20.378226 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp"
Mar 19 11:57:20.383849 master-0 kubenswrapper[7642]: I0319 11:57:20.382434 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.383849 master-0 kubenswrapper[7642]: I0319 11:57:20.382515 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/f7efc158-8153-421f-b4a3-c9202212faee-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.383849 master-0 kubenswrapper[7642]: I0319 11:57:20.382554 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.383849 master-0 kubenswrapper[7642]: I0319 11:57:20.382583 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f7efc158-8153-421f-b4a3-c9202212faee-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.383849 master-0 kubenswrapper[7642]: I0319 11:57:20.382613 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5p9d\" (UniqueName: \"kubernetes.io/projected/f7efc158-8153-421f-b4a3-c9202212faee-kube-api-access-k5p9d\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.387295 master-0 kubenswrapper[7642]: I0319 11:57:20.387240 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-8zr85"
Mar 19 11:57:20.387571 master-0 kubenswrapper[7642]: I0319 11:57:20.387551 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 11:57:20.387760 master-0 kubenswrapper[7642]: I0319 11:57:20.387737 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 19 11:57:20.387920 master-0 kubenswrapper[7642]: I0319 11:57:20.387901 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 11:57:20.388058 master-0 kubenswrapper[7642]: I0319 11:57:20.388024 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 19 11:57:20.388204 master-0 kubenswrapper[7642]: I0319 11:57:20.388177 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 11:57:20.388322 master-0 kubenswrapper[7642]: I0319 11:57:20.388301 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 11:57:20.388417 master-0 kubenswrapper[7642]: I0319 11:57:20.388394 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 11:57:20.388515 master-0 kubenswrapper[7642]: I0319 11:57:20.388494 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 11:57:20.388642 master-0 kubenswrapper[7642]: I0319 11:57:20.388611 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-h5fcf"
Mar 19 11:57:20.388939 master-0 kubenswrapper[7642]: I0319 11:57:20.388916 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-qzhdr"
Mar 19 11:57:20.391144 master-0 kubenswrapper[7642]: I0319 11:57:20.389047 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 11:57:20.391144 master-0 kubenswrapper[7642]: I0319 11:57:20.389163 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 11:57:20.391144 master-0 kubenswrapper[7642]: I0319 11:57:20.389269 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 11:57:20.391656 master-0 kubenswrapper[7642]: I0319 11:57:20.391560 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 11:57:20.392071 master-0 kubenswrapper[7642]: I0319 11:57:20.392058 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 11:57:20.412109 master-0 kubenswrapper[7642]: I0319 11:57:20.412050 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp"]
Mar 19 11:57:20.470234 master-0 kubenswrapper[7642]: I0319 11:57:20.470148 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm"]
Mar 19 11:57:20.470952 master-0 kubenswrapper[7642]: I0319 11:57:20.470918 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm"
Mar 19 11:57:20.475398 master-0 kubenswrapper[7642]: I0319 11:57:20.475335 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-xsv5c"
Mar 19 11:57:20.475986 master-0 kubenswrapper[7642]: I0319 11:57:20.475957 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 19 11:57:20.484503 master-0 kubenswrapper[7642]: I0319 11:57:20.484416 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x"
Mar 19 11:57:20.484503 master-0 kubenswrapper[7642]: I0319 11:57:20.484479 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x"
Mar 19 11:57:20.484503 master-0 kubenswrapper[7642]: I0319 11:57:20.484506 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4khz\" (UniqueName: \"kubernetes.io/projected/a5d009a3-5ff2-49f9-9cea-004b99729b77-kube-api-access-z4khz\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm"
Mar 19 11:57:20.484977 master-0 kubenswrapper[7642]: I0319 11:57:20.484604 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.484977 master-0 kubenswrapper[7642]: I0319 11:57:20.484678 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d009a3-5ff2-49f9-9cea-004b99729b77-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm"
Mar 19 11:57:20.484977 master-0 kubenswrapper[7642]: I0319 11:57:20.484743 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/f7efc158-8153-421f-b4a3-c9202212faee-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.484977 master-0 kubenswrapper[7642]: I0319 11:57:20.484766 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk6fj\" (UniqueName: \"kubernetes.io/projected/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-kube-api-access-kk6fj\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x"
Mar 19 11:57:20.484977 master-0 kubenswrapper[7642]: I0319 11:57:20.484824 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.484977 master-0 kubenswrapper[7642]: I0319 11:57:20.484861 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a185e40-e97b-49e9-aea3-5f170401a9e3-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp"
Mar 19 11:57:20.484977 master-0 kubenswrapper[7642]: I0319 11:57:20.484884 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f7efc158-8153-421f-b4a3-c9202212faee-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.484977 master-0 kubenswrapper[7642]: I0319 11:57:20.484905 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v4xb\" (UniqueName: \"kubernetes.io/projected/3a185e40-e97b-49e9-aea3-5f170401a9e3-kube-api-access-2v4xb\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp"
Mar 19 11:57:20.484977 master-0 kubenswrapper[7642]: I0319 11:57:20.484947 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-config\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x"
Mar 19 11:57:20.484977 master-0 kubenswrapper[7642]: I0319 11:57:20.484968 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5p9d\" (UniqueName: \"kubernetes.io/projected/f7efc158-8153-421f-b4a3-c9202212faee-kube-api-access-k5p9d\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.487245 master-0 kubenswrapper[7642]: I0319 11:57:20.486046 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-images\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.487813 master-0 kubenswrapper[7642]: I0319 11:57:20.487729 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.488742 master-0 kubenswrapper[7642]: I0319 11:57:20.488708 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/f7efc158-8153-421f-b4a3-c9202212faee-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.488817 master-0 kubenswrapper[7642]: I0319 11:57:20.488792 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f7efc158-8153-421f-b4a3-c9202212faee-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.488852 master-0 kubenswrapper[7642]: I0319 11:57:20.488837 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5"]
Mar 19 11:57:20.491766 master-0 kubenswrapper[7642]: I0319 11:57:20.489767 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5"
Mar 19 11:57:20.495067 master-0 kubenswrapper[7642]: I0319 11:57:20.492173 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 19 11:57:20.495067 master-0 kubenswrapper[7642]: I0319 11:57:20.492356 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm"]
Mar 19 11:57:20.495067 master-0 kubenswrapper[7642]: I0319 11:57:20.492758 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 19 11:57:20.495301 master-0 kubenswrapper[7642]: I0319 11:57:20.495268 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 19 11:57:20.499660 master-0 kubenswrapper[7642]: I0319 11:57:20.495276 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-xm92n"
Mar 19 11:57:20.542666 master-0 kubenswrapper[7642]: I0319 11:57:20.530707 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 19 11:57:20.546229 master-0 kubenswrapper[7642]: I0319 11:57:20.544398 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5"]
Mar 19 11:57:20.557828 master-0 kubenswrapper[7642]: I0319 11:57:20.554306 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5p9d\" (UniqueName: \"kubernetes.io/projected/f7efc158-8153-421f-b4a3-c9202212faee-kube-api-access-k5p9d\") pod \"cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"
Mar 19 11:57:20.575090 master-0 kubenswrapper[7642]: I0319 11:57:20.575026 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t"]
Mar 19 11:57:20.576967 master-0 kubenswrapper[7642]: I0319 11:57:20.576908 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t"
Mar 19 11:57:20.581214 master-0 kubenswrapper[7642]: I0319 11:57:20.581177 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 11:57:20.581449 master-0 kubenswrapper[7642]: I0319 11:57:20.581425 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 11:57:20.581658 master-0 kubenswrapper[7642]: I0319 11:57:20.581618 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 11:57:20.581746 master-0 kubenswrapper[7642]: I0319 11:57:20.581628 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 11:57:20.586482 master-0 kubenswrapper[7642]: I0319 11:57:20.586263 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d009a3-5ff2-49f9-9cea-004b99729b77-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm"
Mar 19 11:57:20.586482 master-0 kubenswrapper[7642]: I0319 11:57:20.586323 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t"
Mar 19 11:57:20.586482 master-0 kubenswrapper[7642]: I0319 11:57:20.586369 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk6fj\" (UniqueName: \"kubernetes.io/projected/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-kube-api-access-kk6fj\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x"
Mar 19 11:57:20.586482 master-0 kubenswrapper[7642]: I0319 11:57:20.586399 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r97ml\" (UniqueName: \"kubernetes.io/projected/63e81c6a-b723-4c9a-b471-226fefa9c7a7-kube-api-access-r97ml\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5"
Mar 19 11:57:20.586482 master-0 kubenswrapper[7642]: I0319 11:57:20.586425 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsvpz\" (UniqueName: \"kubernetes.io/projected/eb274dd5-e98e-4485-a050-7bec560a0daa-kube-api-access-jsvpz\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t"
Mar 19 11:57:20.586482 master-0 kubenswrapper[7642]: I0319 11:57:20.586451 7642 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb274dd5-e98e-4485-a050-7bec560a0daa-proxy-tls\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.586482 master-0 kubenswrapper[7642]: I0319 11:57:20.586485 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 11:57:20.587013 master-0 kubenswrapper[7642]: I0319 11:57:20.586512 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a185e40-e97b-49e9-aea3-5f170401a9e3-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" Mar 19 11:57:20.587013 master-0 kubenswrapper[7642]: I0319 11:57:20.586546 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v4xb\" (UniqueName: \"kubernetes.io/projected/3a185e40-e97b-49e9-aea3-5f170401a9e3-kube-api-access-2v4xb\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" Mar 19 11:57:20.587013 master-0 kubenswrapper[7642]: I0319 11:57:20.586573 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 11:57:20.587013 master-0 kubenswrapper[7642]: I0319 11:57:20.586609 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-config\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" Mar 19 11:57:20.589621 master-0 kubenswrapper[7642]: I0319 11:57:20.589580 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-config\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" Mar 19 11:57:20.592030 master-0 kubenswrapper[7642]: I0319 11:57:20.591444 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d009a3-5ff2-49f9-9cea-004b99729b77-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" Mar 19 11:57:20.597296 master-0 kubenswrapper[7642]: I0319 11:57:20.597247 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " 
pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" Mar 19 11:57:20.597804 master-0 kubenswrapper[7642]: I0319 11:57:20.597752 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" Mar 19 11:57:20.597899 master-0 kubenswrapper[7642]: I0319 11:57:20.597881 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-images\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.597948 master-0 kubenswrapper[7642]: I0319 11:57:20.597936 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4khz\" (UniqueName: \"kubernetes.io/projected/a5d009a3-5ff2-49f9-9cea-004b99729b77-kube-api-access-z4khz\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" Mar 19 11:57:20.598285 master-0 kubenswrapper[7642]: I0319 11:57:20.598250 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-auth-proxy-config\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" Mar 19 11:57:20.609015 master-0 kubenswrapper[7642]: I0319 11:57:20.600294 7642 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a185e40-e97b-49e9-aea3-5f170401a9e3-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" Mar 19 11:57:20.609015 master-0 kubenswrapper[7642]: I0319 11:57:20.604409 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-2pvrw" Mar 19 11:57:20.609015 master-0 kubenswrapper[7642]: I0319 11:57:20.604648 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 11:57:20.609015 master-0 kubenswrapper[7642]: I0319 11:57:20.608752 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-machine-approver-tls\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" Mar 19 11:57:20.623669 master-0 kubenswrapper[7642]: I0319 11:57:20.618257 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx"] Mar 19 11:57:20.623669 master-0 kubenswrapper[7642]: I0319 11:57:20.619320 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:20.629740 master-0 kubenswrapper[7642]: I0319 11:57:20.626471 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-k64d2" Mar 19 11:57:20.629740 master-0 kubenswrapper[7642]: I0319 11:57:20.627052 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj"] Mar 19 11:57:20.629740 master-0 kubenswrapper[7642]: I0319 11:57:20.628207 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t"] Mar 19 11:57:20.629740 master-0 kubenswrapper[7642]: I0319 11:57:20.628324 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.633658 master-0 kubenswrapper[7642]: I0319 11:57:20.631493 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 19 11:57:20.633658 master-0 kubenswrapper[7642]: I0319 11:57:20.632756 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-jn2px"] Mar 19 11:57:20.633658 master-0 kubenswrapper[7642]: I0319 11:57:20.633127 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 19 11:57:20.634576 master-0 kubenswrapper[7642]: I0319 11:57:20.634131 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.634576 master-0 kubenswrapper[7642]: I0319 11:57:20.634487 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8"] Mar 19 11:57:20.635434 master-0 kubenswrapper[7642]: I0319 11:57:20.635398 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 11:57:20.646682 master-0 kubenswrapper[7642]: I0319 11:57:20.646439 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-gjszs" Mar 19 11:57:20.646787 master-0 kubenswrapper[7642]: I0319 11:57:20.646739 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 11:57:20.649898 master-0 kubenswrapper[7642]: I0319 11:57:20.646892 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 11:57:20.649898 master-0 kubenswrapper[7642]: I0319 11:57:20.647018 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 11:57:20.649898 master-0 kubenswrapper[7642]: I0319 11:57:20.647509 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 19 11:57:20.649898 master-0 kubenswrapper[7642]: I0319 11:57:20.647607 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-v6j8n" Mar 19 11:57:20.649898 master-0 kubenswrapper[7642]: I0319 11:57:20.647727 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 19 11:57:20.649898 master-0 kubenswrapper[7642]: I0319 11:57:20.647896 7642 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 11:57:20.649898 master-0 kubenswrapper[7642]: I0319 11:57:20.648047 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-tsgvs" Mar 19 11:57:20.649898 master-0 kubenswrapper[7642]: I0319 11:57:20.648143 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 19 11:57:20.649898 master-0 kubenswrapper[7642]: I0319 11:57:20.648233 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 11:57:20.662545 master-0 kubenswrapper[7642]: I0319 11:57:20.662490 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4khz\" (UniqueName: \"kubernetes.io/projected/a5d009a3-5ff2-49f9-9cea-004b99729b77-kube-api-access-z4khz\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" Mar 19 11:57:20.676445 master-0 kubenswrapper[7642]: I0319 11:57:20.676385 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-jn2px"] Mar 19 11:57:20.677198 master-0 kubenswrapper[7642]: I0319 11:57:20.677152 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v4xb\" (UniqueName: \"kubernetes.io/projected/3a185e40-e97b-49e9-aea3-5f170401a9e3-kube-api-access-2v4xb\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" Mar 19 11:57:20.677374 master-0 kubenswrapper[7642]: I0319 11:57:20.677321 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 19 11:57:20.682519 
master-0 kubenswrapper[7642]: I0319 11:57:20.682462 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj"] Mar 19 11:57:20.694301 master-0 kubenswrapper[7642]: I0319 11:57:20.689284 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk6fj\" (UniqueName: \"kubernetes.io/projected/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-kube-api-access-kk6fj\") pod \"machine-approver-6cb57bb5db-47c9x\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" Mar 19 11:57:20.694301 master-0 kubenswrapper[7642]: I0319 11:57:20.694052 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8"] Mar 19 11:57:20.697773 master-0 kubenswrapper[7642]: I0319 11:57:20.696327 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx"] Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700422 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rztqj\" (UniqueName: \"kubernetes.io/projected/0f2278c1-05cf-469c-a13c-762b7a790b38-kube-api-access-rztqj\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700505 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fbde1a69-b56c-467e-a291-9a83c448346f-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.701722 
master-0 kubenswrapper[7642]: I0319 11:57:20.700546 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2278c1-05cf-469c-a13c-762b7a790b38-cert\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700570 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f2278c1-05cf-469c-a13c-762b7a790b38-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700603 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-snapshots\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700709 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-images\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700798 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700830 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvp77\" (UniqueName: \"kubernetes.io/projected/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-kube-api-access-rvp77\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700876 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700909 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r97ml\" (UniqueName: \"kubernetes.io/projected/63e81c6a-b723-4c9a-b471-226fefa9c7a7-kube-api-access-r97ml\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700935 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.700968 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvpz\" (UniqueName: \"kubernetes.io/projected/eb274dd5-e98e-4485-a050-7bec560a0daa-kube-api-access-jsvpz\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.701000 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-images\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.701031 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-config\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.701086 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb274dd5-e98e-4485-a050-7bec560a0daa-proxy-tls\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " 
pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.701114 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-serving-cert\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.701143 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5f65\" (UniqueName: \"kubernetes.io/projected/fbde1a69-b56c-467e-a291-9a83c448346f-kube-api-access-j5f65\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.701180 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.701231 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.701279 7642 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 11:57:20.701722 master-0 kubenswrapper[7642]: I0319 11:57:20.701313 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9g5b\" (UniqueName: \"kubernetes.io/projected/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-kube-api-access-x9g5b\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.702669 master-0 kubenswrapper[7642]: I0319 11:57:20.702327 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-images\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.706734 master-0 kubenswrapper[7642]: I0319 11:57:20.703611 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.720886 master-0 kubenswrapper[7642]: I0319 11:57:20.708753 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cco-trusted-ca\") pod 
\"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 11:57:20.720886 master-0 kubenswrapper[7642]: I0319 11:57:20.710899 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8"] Mar 19 11:57:20.720886 master-0 kubenswrapper[7642]: I0319 11:57:20.712007 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 11:57:20.720886 master-0 kubenswrapper[7642]: I0319 11:57:20.715421 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb274dd5-e98e-4485-a050-7bec560a0daa-proxy-tls\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.720886 master-0 kubenswrapper[7642]: I0319 11:57:20.715600 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" Mar 19 11:57:20.720886 master-0 kubenswrapper[7642]: I0319 11:57:20.715682 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 11:57:20.720886 master-0 kubenswrapper[7642]: I0319 11:57:20.718525 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9ww6r" Mar 19 11:57:20.727551 master-0 kubenswrapper[7642]: I0319 11:57:20.726246 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8"] Mar 19 11:57:20.741929 master-0 kubenswrapper[7642]: I0319 11:57:20.741568 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r97ml\" (UniqueName: \"kubernetes.io/projected/63e81c6a-b723-4c9a-b471-226fefa9c7a7-kube-api-access-r97ml\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 11:57:20.767668 master-0 kubenswrapper[7642]: W0319 11:57:20.759122 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7efc158_8153_421f_b4a3_c9202212faee.slice/crio-f16fa2b1bd6447c738ceed72fb3f88e2811bff9396e5490d532024732e61132d WatchSource:0}: Error finding container f16fa2b1bd6447c738ceed72fb3f88e2811bff9396e5490d532024732e61132d: Status 404 returned error can't find the container with id 
f16fa2b1bd6447c738ceed72fb3f88e2811bff9396e5490d532024732e61132d Mar 19 11:57:20.776676 master-0 kubenswrapper[7642]: I0319 11:57:20.776455 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvpz\" (UniqueName: \"kubernetes.io/projected/eb274dd5-e98e-4485-a050-7bec560a0daa-kube-api-access-jsvpz\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.803791 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.803838 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.803878 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-images\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.803896 7642 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-config\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.803922 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-serving-cert\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.803942 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5f65\" (UniqueName: \"kubernetes.io/projected/fbde1a69-b56c-467e-a291-9a83c448346f-kube-api-access-j5f65\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.803969 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.803991 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9g5b\" (UniqueName: \"kubernetes.io/projected/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-kube-api-access-x9g5b\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " 
pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.804024 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rztqj\" (UniqueName: \"kubernetes.io/projected/0f2278c1-05cf-469c-a13c-762b7a790b38-kube-api-access-rztqj\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.804045 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fbde1a69-b56c-467e-a291-9a83c448346f-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.804484 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-snapshots\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.804527 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2278c1-05cf-469c-a13c-762b7a790b38-cert\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.804549 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0f2278c1-05cf-469c-a13c-762b7a790b38-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.804693 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvp77\" (UniqueName: \"kubernetes.io/projected/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-kube-api-access-rvp77\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.804721 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vhrb\" (UniqueName: \"kubernetes.io/projected/bcc25a4a-521a-4139-a10c-d5b8bba6e359-kube-api-access-2vhrb\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: \"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.804751 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcc25a4a-521a-4139-a10c-d5b8bba6e359-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: \"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.804900 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: 
\"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.806969 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f2278c1-05cf-469c-a13c-762b7a790b38-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.808509 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-snapshots\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.809356 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.809410 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2278c1-05cf-469c-a13c-762b7a790b38-cert\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:20.813668 master-0 kubenswrapper[7642]: I0319 11:57:20.809962 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-images\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.824664 master-0 kubenswrapper[7642]: I0319 11:57:20.814908 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-config\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:20.824664 master-0 kubenswrapper[7642]: I0319 11:57:20.819468 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 11:57:20.824664 master-0 kubenswrapper[7642]: I0319 11:57:20.819495 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-serving-cert\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.824664 master-0 kubenswrapper[7642]: I0319 11:57:20.821387 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fbde1a69-b56c-467e-a291-9a83c448346f-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 
11:57:20.834666 master-0 kubenswrapper[7642]: I0319 11:57:20.829119 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" Mar 19 11:57:20.850664 master-0 kubenswrapper[7642]: I0319 11:57:20.842617 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9g5b\" (UniqueName: \"kubernetes.io/projected/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-kube-api-access-x9g5b\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:20.850664 master-0 kubenswrapper[7642]: I0319 11:57:20.843089 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rztqj\" (UniqueName: \"kubernetes.io/projected/0f2278c1-05cf-469c-a13c-762b7a790b38-kube-api-access-rztqj\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:20.850664 master-0 kubenswrapper[7642]: I0319 11:57:20.843843 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvp77\" (UniqueName: \"kubernetes.io/projected/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-kube-api-access-rvp77\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 11:57:20.850664 master-0 kubenswrapper[7642]: I0319 11:57:20.846913 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5f65\" (UniqueName: \"kubernetes.io/projected/fbde1a69-b56c-467e-a291-9a83c448346f-kube-api-access-j5f65\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" 
Mar 19 11:57:20.871664 master-0 kubenswrapper[7642]: W0319 11:57:20.868023 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod920d5a32_1dd2_4d45_8319_1aca1ffd48b5.slice/crio-afe4f76bba4ac6cbbf404f5b90d9dfd338af36f437ee01db3628f61ee2387f72 WatchSource:0}: Error finding container afe4f76bba4ac6cbbf404f5b90d9dfd338af36f437ee01db3628f61ee2387f72: Status 404 returned error can't find the container with id afe4f76bba4ac6cbbf404f5b90d9dfd338af36f437ee01db3628f61ee2387f72 Mar 19 11:57:20.893995 master-0 kubenswrapper[7642]: I0319 11:57:20.887211 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" Mar 19 11:57:20.912664 master-0 kubenswrapper[7642]: I0319 11:57:20.910731 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vhrb\" (UniqueName: \"kubernetes.io/projected/bcc25a4a-521a-4139-a10c-d5b8bba6e359-kube-api-access-2vhrb\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: \"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 11:57:20.912664 master-0 kubenswrapper[7642]: I0319 11:57:20.910789 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcc25a4a-521a-4139-a10c-d5b8bba6e359-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: \"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 11:57:20.929661 master-0 kubenswrapper[7642]: I0319 11:57:20.927597 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcc25a4a-521a-4139-a10c-d5b8bba6e359-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: 
\"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 11:57:20.942019 master-0 kubenswrapper[7642]: I0319 11:57:20.936908 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" Mar 19 11:57:20.968952 master-0 kubenswrapper[7642]: I0319 11:57:20.962593 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vhrb\" (UniqueName: \"kubernetes.io/projected/bcc25a4a-521a-4139-a10c-d5b8bba6e359-kube-api-access-2vhrb\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: \"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 11:57:20.968952 master-0 kubenswrapper[7642]: I0319 11:57:20.963187 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 11:57:20.996795 master-0 kubenswrapper[7642]: I0319 11:57:20.996749 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 11:57:20.998108 master-0 kubenswrapper[7642]: I0319 11:57:20.998054 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" event={"ID":"f7efc158-8153-421f-b4a3-c9202212faee","Type":"ContainerStarted","Data":"f16fa2b1bd6447c738ceed72fb3f88e2811bff9396e5490d532024732e61132d"} Mar 19 11:57:21.022172 master-0 kubenswrapper[7642]: I0319 11:57:21.022061 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 11:57:21.032797 master-0 kubenswrapper[7642]: I0319 11:57:21.032740 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 11:57:21.048289 master-0 kubenswrapper[7642]: I0319 11:57:21.047544 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 11:57:21.048289 master-0 kubenswrapper[7642]: I0319 11:57:21.047850 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" event={"ID":"920d5a32-1dd2-4d45-8319-1aca1ffd48b5","Type":"ContainerStarted","Data":"afe4f76bba4ac6cbbf404f5b90d9dfd338af36f437ee01db3628f61ee2387f72"} Mar 19 11:57:21.074434 master-0 kubenswrapper[7642]: I0319 11:57:21.073932 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 11:57:21.106162 master-0 kubenswrapper[7642]: I0319 11:57:21.106113 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 11:57:21.748117 master-0 kubenswrapper[7642]: I0319 11:57:21.745194 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm"] Mar 19 11:57:21.784227 master-0 kubenswrapper[7642]: I0319 11:57:21.783784 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5"] Mar 19 11:57:21.910725 master-0 kubenswrapper[7642]: I0319 11:57:21.906857 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj"] Mar 19 11:57:21.911233 master-0 kubenswrapper[7642]: I0319 11:57:21.911178 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp"] Mar 19 11:57:21.920895 master-0 kubenswrapper[7642]: I0319 11:57:21.916104 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx"] Mar 19 11:57:21.934730 master-0 kubenswrapper[7642]: W0319 11:57:21.930162 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f2278c1_05cf_469c_a13c_762b7a790b38.slice/crio-a99d279c1652e2b6be126b985d6ebabfca8ecedc501010b1c82126f8611b9e4b WatchSource:0}: Error finding container a99d279c1652e2b6be126b985d6ebabfca8ecedc501010b1c82126f8611b9e4b: Status 404 returned error can't find the container with id a99d279c1652e2b6be126b985d6ebabfca8ecedc501010b1c82126f8611b9e4b Mar 19 11:57:22.069700 master-0 kubenswrapper[7642]: I0319 11:57:22.068453 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8"] Mar 19 11:57:22.101800 master-0 kubenswrapper[7642]: W0319 11:57:22.099882 7642 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd52981e0_0bc4_4dfa_81d5_aeab63fa23e0.slice/crio-a938fec278d403193e581844106b17006b55efc9ca616fe3e7d8c448d817c442 WatchSource:0}: Error finding container a938fec278d403193e581844106b17006b55efc9ca616fe3e7d8c448d817c442: Status 404 returned error can't find the container with id a938fec278d403193e581844106b17006b55efc9ca616fe3e7d8c448d817c442 Mar 19 11:57:22.113954 master-0 kubenswrapper[7642]: W0319 11:57:22.113901 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc25a4a_521a_4139_a10c_d5b8bba6e359.slice/crio-7ccb1bded92273b709b711951d032cf4685c9f074ee7f4072031bba69f2eeb3f WatchSource:0}: Error finding container 7ccb1bded92273b709b711951d032cf4685c9f074ee7f4072031bba69f2eeb3f: Status 404 returned error can't find the container with id 7ccb1bded92273b709b711951d032cf4685c9f074ee7f4072031bba69f2eeb3f Mar 19 11:57:22.141620 master-0 kubenswrapper[7642]: I0319 11:57:22.137578 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" event={"ID":"fbde1a69-b56c-467e-a291-9a83c448346f","Type":"ContainerStarted","Data":"0c58fd5ea7dfc487f516ab9f02025ffdb09d90a9ab5d62ff3f2b99a062549935"} Mar 19 11:57:22.141620 master-0 kubenswrapper[7642]: I0319 11:57:22.137683 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" event={"ID":"0f2278c1-05cf-469c-a13c-762b7a790b38","Type":"ContainerStarted","Data":"a99d279c1652e2b6be126b985d6ebabfca8ecedc501010b1c82126f8611b9e4b"} Mar 19 11:57:22.141620 master-0 kubenswrapper[7642]: I0319 11:57:22.137703 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" 
event={"ID":"a5d009a3-5ff2-49f9-9cea-004b99729b77","Type":"ContainerStarted","Data":"414a48d3cf2977fd972a56aeae2697291bc4403c105ce776bfadef125a18facb"} Mar 19 11:57:22.141620 master-0 kubenswrapper[7642]: I0319 11:57:22.137714 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" event={"ID":"63e81c6a-b723-4c9a-b471-226fefa9c7a7","Type":"ContainerStarted","Data":"2ccf8a3de86952995f531b01f4ffdc152c7bc1633eadc15477217fefaa028047"} Mar 19 11:57:22.141620 master-0 kubenswrapper[7642]: I0319 11:57:22.137726 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-68bf6ff9d6-jn2px"] Mar 19 11:57:22.141620 master-0 kubenswrapper[7642]: I0319 11:57:22.137744 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8"] Mar 19 11:57:22.141620 master-0 kubenswrapper[7642]: I0319 11:57:22.137756 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t"] Mar 19 11:57:22.161588 master-0 kubenswrapper[7642]: I0319 11:57:22.157884 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" event={"ID":"920d5a32-1dd2-4d45-8319-1aca1ffd48b5","Type":"ContainerStarted","Data":"092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce"} Mar 19 11:57:23.159070 master-0 kubenswrapper[7642]: I0319 11:57:23.159000 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" event={"ID":"0f2278c1-05cf-469c-a13c-762b7a790b38","Type":"ContainerStarted","Data":"553238f28fef1cda6c72bfa4bf9821b26a728290573ff1064951e4a70d55ddbf"} Mar 19 11:57:23.363465 master-0 kubenswrapper[7642]: I0319 11:57:23.160065 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" event={"ID":"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf","Type":"ContainerStarted","Data":"f233927f2993af718960afb2ae363b672383e871d2174bdf0f0ce3e81e24474f"} Mar 19 11:57:23.363465 master-0 kubenswrapper[7642]: I0319 11:57:23.160990 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" event={"ID":"3a185e40-e97b-49e9-aea3-5f170401a9e3","Type":"ContainerStarted","Data":"d34c10b1c43979eaba12a772c86e5b04cdb550f06f5159184d7c1b781b1802f8"} Mar 19 11:57:23.363465 master-0 kubenswrapper[7642]: I0319 11:57:23.162198 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" event={"ID":"63e81c6a-b723-4c9a-b471-226fefa9c7a7","Type":"ContainerStarted","Data":"df5e3a3b450407477b1d624e52a1245a526ec5f6a1d53e96d8498482a8ee4d97"} Mar 19 11:57:23.363465 master-0 kubenswrapper[7642]: I0319 11:57:23.163800 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" event={"ID":"bcc25a4a-521a-4139-a10c-d5b8bba6e359","Type":"ContainerStarted","Data":"0bd48d8d40465277e592ebfacd5396ad3f4989dcc6911b25f2efff378d12a261"} Mar 19 11:57:23.363465 master-0 kubenswrapper[7642]: I0319 11:57:23.163823 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" event={"ID":"bcc25a4a-521a-4139-a10c-d5b8bba6e359","Type":"ContainerStarted","Data":"7ccb1bded92273b709b711951d032cf4685c9f074ee7f4072031bba69f2eeb3f"} Mar 19 11:57:23.363465 master-0 kubenswrapper[7642]: I0319 11:57:23.165311 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" 
event={"ID":"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0","Type":"ContainerStarted","Data":"a938fec278d403193e581844106b17006b55efc9ca616fe3e7d8c448d817c442"} Mar 19 11:57:23.363465 master-0 kubenswrapper[7642]: I0319 11:57:23.167127 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" event={"ID":"fbde1a69-b56c-467e-a291-9a83c448346f","Type":"ContainerStarted","Data":"d9f2dee995f42cd5f80f8d3478e2f0c9d499af83bd740001cdc2fee3f2f3dbde"} Mar 19 11:57:23.363465 master-0 kubenswrapper[7642]: I0319 11:57:23.169153 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" event={"ID":"eb274dd5-e98e-4485-a050-7bec560a0daa","Type":"ContainerStarted","Data":"145e950b874609dccdd455c8b407e894b0b805b780d645f8b5cd83d4f2e5ddb0"} Mar 19 11:57:23.363465 master-0 kubenswrapper[7642]: I0319 11:57:23.169173 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" event={"ID":"eb274dd5-e98e-4485-a050-7bec560a0daa","Type":"ContainerStarted","Data":"ecee444cb43be9a1ddd0e2f81e61eca30a1f71fa25000dc8d0f2c58cd47a25c6"} Mar 19 11:57:23.363465 master-0 kubenswrapper[7642]: I0319 11:57:23.169182 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" event={"ID":"eb274dd5-e98e-4485-a050-7bec560a0daa","Type":"ContainerStarted","Data":"854a387c5ef79581e793cea9af478d1025e855bc5db72359436a67d2aec956ce"} Mar 19 11:57:23.626839 master-0 kubenswrapper[7642]: I0319 11:57:23.626699 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" podStartSLOduration=3.626657098 podStartE2EDuration="3.626657098s" podCreationTimestamp="2026-03-19 11:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:23.558009496 +0000 UTC m=+201.675450293" watchObservedRunningTime="2026-03-19 11:57:23.626657098 +0000 UTC m=+201.744097885" Mar 19 11:57:24.051758 master-0 kubenswrapper[7642]: I0319 11:57:24.047663 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg"] Mar 19 11:57:24.051758 master-0 kubenswrapper[7642]: I0319 11:57:24.048626 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.052161 master-0 kubenswrapper[7642]: I0319 11:57:24.052066 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-zgw2m" Mar 19 11:57:24.053022 master-0 kubenswrapper[7642]: I0319 11:57:24.052426 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 11:57:24.120081 master-0 kubenswrapper[7642]: I0319 11:57:24.118721 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg"] Mar 19 11:57:24.125074 master-0 kubenswrapper[7642]: I0319 11:57:24.125021 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-apiservice-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.125135 master-0 kubenswrapper[7642]: I0319 11:57:24.125105 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-tmpfs\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.125467 master-0 kubenswrapper[7642]: I0319 11:57:24.125405 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-webhook-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.125512 master-0 kubenswrapper[7642]: I0319 11:57:24.125483 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8765q\" (UniqueName: \"kubernetes.io/projected/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-kube-api-access-8765q\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.192469 master-0 kubenswrapper[7642]: I0319 11:57:24.192395 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" event={"ID":"bcc25a4a-521a-4139-a10c-d5b8bba6e359","Type":"ContainerStarted","Data":"59e44e1482a20a989757eaeac24f590e16924fc1f22b5b1d84ceef232530f2b0"} Mar 19 11:57:24.228458 master-0 kubenswrapper[7642]: I0319 11:57:24.228337 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-tmpfs\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.228458 master-0 kubenswrapper[7642]: I0319 11:57:24.228476 7642 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-webhook-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.228458 master-0 kubenswrapper[7642]: I0319 11:57:24.228511 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8765q\" (UniqueName: \"kubernetes.io/projected/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-kube-api-access-8765q\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.228807 master-0 kubenswrapper[7642]: I0319 11:57:24.228601 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-apiservice-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.229541 master-0 kubenswrapper[7642]: I0319 11:57:24.229481 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-tmpfs\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.231283 master-0 kubenswrapper[7642]: I0319 11:57:24.229800 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" podStartSLOduration=4.229781686 podStartE2EDuration="4.229781686s" podCreationTimestamp="2026-03-19 11:57:20 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:24.212591742 +0000 UTC m=+202.330032529" watchObservedRunningTime="2026-03-19 11:57:24.229781686 +0000 UTC m=+202.347222473" Mar 19 11:57:24.237249 master-0 kubenswrapper[7642]: I0319 11:57:24.237206 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-apiservice-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.239553 master-0 kubenswrapper[7642]: I0319 11:57:24.239523 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-webhook-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.245394 master-0 kubenswrapper[7642]: I0319 11:57:24.245369 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8765q\" (UniqueName: \"kubernetes.io/projected/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-kube-api-access-8765q\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:24.255752 master-0 kubenswrapper[7642]: I0319 11:57:24.255502 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"] Mar 19 11:57:24.256214 master-0 kubenswrapper[7642]: I0319 11:57:24.256010 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" podUID="5ae2d717-506b-4d8e-803b-689c1cc3557c" 
containerName="multus-admission-controller" containerID="cri-o://7df327773d76aaacf805979906c8e4aa2f22ce6b1887fb5d42df0fd26e17a8b7" gracePeriod=30 Mar 19 11:57:24.256592 master-0 kubenswrapper[7642]: I0319 11:57:24.256462 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" podUID="5ae2d717-506b-4d8e-803b-689c1cc3557c" containerName="kube-rbac-proxy" containerID="cri-o://a6e72fa780304dc2b33dbdfd17546cbc3d37632700728a439fd482435edf60d1" gracePeriod=30 Mar 19 11:57:24.388719 master-0 kubenswrapper[7642]: I0319 11:57:24.381830 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:25.110251 master-0 kubenswrapper[7642]: I0319 11:57:25.108201 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg"] Mar 19 11:57:25.110251 master-0 kubenswrapper[7642]: W0319 11:57:25.108883 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb69a314d_4884_4f2c_a48e_1cf68c5aaef4.slice/crio-107d14368724098585e35bbcdeba06ce6763b263ff68a1c0db81b1b11f0c59b6 WatchSource:0}: Error finding container 107d14368724098585e35bbcdeba06ce6763b263ff68a1c0db81b1b11f0c59b6: Status 404 returned error can't find the container with id 107d14368724098585e35bbcdeba06ce6763b263ff68a1c0db81b1b11f0c59b6 Mar 19 11:57:25.209325 master-0 kubenswrapper[7642]: I0319 11:57:25.209253 7642 generic.go:334] "Generic (PLEG): container finished" podID="5ae2d717-506b-4d8e-803b-689c1cc3557c" containerID="a6e72fa780304dc2b33dbdfd17546cbc3d37632700728a439fd482435edf60d1" exitCode=0 Mar 19 11:57:25.210079 master-0 kubenswrapper[7642]: I0319 11:57:25.209362 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" 
event={"ID":"5ae2d717-506b-4d8e-803b-689c1cc3557c","Type":"ContainerDied","Data":"a6e72fa780304dc2b33dbdfd17546cbc3d37632700728a439fd482435edf60d1"} Mar 19 11:57:25.217234 master-0 kubenswrapper[7642]: I0319 11:57:25.217163 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" event={"ID":"b69a314d-4884-4f2c-a48e-1cf68c5aaef4","Type":"ContainerStarted","Data":"107d14368724098585e35bbcdeba06ce6763b263ff68a1c0db81b1b11f0c59b6"} Mar 19 11:57:26.236326 master-0 kubenswrapper[7642]: I0319 11:57:26.235607 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" event={"ID":"b69a314d-4884-4f2c-a48e-1cf68c5aaef4","Type":"ContainerStarted","Data":"2ad7b4f8961ebed8d5885645075f44c80d0763532cb9f34aad1f7b43a34d146f"} Mar 19 11:57:26.236326 master-0 kubenswrapper[7642]: I0319 11:57:26.236240 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:26.250099 master-0 kubenswrapper[7642]: I0319 11:57:26.249645 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tzckk" Mar 19 11:57:26.250099 master-0 kubenswrapper[7642]: I0319 11:57:26.249707 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tzckk" Mar 19 11:57:26.267653 master-0 kubenswrapper[7642]: I0319 11:57:26.266564 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6zxlv" Mar 19 11:57:26.267653 master-0 kubenswrapper[7642]: I0319 11:57:26.266587 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" podStartSLOduration=2.2665656690000002 podStartE2EDuration="2.266565669s" 
podCreationTimestamp="2026-03-19 11:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:26.265157248 +0000 UTC m=+204.382598035" watchObservedRunningTime="2026-03-19 11:57:26.266565669 +0000 UTC m=+204.384006456" Mar 19 11:57:26.267653 master-0 kubenswrapper[7642]: I0319 11:57:26.266629 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6zxlv" Mar 19 11:57:26.300283 master-0 kubenswrapper[7642]: I0319 11:57:26.300209 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 11:57:26.322813 master-0 kubenswrapper[7642]: I0319 11:57:26.321102 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tzckk" Mar 19 11:57:26.375299 master-0 kubenswrapper[7642]: I0319 11:57:26.372833 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6zxlv" Mar 19 11:57:26.392466 master-0 kubenswrapper[7642]: I0319 11:57:26.391606 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-tmgtz"] Mar 19 11:57:26.399877 master-0 kubenswrapper[7642]: I0319 11:57:26.399222 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.401462 master-0 kubenswrapper[7642]: I0319 11:57:26.401120 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-pshw6" Mar 19 11:57:26.404039 master-0 kubenswrapper[7642]: I0319 11:57:26.402037 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 11:57:26.485196 master-0 kubenswrapper[7642]: I0319 11:57:26.485121 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-proxy-tls\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.485196 master-0 kubenswrapper[7642]: I0319 11:57:26.485194 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-mcd-auth-proxy-config\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.485509 master-0 kubenswrapper[7642]: I0319 11:57:26.485218 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn94k\" (UniqueName: \"kubernetes.io/projected/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-kube-api-access-rn94k\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.485509 master-0 kubenswrapper[7642]: I0319 11:57:26.485254 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-rootfs\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.589430 master-0 kubenswrapper[7642]: I0319 11:57:26.588422 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-proxy-tls\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.589430 master-0 kubenswrapper[7642]: I0319 11:57:26.588511 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-mcd-auth-proxy-config\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.589430 master-0 kubenswrapper[7642]: I0319 11:57:26.588547 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn94k\" (UniqueName: \"kubernetes.io/projected/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-kube-api-access-rn94k\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.589430 master-0 kubenswrapper[7642]: I0319 11:57:26.588598 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-rootfs\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.589430 master-0 
kubenswrapper[7642]: I0319 11:57:26.588722 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-rootfs\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.592824 master-0 kubenswrapper[7642]: I0319 11:57:26.592701 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-mcd-auth-proxy-config\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.595185 master-0 kubenswrapper[7642]: I0319 11:57:26.595091 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-proxy-tls\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.638102 master-0 kubenswrapper[7642]: I0319 11:57:26.637964 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn94k\" (UniqueName: \"kubernetes.io/projected/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-kube-api-access-rn94k\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:26.734587 master-0 kubenswrapper[7642]: I0319 11:57:26.734509 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 11:57:27.306295 master-0 kubenswrapper[7642]: I0319 11:57:27.304692 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tzckk" Mar 19 11:57:27.329980 master-0 kubenswrapper[7642]: I0319 11:57:27.329897 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6zxlv" Mar 19 11:57:38.553879 master-0 kubenswrapper[7642]: I0319 11:57:38.553793 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x"] Mar 19 11:57:46.090095 master-0 kubenswrapper[7642]: I0319 11:57:46.090024 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"] Mar 19 11:57:46.351342 master-0 kubenswrapper[7642]: W0319 11:57:46.351279 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f356b2d_6d9e_4ec5_bab5_9bb48322a6c0.slice/crio-0dda2db92925393742e73de4713061d6f57b6007ca6270c56cb0bd0c372a97ea WatchSource:0}: Error finding container 0dda2db92925393742e73de4713061d6f57b6007ca6270c56cb0bd0c372a97ea: Status 404 returned error can't find the container with id 0dda2db92925393742e73de4713061d6f57b6007ca6270c56cb0bd0c372a97ea Mar 19 11:57:46.412727 master-0 kubenswrapper[7642]: I0319 11:57:46.412670 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" event={"ID":"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0","Type":"ContainerStarted","Data":"0dda2db92925393742e73de4713061d6f57b6007ca6270c56cb0bd0c372a97ea"} Mar 19 11:57:47.475501 master-0 kubenswrapper[7642]: I0319 11:57:47.473166 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" event={"ID":"fbde1a69-b56c-467e-a291-9a83c448346f","Type":"ContainerStarted","Data":"5e1c65e27dc3f0a6a757a9d44783d533d108753111fcd05bc2b04cd4929ec893"} Mar 19 11:57:47.479991 master-0 kubenswrapper[7642]: I0319 11:57:47.477379 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" event={"ID":"a5d009a3-5ff2-49f9-9cea-004b99729b77","Type":"ContainerStarted","Data":"7181b8b01999fa52cdbfd11c75a2b16463e9e25a0cd9e66237c938d0d2ec42d5"} Mar 19 11:57:47.483164 master-0 kubenswrapper[7642]: I0319 11:57:47.480421 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" event={"ID":"3a185e40-e97b-49e9-aea3-5f170401a9e3","Type":"ContainerStarted","Data":"0e2ded69012d6d01b6db55fbbd61785c43491f5fa298adf54541882050773b91"} Mar 19 11:57:47.483164 master-0 kubenswrapper[7642]: I0319 11:57:47.480575 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" event={"ID":"3a185e40-e97b-49e9-aea3-5f170401a9e3","Type":"ContainerStarted","Data":"bdda154b1c9b78e1ad36a436d43e4b7ab0a54ce3c77530fd643cb895217936f1"} Mar 19 11:57:47.487666 master-0 kubenswrapper[7642]: I0319 11:57:47.484256 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" event={"ID":"63e81c6a-b723-4c9a-b471-226fefa9c7a7","Type":"ContainerStarted","Data":"27a59a5a4197676669c34f3133e3dc3f21912e88cbaa5a4cb220db22fdb857c5"} Mar 19 11:57:47.489391 master-0 kubenswrapper[7642]: I0319 11:57:47.489328 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" 
event={"ID":"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0","Type":"ContainerStarted","Data":"ca4a26017dcd0ac12df116562f9aac9730227426618d755dab28005b1e97a4e7"} Mar 19 11:57:47.489482 master-0 kubenswrapper[7642]: I0319 11:57:47.489398 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" event={"ID":"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0","Type":"ContainerStarted","Data":"2d1796d1dcba5c87f0e3bf9e86d786a2f91769474b390114a0045bbf4895110f"} Mar 19 11:57:47.491043 master-0 kubenswrapper[7642]: I0319 11:57:47.490982 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" event={"ID":"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0","Type":"ContainerStarted","Data":"064b176871fcbc879687ac7e9904eb1df9dbafb853b4d8ce41a9bbb9767a5a2c"} Mar 19 11:57:47.497743 master-0 kubenswrapper[7642]: I0319 11:57:47.497619 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" event={"ID":"f7efc158-8153-421f-b4a3-c9202212faee","Type":"ContainerStarted","Data":"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a"} Mar 19 11:57:47.497743 master-0 kubenswrapper[7642]: I0319 11:57:47.497751 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" event={"ID":"f7efc158-8153-421f-b4a3-c9202212faee","Type":"ContainerStarted","Data":"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b"} Mar 19 11:57:47.513268 master-0 kubenswrapper[7642]: I0319 11:57:47.512062 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" event={"ID":"0f2278c1-05cf-469c-a13c-762b7a790b38","Type":"ContainerStarted","Data":"4cbcc11104b1dd8e7e5c1754bbcb53f46a0043fca70876d066f039a1531022a4"} 
Mar 19 11:57:47.524669 master-0 kubenswrapper[7642]: I0319 11:57:47.521182 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" event={"ID":"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf","Type":"ContainerStarted","Data":"b7626e54b5b928b45b3e5c53bd0e84b9d119a3c6eb3a52b158926e0cf76cc9ef"} Mar 19 11:57:47.524669 master-0 kubenswrapper[7642]: I0319 11:57:47.523718 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" event={"ID":"920d5a32-1dd2-4d45-8319-1aca1ffd48b5","Type":"ContainerStarted","Data":"e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1"} Mar 19 11:57:47.524669 master-0 kubenswrapper[7642]: I0319 11:57:47.523840 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" podUID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" containerName="kube-rbac-proxy" containerID="cri-o://092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce" gracePeriod=30 Mar 19 11:57:47.524669 master-0 kubenswrapper[7642]: I0319 11:57:47.524011 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" podUID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" containerName="machine-approver-controller" containerID="cri-o://e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1" gracePeriod=30 Mar 19 11:57:47.529661 master-0 kubenswrapper[7642]: I0319 11:57:47.527099 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" podStartSLOduration=3.241230376 podStartE2EDuration="27.52705562s" podCreationTimestamp="2026-03-19 11:57:20 +0000 UTC" firstStartedPulling="2026-03-19 11:57:22.200276158 +0000 UTC m=+200.317716945" lastFinishedPulling="2026-03-19 11:57:46.486101402 +0000 UTC 
m=+224.603542189" observedRunningTime="2026-03-19 11:57:47.513281796 +0000 UTC m=+225.630722603" watchObservedRunningTime="2026-03-19 11:57:47.52705562 +0000 UTC m=+225.644496407" Mar 19 11:57:47.560778 master-0 kubenswrapper[7642]: I0319 11:57:47.555396 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" podStartSLOduration=3.293249431 podStartE2EDuration="27.55536557s" podCreationTimestamp="2026-03-19 11:57:20 +0000 UTC" firstStartedPulling="2026-03-19 11:57:22.139482375 +0000 UTC m=+200.256923162" lastFinishedPulling="2026-03-19 11:57:46.401598514 +0000 UTC m=+224.519039301" observedRunningTime="2026-03-19 11:57:47.551507827 +0000 UTC m=+225.668948614" watchObservedRunningTime="2026-03-19 11:57:47.55536557 +0000 UTC m=+225.672806357" Mar 19 11:57:47.598897 master-0 kubenswrapper[7642]: I0319 11:57:47.592994 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" podStartSLOduration=3.380144289 podStartE2EDuration="27.592973273s" podCreationTimestamp="2026-03-19 11:57:20 +0000 UTC" firstStartedPulling="2026-03-19 11:57:22.192818949 +0000 UTC m=+200.310259736" lastFinishedPulling="2026-03-19 11:57:46.405647933 +0000 UTC m=+224.523088720" observedRunningTime="2026-03-19 11:57:47.590957494 +0000 UTC m=+225.708398281" watchObservedRunningTime="2026-03-19 11:57:47.592973273 +0000 UTC m=+225.710414060" Mar 19 11:57:47.643726 master-0 kubenswrapper[7642]: I0319 11:57:47.643309 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" podStartSLOduration=3.4905858690000002 podStartE2EDuration="27.643289289s" podCreationTimestamp="2026-03-19 11:57:20 +0000 UTC" firstStartedPulling="2026-03-19 11:57:22.191349266 +0000 UTC m=+200.308790053" lastFinishedPulling="2026-03-19 11:57:46.344052676 
+0000 UTC m=+224.461493473" observedRunningTime="2026-03-19 11:57:47.642431874 +0000 UTC m=+225.759872661" watchObservedRunningTime="2026-03-19 11:57:47.643289289 +0000 UTC m=+225.760730066" Mar 19 11:57:47.675882 master-0 kubenswrapper[7642]: I0319 11:57:47.674964 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" podStartSLOduration=21.674940707 podStartE2EDuration="21.674940707s" podCreationTimestamp="2026-03-19 11:57:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:47.671255899 +0000 UTC m=+225.788696706" watchObservedRunningTime="2026-03-19 11:57:47.674940707 +0000 UTC m=+225.792381494" Mar 19 11:57:47.718695 master-0 kubenswrapper[7642]: I0319 11:57:47.714594 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" podStartSLOduration=3.088649051 podStartE2EDuration="27.714563199s" podCreationTimestamp="2026-03-19 11:57:20 +0000 UTC" firstStartedPulling="2026-03-19 11:57:21.77991541 +0000 UTC m=+199.897356187" lastFinishedPulling="2026-03-19 11:57:46.405829548 +0000 UTC m=+224.523270335" observedRunningTime="2026-03-19 11:57:47.709074578 +0000 UTC m=+225.826515375" watchObservedRunningTime="2026-03-19 11:57:47.714563199 +0000 UTC m=+225.832003986" Mar 19 11:57:47.718695 master-0 kubenswrapper[7642]: I0319 11:57:47.717431 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" Mar 19 11:57:47.766317 master-0 kubenswrapper[7642]: I0319 11:57:47.762489 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" podStartSLOduration=3.54963278 podStartE2EDuration="27.762372751s" podCreationTimestamp="2026-03-19 11:57:20 +0000 UTC" firstStartedPulling="2026-03-19 11:57:22.192480899 +0000 UTC m=+200.309921686" lastFinishedPulling="2026-03-19 11:57:46.40522087 +0000 UTC m=+224.522661657" observedRunningTime="2026-03-19 11:57:47.754070908 +0000 UTC m=+225.871511705" watchObservedRunningTime="2026-03-19 11:57:47.762372751 +0000 UTC m=+225.879813538" Mar 19 11:57:47.791388 master-0 kubenswrapper[7642]: I0319 11:57:47.785976 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" podStartSLOduration=3.580676961 podStartE2EDuration="27.785952083s" podCreationTimestamp="2026-03-19 11:57:20 +0000 UTC" firstStartedPulling="2026-03-19 11:57:22.140717051 +0000 UTC m=+200.258157848" lastFinishedPulling="2026-03-19 11:57:46.345992183 +0000 UTC m=+224.463432970" observedRunningTime="2026-03-19 11:57:47.784084348 +0000 UTC m=+225.901525145" watchObservedRunningTime="2026-03-19 11:57:47.785952083 +0000 UTC m=+225.903392870" Mar 19 11:57:47.811664 master-0 kubenswrapper[7642]: I0319 11:57:47.806488 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-config\") pod \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " Mar 19 11:57:47.811664 master-0 kubenswrapper[7642]: I0319 11:57:47.806686 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-machine-approver-tls\") pod \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " Mar 19 11:57:47.811664 master-0 kubenswrapper[7642]: I0319 11:57:47.806743 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-auth-proxy-config\") pod \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " Mar 19 11:57:47.811664 master-0 kubenswrapper[7642]: I0319 11:57:47.806784 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk6fj\" (UniqueName: \"kubernetes.io/projected/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-kube-api-access-kk6fj\") pod \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\" (UID: \"920d5a32-1dd2-4d45-8319-1aca1ffd48b5\") " Mar 19 11:57:47.811664 master-0 kubenswrapper[7642]: I0319 11:57:47.808370 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-config" (OuterVolumeSpecName: "config") pod "920d5a32-1dd2-4d45-8319-1aca1ffd48b5" (UID: "920d5a32-1dd2-4d45-8319-1aca1ffd48b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:57:47.811990 master-0 kubenswrapper[7642]: I0319 11:57:47.811879 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "920d5a32-1dd2-4d45-8319-1aca1ffd48b5" (UID: "920d5a32-1dd2-4d45-8319-1aca1ffd48b5"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:57:47.826364 master-0 kubenswrapper[7642]: I0319 11:57:47.826298 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "920d5a32-1dd2-4d45-8319-1aca1ffd48b5" (UID: "920d5a32-1dd2-4d45-8319-1aca1ffd48b5"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:57:47.826467 master-0 kubenswrapper[7642]: I0319 11:57:47.826432 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-kube-api-access-kk6fj" (OuterVolumeSpecName: "kube-api-access-kk6fj") pod "920d5a32-1dd2-4d45-8319-1aca1ffd48b5" (UID: "920d5a32-1dd2-4d45-8319-1aca1ffd48b5"). InnerVolumeSpecName "kube-api-access-kk6fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:57:47.909128 master-0 kubenswrapper[7642]: I0319 11:57:47.908345 7642 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:47.909128 master-0 kubenswrapper[7642]: I0319 11:57:47.908426 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk6fj\" (UniqueName: \"kubernetes.io/projected/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-kube-api-access-kk6fj\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:47.909128 master-0 kubenswrapper[7642]: I0319 11:57:47.908444 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:47.909128 master-0 kubenswrapper[7642]: I0319 11:57:47.908457 7642 reconciler_common.go:293] "Volume detached for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/920d5a32-1dd2-4d45-8319-1aca1ffd48b5-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:48.530582 master-0 kubenswrapper[7642]: I0319 11:57:48.530520 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" event={"ID":"f7efc158-8153-421f-b4a3-c9202212faee","Type":"ContainerStarted","Data":"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715"} Mar 19 11:57:48.531468 master-0 kubenswrapper[7642]: I0319 11:57:48.530743 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="cluster-cloud-controller-manager" containerID="cri-o://75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b" gracePeriod=30 Mar 19 11:57:48.531468 master-0 kubenswrapper[7642]: I0319 11:57:48.531409 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="kube-rbac-proxy" containerID="cri-o://e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715" gracePeriod=30 Mar 19 11:57:48.531468 master-0 kubenswrapper[7642]: I0319 11:57:48.531461 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="config-sync-controllers" containerID="cri-o://b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a" gracePeriod=30 Mar 19 11:57:48.534936 master-0 kubenswrapper[7642]: I0319 11:57:48.534901 7642 generic.go:334] "Generic 
(PLEG): container finished" podID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" containerID="e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1" exitCode=0 Mar 19 11:57:48.534936 master-0 kubenswrapper[7642]: I0319 11:57:48.534927 7642 generic.go:334] "Generic (PLEG): container finished" podID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" containerID="092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce" exitCode=0 Mar 19 11:57:48.535578 master-0 kubenswrapper[7642]: I0319 11:57:48.535555 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" Mar 19 11:57:48.536047 master-0 kubenswrapper[7642]: I0319 11:57:48.535957 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" event={"ID":"920d5a32-1dd2-4d45-8319-1aca1ffd48b5","Type":"ContainerDied","Data":"e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1"} Mar 19 11:57:48.536047 master-0 kubenswrapper[7642]: I0319 11:57:48.535989 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" event={"ID":"920d5a32-1dd2-4d45-8319-1aca1ffd48b5","Type":"ContainerDied","Data":"092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce"} Mar 19 11:57:48.536047 master-0 kubenswrapper[7642]: I0319 11:57:48.536000 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x" event={"ID":"920d5a32-1dd2-4d45-8319-1aca1ffd48b5","Type":"ContainerDied","Data":"afe4f76bba4ac6cbbf404f5b90d9dfd338af36f437ee01db3628f61ee2387f72"} Mar 19 11:57:48.536047 master-0 kubenswrapper[7642]: I0319 11:57:48.536020 7642 scope.go:117] "RemoveContainer" containerID="e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1" Mar 19 11:57:48.554988 master-0 kubenswrapper[7642]: I0319 11:57:48.554945 7642 
scope.go:117] "RemoveContainer" containerID="092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce" Mar 19 11:57:48.570133 master-0 kubenswrapper[7642]: I0319 11:57:48.570065 7642 scope.go:117] "RemoveContainer" containerID="e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1" Mar 19 11:57:48.570624 master-0 kubenswrapper[7642]: E0319 11:57:48.570588 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1\": container with ID starting with e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1 not found: ID does not exist" containerID="e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1" Mar 19 11:57:48.570717 master-0 kubenswrapper[7642]: I0319 11:57:48.570679 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1"} err="failed to get container status \"e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1\": rpc error: code = NotFound desc = could not find container \"e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1\": container with ID starting with e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1 not found: ID does not exist" Mar 19 11:57:48.570717 master-0 kubenswrapper[7642]: I0319 11:57:48.570702 7642 scope.go:117] "RemoveContainer" containerID="092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce" Mar 19 11:57:48.570931 master-0 kubenswrapper[7642]: E0319 11:57:48.570899 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce\": container with ID starting with 092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce not found: ID does not exist" 
containerID="092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce" Mar 19 11:57:48.570931 master-0 kubenswrapper[7642]: I0319 11:57:48.570924 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce"} err="failed to get container status \"092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce\": rpc error: code = NotFound desc = could not find container \"092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce\": container with ID starting with 092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce not found: ID does not exist" Mar 19 11:57:48.571003 master-0 kubenswrapper[7642]: I0319 11:57:48.570939 7642 scope.go:117] "RemoveContainer" containerID="e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1" Mar 19 11:57:48.571183 master-0 kubenswrapper[7642]: I0319 11:57:48.571146 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1"} err="failed to get container status \"e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1\": rpc error: code = NotFound desc = could not find container \"e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1\": container with ID starting with e615f017d5e7abca9c17903041980238c33b89534f26d7b17b68355fbdd91fe1 not found: ID does not exist" Mar 19 11:57:48.571254 master-0 kubenswrapper[7642]: I0319 11:57:48.571187 7642 scope.go:117] "RemoveContainer" containerID="092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce" Mar 19 11:57:48.571481 master-0 kubenswrapper[7642]: I0319 11:57:48.571444 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce"} err="failed to get container status 
\"092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce\": rpc error: code = NotFound desc = could not find container \"092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce\": container with ID starting with 092c388699ba6d08eef68e50d3ad5a8022ce2044a68d6ffe95a4a60b020d4fce not found: ID does not exist" Mar 19 11:57:48.624987 master-0 kubenswrapper[7642]: I0319 11:57:48.624902 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" podStartSLOduration=2.986089604 podStartE2EDuration="28.624876646s" podCreationTimestamp="2026-03-19 11:57:20 +0000 UTC" firstStartedPulling="2026-03-19 11:57:20.764386568 +0000 UTC m=+198.881827355" lastFinishedPulling="2026-03-19 11:57:46.40317361 +0000 UTC m=+224.520614397" observedRunningTime="2026-03-19 11:57:48.617564382 +0000 UTC m=+226.735005189" watchObservedRunningTime="2026-03-19 11:57:48.624876646 +0000 UTC m=+226.742317433" Mar 19 11:57:48.660596 master-0 kubenswrapper[7642]: I0319 11:57:48.660535 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x"] Mar 19 11:57:48.690270 master-0 kubenswrapper[7642]: I0319 11:57:48.690183 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-6cb57bb5db-47c9x"] Mar 19 11:57:48.725048 master-0 kubenswrapper[7642]: I0319 11:57:48.725000 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" Mar 19 11:57:48.787504 master-0 kubenswrapper[7642]: I0319 11:57:48.786615 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m"] Mar 19 11:57:48.790570 master-0 kubenswrapper[7642]: E0319 11:57:48.788024 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" containerName="machine-approver-controller" Mar 19 11:57:48.790794 master-0 kubenswrapper[7642]: I0319 11:57:48.790776 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" containerName="machine-approver-controller" Mar 19 11:57:48.790867 master-0 kubenswrapper[7642]: E0319 11:57:48.790857 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="kube-rbac-proxy" Mar 19 11:57:48.790918 master-0 kubenswrapper[7642]: I0319 11:57:48.790909 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="kube-rbac-proxy" Mar 19 11:57:48.790983 master-0 kubenswrapper[7642]: E0319 11:57:48.790973 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="config-sync-controllers" Mar 19 11:57:48.791032 master-0 kubenswrapper[7642]: I0319 11:57:48.791023 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="config-sync-controllers" Mar 19 11:57:48.791175 master-0 kubenswrapper[7642]: E0319 11:57:48.791165 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" containerName="kube-rbac-proxy" Mar 19 11:57:48.791250 master-0 kubenswrapper[7642]: I0319 11:57:48.791238 7642 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" containerName="kube-rbac-proxy" Mar 19 11:57:48.791358 master-0 kubenswrapper[7642]: E0319 11:57:48.791341 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="cluster-cloud-controller-manager" Mar 19 11:57:48.791424 master-0 kubenswrapper[7642]: I0319 11:57:48.791414 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="cluster-cloud-controller-manager" Mar 19 11:57:48.791653 master-0 kubenswrapper[7642]: I0319 11:57:48.791607 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="config-sync-controllers" Mar 19 11:57:48.791728 master-0 kubenswrapper[7642]: I0319 11:57:48.791718 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" containerName="kube-rbac-proxy" Mar 19 11:57:48.791798 master-0 kubenswrapper[7642]: I0319 11:57:48.791789 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="cluster-cloud-controller-manager" Mar 19 11:57:48.791853 master-0 kubenswrapper[7642]: I0319 11:57:48.791843 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" containerName="machine-approver-controller" Mar 19 11:57:48.791909 master-0 kubenswrapper[7642]: I0319 11:57:48.791901 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7efc158-8153-421f-b4a3-c9202212faee" containerName="kube-rbac-proxy" Mar 19 11:57:48.792710 master-0 kubenswrapper[7642]: I0319 11:57:48.792692 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:48.795343 master-0 kubenswrapper[7642]: I0319 11:57:48.794866 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-h5fcf" Mar 19 11:57:48.795343 master-0 kubenswrapper[7642]: I0319 11:57:48.795290 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 11:57:48.797448 master-0 kubenswrapper[7642]: I0319 11:57:48.797425 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 11:57:48.798062 master-0 kubenswrapper[7642]: I0319 11:57:48.797768 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 11:57:48.798062 master-0 kubenswrapper[7642]: I0319 11:57:48.797769 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 11:57:48.798062 master-0 kubenswrapper[7642]: I0319 11:57:48.797957 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 11:57:48.823250 master-0 kubenswrapper[7642]: I0319 11:57:48.822730 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-auth-proxy-config\") pod \"f7efc158-8153-421f-b4a3-c9202212faee\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " Mar 19 11:57:48.823250 master-0 kubenswrapper[7642]: I0319 11:57:48.822928 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5p9d\" (UniqueName: \"kubernetes.io/projected/f7efc158-8153-421f-b4a3-c9202212faee-kube-api-access-k5p9d\") pod 
\"f7efc158-8153-421f-b4a3-c9202212faee\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " Mar 19 11:57:48.823250 master-0 kubenswrapper[7642]: I0319 11:57:48.822956 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-images\") pod \"f7efc158-8153-421f-b4a3-c9202212faee\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " Mar 19 11:57:48.823250 master-0 kubenswrapper[7642]: I0319 11:57:48.823003 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f7efc158-8153-421f-b4a3-c9202212faee-host-etc-kube\") pod \"f7efc158-8153-421f-b4a3-c9202212faee\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " Mar 19 11:57:48.823250 master-0 kubenswrapper[7642]: I0319 11:57:48.823041 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/f7efc158-8153-421f-b4a3-c9202212faee-cloud-controller-manager-operator-tls\") pod \"f7efc158-8153-421f-b4a3-c9202212faee\" (UID: \"f7efc158-8153-421f-b4a3-c9202212faee\") " Mar 19 11:57:48.824164 master-0 kubenswrapper[7642]: I0319 11:57:48.823857 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7efc158-8153-421f-b4a3-c9202212faee-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "f7efc158-8153-421f-b4a3-c9202212faee" (UID: "f7efc158-8153-421f-b4a3-c9202212faee"). InnerVolumeSpecName "host-etc-kube". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:57:48.824865 master-0 kubenswrapper[7642]: I0319 11:57:48.824704 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "f7efc158-8153-421f-b4a3-c9202212faee" (UID: "f7efc158-8153-421f-b4a3-c9202212faee"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:57:48.824865 master-0 kubenswrapper[7642]: I0319 11:57:48.824773 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-images" (OuterVolumeSpecName: "images") pod "f7efc158-8153-421f-b4a3-c9202212faee" (UID: "f7efc158-8153-421f-b4a3-c9202212faee"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:57:48.827468 master-0 kubenswrapper[7642]: I0319 11:57:48.827427 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7efc158-8153-421f-b4a3-c9202212faee-kube-api-access-k5p9d" (OuterVolumeSpecName: "kube-api-access-k5p9d") pod "f7efc158-8153-421f-b4a3-c9202212faee" (UID: "f7efc158-8153-421f-b4a3-c9202212faee"). InnerVolumeSpecName "kube-api-access-k5p9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:57:48.827664 master-0 kubenswrapper[7642]: I0319 11:57:48.827600 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7efc158-8153-421f-b4a3-c9202212faee-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "f7efc158-8153-421f-b4a3-c9202212faee" (UID: "f7efc158-8153-421f-b4a3-c9202212faee"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:57:48.924559 master-0 kubenswrapper[7642]: I0319 11:57:48.924489 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5cc5\" (UniqueName: \"kubernetes.io/projected/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-kube-api-access-v5cc5\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:48.924920 master-0 kubenswrapper[7642]: I0319 11:57:48.924904 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-auth-proxy-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:48.925032 master-0 kubenswrapper[7642]: I0319 11:57:48.925020 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-machine-approver-tls\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:48.925132 master-0 kubenswrapper[7642]: I0319 11:57:48.925117 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:48.925233 master-0 kubenswrapper[7642]: I0319 11:57:48.925222 7642 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-k5p9d\" (UniqueName: \"kubernetes.io/projected/f7efc158-8153-421f-b4a3-c9202212faee-kube-api-access-k5p9d\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:48.925327 master-0 kubenswrapper[7642]: I0319 11:57:48.925310 7642 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-images\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:48.925400 master-0 kubenswrapper[7642]: I0319 11:57:48.925389 7642 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f7efc158-8153-421f-b4a3-c9202212faee-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:48.925461 master-0 kubenswrapper[7642]: I0319 11:57:48.925450 7642 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/f7efc158-8153-421f-b4a3-c9202212faee-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:48.925528 master-0 kubenswrapper[7642]: I0319 11:57:48.925516 7642 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f7efc158-8153-421f-b4a3-c9202212faee-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:49.026791 master-0 kubenswrapper[7642]: I0319 11:57:49.026727 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-auth-proxy-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:49.027121 master-0 kubenswrapper[7642]: I0319 11:57:49.027102 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-machine-approver-tls\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:49.027226 master-0 kubenswrapper[7642]: I0319 11:57:49.027213 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:49.027887 master-0 kubenswrapper[7642]: I0319 11:57:49.027870 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5cc5\" (UniqueName: \"kubernetes.io/projected/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-kube-api-access-v5cc5\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:49.028218 master-0 kubenswrapper[7642]: I0319 11:57:49.027819 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:49.028318 master-0 kubenswrapper[7642]: I0319 11:57:49.027603 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-auth-proxy-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:49.030652 master-0 
kubenswrapper[7642]: I0319 11:57:49.030597 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-machine-approver-tls\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:49.048321 master-0 kubenswrapper[7642]: I0319 11:57:49.048269 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5cc5\" (UniqueName: \"kubernetes.io/projected/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-kube-api-access-v5cc5\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:49.114116 master-0 kubenswrapper[7642]: I0319 11:57:49.114036 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 11:57:49.556494 master-0 kubenswrapper[7642]: I0319 11:57:49.555390 7642 generic.go:334] "Generic (PLEG): container finished" podID="f7efc158-8153-421f-b4a3-c9202212faee" containerID="e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715" exitCode=0 Mar 19 11:57:49.556494 master-0 kubenswrapper[7642]: I0319 11:57:49.555442 7642 generic.go:334] "Generic (PLEG): container finished" podID="f7efc158-8153-421f-b4a3-c9202212faee" containerID="b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a" exitCode=0 Mar 19 11:57:49.556494 master-0 kubenswrapper[7642]: I0319 11:57:49.555453 7642 generic.go:334] "Generic (PLEG): container finished" podID="f7efc158-8153-421f-b4a3-c9202212faee" containerID="75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b" exitCode=0 Mar 19 11:57:49.556494 master-0 kubenswrapper[7642]: I0319 11:57:49.555981 7642 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" Mar 19 11:57:49.557737 master-0 kubenswrapper[7642]: I0319 11:57:49.557483 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" event={"ID":"f7efc158-8153-421f-b4a3-c9202212faee","Type":"ContainerDied","Data":"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715"} Mar 19 11:57:49.557737 master-0 kubenswrapper[7642]: I0319 11:57:49.557539 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" event={"ID":"f7efc158-8153-421f-b4a3-c9202212faee","Type":"ContainerDied","Data":"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a"} Mar 19 11:57:49.557737 master-0 kubenswrapper[7642]: I0319 11:57:49.557557 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" event={"ID":"f7efc158-8153-421f-b4a3-c9202212faee","Type":"ContainerDied","Data":"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b"} Mar 19 11:57:49.557737 master-0 kubenswrapper[7642]: I0319 11:57:49.557572 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd" event={"ID":"f7efc158-8153-421f-b4a3-c9202212faee","Type":"ContainerDied","Data":"f16fa2b1bd6447c738ceed72fb3f88e2811bff9396e5490d532024732e61132d"} Mar 19 11:57:49.557737 master-0 kubenswrapper[7642]: I0319 11:57:49.557592 7642 scope.go:117] "RemoveContainer" containerID="e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715" Mar 19 11:57:49.564122 master-0 kubenswrapper[7642]: I0319 11:57:49.564014 7642 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" event={"ID":"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a","Type":"ContainerStarted","Data":"ac74b38fd11c535ddb03b556aacc67450651b8de55a732a49658d786ebe0a734"} Mar 19 11:57:49.564122 master-0 kubenswrapper[7642]: I0319 11:57:49.564119 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" event={"ID":"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a","Type":"ContainerStarted","Data":"4f20fa38da6728fdc70e1b66426e311b675e090aa8e87afaac1c37a5210cd1c8"} Mar 19 11:57:49.579023 master-0 kubenswrapper[7642]: I0319 11:57:49.578844 7642 scope.go:117] "RemoveContainer" containerID="b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a" Mar 19 11:57:49.595902 master-0 kubenswrapper[7642]: I0319 11:57:49.595011 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"] Mar 19 11:57:49.598493 master-0 kubenswrapper[7642]: I0319 11:57:49.598445 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7559f7c68c-5mjdd"] Mar 19 11:57:49.613509 master-0 kubenswrapper[7642]: I0319 11:57:49.613460 7642 scope.go:117] "RemoveContainer" containerID="75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b" Mar 19 11:57:49.634073 master-0 kubenswrapper[7642]: I0319 11:57:49.633537 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2"] Mar 19 11:57:49.636234 master-0 kubenswrapper[7642]: I0319 11:57:49.636202 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.639502 master-0 kubenswrapper[7642]: I0319 11:57:49.639441 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 19 11:57:49.639618 master-0 kubenswrapper[7642]: I0319 11:57:49.639505 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-8zr85" Mar 19 11:57:49.639714 master-0 kubenswrapper[7642]: I0319 11:57:49.639661 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 19 11:57:49.639857 master-0 kubenswrapper[7642]: I0319 11:57:49.639826 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 19 11:57:49.639912 master-0 kubenswrapper[7642]: I0319 11:57:49.639454 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 19 11:57:49.642627 master-0 kubenswrapper[7642]: I0319 11:57:49.642591 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 11:57:49.659536 master-0 kubenswrapper[7642]: I0319 11:57:49.659489 7642 scope.go:117] "RemoveContainer" containerID="e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715" Mar 19 11:57:49.661918 master-0 kubenswrapper[7642]: E0319 11:57:49.661882 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715\": container with ID starting with 
e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715 not found: ID does not exist" containerID="e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715" Mar 19 11:57:49.662024 master-0 kubenswrapper[7642]: I0319 11:57:49.661924 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715"} err="failed to get container status \"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715\": rpc error: code = NotFound desc = could not find container \"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715\": container with ID starting with e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715 not found: ID does not exist" Mar 19 11:57:49.662024 master-0 kubenswrapper[7642]: I0319 11:57:49.661951 7642 scope.go:117] "RemoveContainer" containerID="b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a" Mar 19 11:57:49.662706 master-0 kubenswrapper[7642]: E0319 11:57:49.662656 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a\": container with ID starting with b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a not found: ID does not exist" containerID="b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a" Mar 19 11:57:49.662706 master-0 kubenswrapper[7642]: I0319 11:57:49.662697 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a"} err="failed to get container status \"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a\": rpc error: code = NotFound desc = could not find container \"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a\": container with ID starting with 
b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a not found: ID does not exist" Mar 19 11:57:49.662840 master-0 kubenswrapper[7642]: I0319 11:57:49.662719 7642 scope.go:117] "RemoveContainer" containerID="75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b" Mar 19 11:57:49.663613 master-0 kubenswrapper[7642]: E0319 11:57:49.663581 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b\": container with ID starting with 75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b not found: ID does not exist" containerID="75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b" Mar 19 11:57:49.665555 master-0 kubenswrapper[7642]: I0319 11:57:49.663607 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b"} err="failed to get container status \"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b\": rpc error: code = NotFound desc = could not find container \"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b\": container with ID starting with 75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b not found: ID does not exist" Mar 19 11:57:49.665555 master-0 kubenswrapper[7642]: I0319 11:57:49.663626 7642 scope.go:117] "RemoveContainer" containerID="e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715" Mar 19 11:57:49.665555 master-0 kubenswrapper[7642]: I0319 11:57:49.664203 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715"} err="failed to get container status \"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715\": rpc error: code = NotFound desc = could not find container 
\"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715\": container with ID starting with e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715 not found: ID does not exist" Mar 19 11:57:49.665555 master-0 kubenswrapper[7642]: I0319 11:57:49.664274 7642 scope.go:117] "RemoveContainer" containerID="b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a" Mar 19 11:57:49.665555 master-0 kubenswrapper[7642]: I0319 11:57:49.664668 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a"} err="failed to get container status \"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a\": rpc error: code = NotFound desc = could not find container \"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a\": container with ID starting with b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a not found: ID does not exist" Mar 19 11:57:49.665555 master-0 kubenswrapper[7642]: I0319 11:57:49.664689 7642 scope.go:117] "RemoveContainer" containerID="75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b" Mar 19 11:57:49.665927 master-0 kubenswrapper[7642]: I0319 11:57:49.665868 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b"} err="failed to get container status \"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b\": rpc error: code = NotFound desc = could not find container \"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b\": container with ID starting with 75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b not found: ID does not exist" Mar 19 11:57:49.665988 master-0 kubenswrapper[7642]: I0319 11:57:49.665933 7642 scope.go:117] "RemoveContainer" containerID="e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715" Mar 19 
11:57:49.666400 master-0 kubenswrapper[7642]: I0319 11:57:49.666345 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715"} err="failed to get container status \"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715\": rpc error: code = NotFound desc = could not find container \"e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715\": container with ID starting with e60ca8f3393f45e23abc15521d51261b158fab4679e91a2e6f45b4f1777b3715 not found: ID does not exist" Mar 19 11:57:49.666400 master-0 kubenswrapper[7642]: I0319 11:57:49.666374 7642 scope.go:117] "RemoveContainer" containerID="b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a" Mar 19 11:57:49.666715 master-0 kubenswrapper[7642]: I0319 11:57:49.666601 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a"} err="failed to get container status \"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a\": rpc error: code = NotFound desc = could not find container \"b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a\": container with ID starting with b9d1d1584db52e81f23922a945d38efc7cada6cbbe14c250815260e3ee12c78a not found: ID does not exist" Mar 19 11:57:49.666715 master-0 kubenswrapper[7642]: I0319 11:57:49.666625 7642 scope.go:117] "RemoveContainer" containerID="75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b" Mar 19 11:57:49.667872 master-0 kubenswrapper[7642]: I0319 11:57:49.667824 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b"} err="failed to get container status \"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b\": rpc error: code = NotFound desc = could not find container 
\"75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b\": container with ID starting with 75abaa729664230ee4026fd51eb43c33bfdbc9e99759b4842c563721034d8f2b not found: ID does not exist" Mar 19 11:57:49.743737 master-0 kubenswrapper[7642]: I0319 11:57:49.743373 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.743737 master-0 kubenswrapper[7642]: I0319 11:57:49.743461 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t85jj\" (UniqueName: \"kubernetes.io/projected/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-kube-api-access-t85jj\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.743737 master-0 kubenswrapper[7642]: I0319 11:57:49.743495 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.743737 master-0 kubenswrapper[7642]: I0319 11:57:49.743516 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.743737 master-0 kubenswrapper[7642]: I0319 11:57:49.743540 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.845849 master-0 kubenswrapper[7642]: I0319 11:57:49.845573 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.846108 master-0 kubenswrapper[7642]: I0319 11:57:49.845902 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t85jj\" (UniqueName: \"kubernetes.io/projected/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-kube-api-access-t85jj\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.846108 master-0 kubenswrapper[7642]: I0319 11:57:49.846092 7642 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.846219 master-0 kubenswrapper[7642]: I0319 11:57:49.846132 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.846219 master-0 kubenswrapper[7642]: I0319 11:57:49.846176 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.846950 master-0 kubenswrapper[7642]: I0319 11:57:49.846867 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.847105 master-0 kubenswrapper[7642]: I0319 11:57:49.847053 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.847362 master-0 kubenswrapper[7642]: I0319 11:57:49.847314 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.849132 master-0 kubenswrapper[7642]: I0319 11:57:49.849075 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.867663 master-0 kubenswrapper[7642]: I0319 11:57:49.867604 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t85jj\" (UniqueName: \"kubernetes.io/projected/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-kube-api-access-t85jj\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.966649 master-0 kubenswrapper[7642]: I0319 11:57:49.966572 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 11:57:49.990672 master-0 kubenswrapper[7642]: W0319 11:57:49.990597 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode47d3935_d96e_4fd2_abfc_99bff7eac5e0.slice/crio-67caa8f56a1602e423b6488bd40c30339516056ecee5a3793a01c82042f70327 WatchSource:0}: Error finding container 67caa8f56a1602e423b6488bd40c30339516056ecee5a3793a01c82042f70327: Status 404 returned error can't find the container with id 67caa8f56a1602e423b6488bd40c30339516056ecee5a3793a01c82042f70327 Mar 19 11:57:50.080339 master-0 kubenswrapper[7642]: I0319 11:57:50.080272 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="920d5a32-1dd2-4d45-8319-1aca1ffd48b5" path="/var/lib/kubelet/pods/920d5a32-1dd2-4d45-8319-1aca1ffd48b5/volumes" Mar 19 11:57:50.080933 master-0 kubenswrapper[7642]: I0319 11:57:50.080907 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7efc158-8153-421f-b4a3-c9202212faee" path="/var/lib/kubelet/pods/f7efc158-8153-421f-b4a3-c9202212faee/volumes" Mar 19 11:57:50.477192 master-0 kubenswrapper[7642]: I0319 11:57:50.477093 7642 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 19 11:57:50.477474 master-0 kubenswrapper[7642]: I0319 11:57:50.477438 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" containerID="cri-o://1dadbe52e08f9953877bf3adbfada05dc57d5eba9ca59d9f0aac897d40194638" gracePeriod=30 Mar 19 11:57:50.478178 master-0 kubenswrapper[7642]: I0319 11:57:50.477602 7642 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" containerID="cri-o://78b8de87e8a5c898566cfde9569af5051ed935aae3421d721d98339159c91a61" gracePeriod=30 Mar 19 11:57:50.478654 master-0 kubenswrapper[7642]: I0319 11:57:50.478606 7642 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 11:57:50.479399 master-0 kubenswrapper[7642]: E0319 11:57:50.479340 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 11:57:50.479399 master-0 kubenswrapper[7642]: I0319 11:57:50.479361 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 11:57:50.479399 master-0 kubenswrapper[7642]: E0319 11:57:50.479378 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 11:57:50.479399 master-0 kubenswrapper[7642]: I0319 11:57:50.479386 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 11:57:50.479731 master-0 kubenswrapper[7642]: E0319 11:57:50.479423 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 11:57:50.479731 master-0 kubenswrapper[7642]: I0319 11:57:50.479432 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 11:57:50.479731 master-0 kubenswrapper[7642]: I0319 11:57:50.479562 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 11:57:50.479731 master-0 
kubenswrapper[7642]: I0319 11:57:50.479587 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 11:57:50.479731 master-0 kubenswrapper[7642]: I0319 11:57:50.479605 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="cluster-policy-controller" Mar 19 11:57:50.479980 master-0 kubenswrapper[7642]: E0319 11:57:50.479755 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 11:57:50.479980 master-0 kubenswrapper[7642]: I0319 11:57:50.479767 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 11:57:50.479980 master-0 kubenswrapper[7642]: I0319 11:57:50.479885 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f265536aba6292ead501bc9b49f327" containerName="kube-controller-manager" Mar 19 11:57:50.480821 master-0 kubenswrapper[7642]: I0319 11:57:50.480794 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:57:50.538580 master-0 kubenswrapper[7642]: I0319 11:57:50.538382 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 11:57:50.563565 master-0 kubenswrapper[7642]: I0319 11:57:50.563493 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ed5fb754bce5477588bd28f79b7bee7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:57:50.564078 master-0 kubenswrapper[7642]: I0319 11:57:50.563569 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ed5fb754bce5477588bd28f79b7bee7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:57:50.584992 master-0 kubenswrapper[7642]: I0319 11:57:50.584913 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" event={"ID":"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a","Type":"ContainerStarted","Data":"7fc5e15dbb622b807625a944b8961634a70f1aecfe4e4c996f59e0fe79999af9"} Mar 19 11:57:50.588949 master-0 kubenswrapper[7642]: I0319 11:57:50.588873 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerStarted","Data":"1807478431eec6d4ebe9e175a729aeaa97be5330ccb6e1fb9986d071c23571b4"} Mar 19 11:57:50.589046 master-0 kubenswrapper[7642]: I0319 11:57:50.588969 7642 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerStarted","Data":"79d5dba168afeffdccc6ff9b4390e9029f185749a05da6ead57ad3001b8c3183"} Mar 19 11:57:50.589046 master-0 kubenswrapper[7642]: I0319 11:57:50.588990 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerStarted","Data":"67caa8f56a1602e423b6488bd40c30339516056ecee5a3793a01c82042f70327"} Mar 19 11:57:50.608256 master-0 kubenswrapper[7642]: I0319 11:57:50.608176 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" podStartSLOduration=2.608153979 podStartE2EDuration="2.608153979s" podCreationTimestamp="2026-03-19 11:57:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:50.602184024 +0000 UTC m=+228.719624811" watchObservedRunningTime="2026-03-19 11:57:50.608153979 +0000 UTC m=+228.725594766" Mar 19 11:57:50.662280 master-0 kubenswrapper[7642]: I0319 11:57:50.662207 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 19 11:57:50.666420 master-0 kubenswrapper[7642]: I0319 11:57:50.666078 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ed5fb754bce5477588bd28f79b7bee7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:57:50.666420 master-0 kubenswrapper[7642]: I0319 11:57:50.666191 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ed5fb754bce5477588bd28f79b7bee7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:57:50.666420 master-0 kubenswrapper[7642]: I0319 11:57:50.666321 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ed5fb754bce5477588bd28f79b7bee7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:57:50.666660 master-0 kubenswrapper[7642]: I0319 11:57:50.666501 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"2ed5fb754bce5477588bd28f79b7bee7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:57:50.689081 master-0 kubenswrapper[7642]: I0319 11:57:50.689014 7642 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="79ea4078-814a-4ed7-820f-d18016fefb9b" 
Mar 19 11:57:50.767911 master-0 kubenswrapper[7642]: I0319 11:57:50.767815 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 11:57:50.767911 master-0 kubenswrapper[7642]: I0319 11:57:50.767904 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 11:57:50.767911 master-0 kubenswrapper[7642]: I0319 11:57:50.767923 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 11:57:50.768250 master-0 kubenswrapper[7642]: I0319 11:57:50.768007 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 11:57:50.768250 master-0 kubenswrapper[7642]: I0319 11:57:50.768056 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") pod \"46f265536aba6292ead501bc9b49f327\" (UID: \"46f265536aba6292ead501bc9b49f327\") " Mar 19 11:57:50.768436 master-0 kubenswrapper[7642]: I0319 11:57:50.768414 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets" (OuterVolumeSpecName: "secrets") pod 
"46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:57:50.768491 master-0 kubenswrapper[7642]: I0319 11:57:50.768457 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:57:50.768491 master-0 kubenswrapper[7642]: I0319 11:57:50.768480 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs" (OuterVolumeSpecName: "logs") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:57:50.768584 master-0 kubenswrapper[7642]: I0319 11:57:50.768496 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config" (OuterVolumeSpecName: "config") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:57:50.768584 master-0 kubenswrapper[7642]: I0319 11:57:50.768518 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "46f265536aba6292ead501bc9b49f327" (UID: "46f265536aba6292ead501bc9b49f327"). InnerVolumeSpecName "etc-kubernetes-cloud". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:57:50.835368 master-0 kubenswrapper[7642]: I0319 11:57:50.835309 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:57:50.857102 master-0 kubenswrapper[7642]: W0319 11:57:50.857039 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed5fb754bce5477588bd28f79b7bee7.slice/crio-7146690142f28d4686e2cc67fa293c2eaa3b3b7192da93637dc526fe5b56f361 WatchSource:0}: Error finding container 7146690142f28d4686e2cc67fa293c2eaa3b3b7192da93637dc526fe5b56f361: Status 404 returned error can't find the container with id 7146690142f28d4686e2cc67fa293c2eaa3b3b7192da93637dc526fe5b56f361 Mar 19 11:57:50.870570 master-0 kubenswrapper[7642]: I0319 11:57:50.870503 7642 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:50.870570 master-0 kubenswrapper[7642]: I0319 11:57:50.870562 7642 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-secrets\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:50.870767 master-0 kubenswrapper[7642]: I0319 11:57:50.870576 7642 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:50.870767 master-0 kubenswrapper[7642]: I0319 11:57:50.870594 7642 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 11:57:50.870767 master-0 kubenswrapper[7642]: I0319 
11:57:50.870608 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/46f265536aba6292ead501bc9b49f327-config\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:51.608497 master-0 kubenswrapper[7642]: I0319 11:57:51.608315 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ed5fb754bce5477588bd28f79b7bee7","Type":"ContainerStarted","Data":"9b8698d443a661dc8c25e79e56fd0a5be94da1dd2f3e5548ce87e282a4e8855f"}
Mar 19 11:57:51.608497 master-0 kubenswrapper[7642]: I0319 11:57:51.608391 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ed5fb754bce5477588bd28f79b7bee7","Type":"ContainerStarted","Data":"7146690142f28d4686e2cc67fa293c2eaa3b3b7192da93637dc526fe5b56f361"}
Mar 19 11:57:51.616186 master-0 kubenswrapper[7642]: I0319 11:57:51.609925 7642 generic.go:334] "Generic (PLEG): container finished" podID="18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" containerID="882b540d45f96873ea08eb024547ab884cf7ba3eaac3163e6d0d3525a76f6dd5" exitCode=0
Mar 19 11:57:51.616186 master-0 kubenswrapper[7642]: I0319 11:57:51.610007 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec","Type":"ContainerDied","Data":"882b540d45f96873ea08eb024547ab884cf7ba3eaac3163e6d0d3525a76f6dd5"}
Mar 19 11:57:51.616186 master-0 kubenswrapper[7642]: I0319 11:57:51.613754 7642 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327" containerID="78b8de87e8a5c898566cfde9569af5051ed935aae3421d721d98339159c91a61" exitCode=0
Mar 19 11:57:51.616186 master-0 kubenswrapper[7642]: I0319 11:57:51.613802 7642 generic.go:334] "Generic (PLEG): container finished" podID="46f265536aba6292ead501bc9b49f327"
containerID="1dadbe52e08f9953877bf3adbfada05dc57d5eba9ca59d9f0aac897d40194638" exitCode=0
Mar 19 11:57:51.616186 master-0 kubenswrapper[7642]: I0319 11:57:51.613911 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e2ac089de599dfd6512e3d47cbeeed8ac50e52fa04720c3256b653ca97af81a"
Mar 19 11:57:51.616186 master-0 kubenswrapper[7642]: I0319 11:57:51.613924 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 19 11:57:51.616186 master-0 kubenswrapper[7642]: I0319 11:57:51.613940 7642 scope.go:117] "RemoveContainer" containerID="0e092e68e51dc8d746441712e9315d094b81fce2d278dfd7bbb413252f403787"
Mar 19 11:57:51.618538 master-0 kubenswrapper[7642]: I0319 11:57:51.618479 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerStarted","Data":"35e09c0adb6fc12f6eca598398ddbcf71d34e7200dffbfb6bdf2fe668ebaf179"}
Mar 19 11:57:52.075969 master-0 kubenswrapper[7642]: I0319 11:57:52.075902 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f265536aba6292ead501bc9b49f327" path="/var/lib/kubelet/pods/46f265536aba6292ead501bc9b49f327/volumes"
Mar 19 11:57:52.076501 master-0 kubenswrapper[7642]: I0319 11:57:52.076263 7642 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID=""
Mar 19 11:57:52.825372 master-0 kubenswrapper[7642]: I0319 11:57:52.825312 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ed5fb754bce5477588bd28f79b7bee7","Type":"ContainerStarted","Data":"1d8090aa91142bfe2d9ec6028d840d4af95fbe15328004054e0a0a8e1a71dfd6"}
Mar 19 11:57:52.825372 master-0 kubenswrapper[7642]: I0319 11:57:52.825372
7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ed5fb754bce5477588bd28f79b7bee7","Type":"ContainerStarted","Data":"203765ba18c88a4bb80613cce48098f24f5378802947c4073ee2b353a34707ac"}
Mar 19 11:57:52.825936 master-0 kubenswrapper[7642]: I0319 11:57:52.825392 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 19 11:57:52.825936 master-0 kubenswrapper[7642]: I0319 11:57:52.825411 7642 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="79ea4078-814a-4ed7-820f-d18016fefb9b"
Mar 19 11:57:52.832290 master-0 kubenswrapper[7642]: I0319 11:57:52.832047 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 19 11:57:52.832290 master-0 kubenswrapper[7642]: I0319 11:57:52.832092 7642 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="79ea4078-814a-4ed7-820f-d18016fefb9b"
Mar 19 11:57:52.967561 master-0 kubenswrapper[7642]: I0319 11:57:52.967514 7642 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:52.984879 master-0 kubenswrapper[7642]: I0319 11:57:52.984766 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" podStartSLOduration=3.984745046 podStartE2EDuration="3.984745046s" podCreationTimestamp="2026-03-19 11:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:52.856710521 +0000 UTC m=+230.974151318" watchObservedRunningTime="2026-03-19 11:57:52.984745046 +0000 UTC m=+231.102185833"
Mar 19 11:57:53.123485 master-0 kubenswrapper[7642]: I0319 11:57:53.123423 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-var-lock\") pod \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") "
Mar 19 11:57:53.123485 master-0 kubenswrapper[7642]: I0319 11:57:53.123499 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kube-api-access\") pod \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") "
Mar 19 11:57:53.123811 master-0 kubenswrapper[7642]: I0319 11:57:53.123527 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kubelet-dir\") pod \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\" (UID: \"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec\") "
Mar 19 11:57:53.123811 master-0 kubenswrapper[7642]: I0319 11:57:53.123557 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-var-lock" (OuterVolumeSpecName: "var-lock") pod "18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" (UID: "18c0e2ac-b5c3-447d-96b4-37d7d0d733ec"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:57:53.123811 master-0 kubenswrapper[7642]: I0319 11:57:53.123776 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:53.123811 master-0 kubenswrapper[7642]: I0319 11:57:53.123806 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" (UID: "18c0e2ac-b5c3-447d-96b4-37d7d0d733ec"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 11:57:53.126972 master-0 kubenswrapper[7642]: I0319 11:57:53.126916 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" (UID: "18c0e2ac-b5c3-447d-96b4-37d7d0d733ec"). InnerVolumeSpecName "kube-api-access".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:57:53.225852 master-0 kubenswrapper[7642]: I0319 11:57:53.225712 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:53.225852 master-0 kubenswrapper[7642]: I0319 11:57:53.225801 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c0e2ac-b5c3-447d-96b4-37d7d0d733ec-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:53.653621 master-0 kubenswrapper[7642]: I0319 11:57:53.653555 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ed5fb754bce5477588bd28f79b7bee7","Type":"ContainerStarted","Data":"c67f49e1211ff300919ba996552c5270a8c7964fe689dec11a1c7473a2f29c42"}
Mar 19 11:57:53.655961 master-0 kubenswrapper[7642]: I0319 11:57:53.655935 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec","Type":"ContainerDied","Data":"23e0fe7ade4ac6399005059c86578e590b8718c5b7857c0295aaa5d158549561"}
Mar 19 11:57:53.655961 master-0 kubenswrapper[7642]: I0319 11:57:53.655960 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23e0fe7ade4ac6399005059c86578e590b8718c5b7857c0295aaa5d158549561"
Mar 19 11:57:53.656224 master-0 kubenswrapper[7642]: I0319 11:57:53.656194 7642 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 11:57:53.696666 master-0 kubenswrapper[7642]: I0319 11:57:53.691525 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=3.691497513 podStartE2EDuration="3.691497513s" podCreationTimestamp="2026-03-19 11:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:57:53.689771562 +0000 UTC m=+231.807212359" watchObservedRunningTime="2026-03-19 11:57:53.691497513 +0000 UTC m=+231.808938310"
Mar 19 11:57:54.670012 master-0 kubenswrapper[7642]: I0319 11:57:54.669933 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-z7wgn_5ae2d717-506b-4d8e-803b-689c1cc3557c/multus-admission-controller/0.log"
Mar 19 11:57:54.670592 master-0 kubenswrapper[7642]: I0319 11:57:54.670017 7642 generic.go:334] "Generic (PLEG): container finished" podID="5ae2d717-506b-4d8e-803b-689c1cc3557c" containerID="7df327773d76aaacf805979906c8e4aa2f22ce6b1887fb5d42df0fd26e17a8b7" exitCode=137
Mar 19 11:57:54.670791 master-0 kubenswrapper[7642]: I0319 11:57:54.670718 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" event={"ID":"5ae2d717-506b-4d8e-803b-689c1cc3557c","Type":"ContainerDied","Data":"7df327773d76aaacf805979906c8e4aa2f22ce6b1887fb5d42df0fd26e17a8b7"}
Mar 19 11:57:54.670912 master-0 kubenswrapper[7642]: I0319 11:57:54.670894 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn" event={"ID":"5ae2d717-506b-4d8e-803b-689c1cc3557c","Type":"ContainerDied","Data":"6e6934bb1122fb83174516e208b7a3ad1b4497aff55eda2c80c1a4c886b31150"}
Mar 19 11:57:54.670984 master-0 kubenswrapper[7642]: I0319
11:57:54.670971 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e6934bb1122fb83174516e208b7a3ad1b4497aff55eda2c80c1a4c886b31150"
Mar 19 11:57:54.696666 master-0 kubenswrapper[7642]: I0319 11:57:54.694776 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5dbbb8b86f-z7wgn_5ae2d717-506b-4d8e-803b-689c1cc3557c/multus-admission-controller/0.log"
Mar 19 11:57:54.696666 master-0 kubenswrapper[7642]: I0319 11:57:54.694862 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:57:54.848002 master-0 kubenswrapper[7642]: I0319 11:57:54.847908 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") pod \"5ae2d717-506b-4d8e-803b-689c1cc3557c\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") "
Mar 19 11:57:54.848002 master-0 kubenswrapper[7642]: I0319 11:57:54.848012 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2v9q\" (UniqueName: \"kubernetes.io/projected/5ae2d717-506b-4d8e-803b-689c1cc3557c-kube-api-access-s2v9q\") pod \"5ae2d717-506b-4d8e-803b-689c1cc3557c\" (UID: \"5ae2d717-506b-4d8e-803b-689c1cc3557c\") "
Mar 19 11:57:54.853590 master-0 kubenswrapper[7642]: I0319 11:57:54.852321 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "5ae2d717-506b-4d8e-803b-689c1cc3557c" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c"). InnerVolumeSpecName "webhook-certs".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:57:54.853590 master-0 kubenswrapper[7642]: I0319 11:57:54.852560 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae2d717-506b-4d8e-803b-689c1cc3557c-kube-api-access-s2v9q" (OuterVolumeSpecName: "kube-api-access-s2v9q") pod "5ae2d717-506b-4d8e-803b-689c1cc3557c" (UID: "5ae2d717-506b-4d8e-803b-689c1cc3557c"). InnerVolumeSpecName "kube-api-access-s2v9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:57:54.950339 master-0 kubenswrapper[7642]: I0319 11:57:54.950165 7642 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5ae2d717-506b-4d8e-803b-689c1cc3557c-webhook-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:54.950339 master-0 kubenswrapper[7642]: I0319 11:57:54.950221 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2v9q\" (UniqueName: \"kubernetes.io/projected/5ae2d717-506b-4d8e-803b-689c1cc3557c-kube-api-access-s2v9q\") on node \"master-0\" DevicePath \"\""
Mar 19 11:57:55.343710 master-0 kubenswrapper[7642]: I0319 11:57:55.343525 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-vt498_162b513f-2f67-4946-816b-48f6232dfca0/authentication-operator/0.log"
Mar 19 11:57:55.541980 master-0 kubenswrapper[7642]: I0319 11:57:55.541934 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-vt498_162b513f-2f67-4946-816b-48f6232dfca0/authentication-operator/1.log"
Mar 19 11:57:55.674817 master-0 kubenswrapper[7642]: I0319 11:57:55.674704 7642 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"
Mar 19 11:57:55.723923 master-0 kubenswrapper[7642]: I0319 11:57:55.722989 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"]
Mar 19 11:57:55.725376 master-0 kubenswrapper[7642]: I0319 11:57:55.725348 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-5dbbb8b86f-z7wgn"]
Mar 19 11:57:56.085915 master-0 kubenswrapper[7642]: I0319 11:57:56.085718 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae2d717-506b-4d8e-803b-689c1cc3557c" path="/var/lib/kubelet/pods/5ae2d717-506b-4d8e-803b-689c1cc3557c/volumes"
Mar 19 11:57:56.141036 master-0 kubenswrapper[7642]: I0319 11:57:56.140950 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-5c7d7c6bb8-gnz2z_66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/oauth-apiserver/0.log"
Mar 19 11:57:56.335670 master-0 kubenswrapper[7642]: I0319 11:57:56.335557 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-5c7d7c6bb8-gnz2z_66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/fix-audit-permissions/0.log"
Mar 19 11:57:56.545055 master-0 kubenswrapper[7642]: I0319 11:57:56.544987 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-5c7d7c6bb8-gnz2z_66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/oauth-apiserver/1.log"
Mar 19 11:57:56.740931 master-0 kubenswrapper[7642]: I0319 11:57:56.740857 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-bj75g_aab47a39-a18b-4350-b141-4d5de2d8e96f/etcd-operator/0.log"
Mar 19 11:57:57.085090 master-0 kubenswrapper[7642]: I0319 11:57:57.084959 7642 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-bj75g_aab47a39-a18b-4350-b141-4d5de2d8e96f/etcd-operator/1.log"
Mar 19 11:57:57.140420 master-0 kubenswrapper[7642]: I0319 11:57:57.140352 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_01529a13-e1b1-40a1-9fff-8a20f1cba68c/installer/0.log"
Mar 19 11:57:57.342319 master-0 kubenswrapper[7642]: I0319 11:57:57.342147 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-4tqrh_367ea4e8-8172-4730-8f26-499ed7c167bb/kube-apiserver-operator/0.log"
Mar 19 11:57:57.536945 master-0 kubenswrapper[7642]: I0319 11:57:57.536873 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-8b68b9d9b-4tqrh_367ea4e8-8172-4730-8f26-499ed7c167bb/kube-apiserver-operator/1.log"
Mar 19 11:57:57.737010 master-0 kubenswrapper[7642]: I0319 11:57:57.736939 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_49fac1b46a11e49501805e891baae4a9/setup/0.log"
Mar 19 11:57:57.945529 master-0 kubenswrapper[7642]: I0319 11:57:57.945470 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_49fac1b46a11e49501805e891baae4a9/kube-apiserver/0.log"
Mar 19 11:57:58.135289 master-0 kubenswrapper[7642]: I0319 11:57:58.135092 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_49fac1b46a11e49501805e891baae4a9/kube-apiserver-insecure-readyz/0.log"
Mar 19 11:57:58.337774 master-0 kubenswrapper[7642]: I0319 11:57:58.337700 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_eb4c5b33-58c8-4b51-b71b-1befc165f2b1/installer/0.log"
Mar 19 11:57:58.537938 master-0 kubenswrapper[7642]: I0319
11:57:58.537818 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_de6765b7-3576-4171-972b-20d62d81f25d/installer/0.log"
Mar 19 11:57:58.740813 master-0 kubenswrapper[7642]: I0319 11:57:58.740751 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_18c0e2ac-b5c3-447d-96b4-37d7d0d733ec/installer/0.log"
Mar 19 11:57:58.935610 master-0 kubenswrapper[7642]: I0319 11:57:58.935550 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/kube-controller-manager/0.log"
Mar 19 11:57:59.138909 master-0 kubenswrapper[7642]: I0319 11:57:59.138845 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/cluster-policy-controller/0.log"
Mar 19 11:57:59.336749 master-0 kubenswrapper[7642]: I0319 11:57:59.336596 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/kube-controller-manager-cert-syncer/0.log"
Mar 19 11:57:59.535121 master-0 kubenswrapper[7642]: I0319 11:57:59.535058 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/kube-controller-manager-recovery-controller/0.log"
Mar 19 11:57:59.737212 master-0 kubenswrapper[7642]: I0319 11:57:59.737123 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-8xkv5_ac946049-4b93-4d8c-bfa6-eacf83160ed1/kube-controller-manager-operator/1.log"
Mar 19 11:57:59.939507 master-0 kubenswrapper[7642]: I0319 11:57:59.939425 7642 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-8xkv5_ac946049-4b93-4d8c-bfa6-eacf83160ed1/kube-controller-manager-operator/2.log"
Mar 19 11:58:00.836900 master-0 kubenswrapper[7642]: I0319 11:58:00.836793 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 11:58:00.836900 master-0 kubenswrapper[7642]: I0319 11:58:00.836898 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 11:58:00.836900 master-0 kubenswrapper[7642]: I0319 11:58:00.836920 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 11:58:00.837710 master-0 kubenswrapper[7642]: I0319 11:58:00.836938 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 11:58:00.837710 master-0 kubenswrapper[7642]: I0319 11:58:00.837343 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 11:58:00.837710 master-0 kubenswrapper[7642]: I0319 11:58:00.837449 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 11:58:00.841163 master-0 kubenswrapper[7642]: I0319 11:58:00.841119 7642 kubelet.go:2542] "SyncLoop (probe)"
probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 11:58:00.939329 master-0 kubenswrapper[7642]: I0319 11:58:00.939256 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_c0d8ad30-feb0-453f-b652-36aafc7b24f9/installer/0.log"
Mar 19 11:58:01.137123 master-0 kubenswrapper[7642]: I0319 11:58:01.136931 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/wait-for-host-port/0.log"
Mar 19 11:58:01.341307 master-0 kubenswrapper[7642]: I0319 11:58:01.341073 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler/0.log"
Mar 19 11:58:01.537672 master-0 kubenswrapper[7642]: I0319 11:58:01.537584 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/0.log"
Mar 19 11:58:01.720795 master-0 kubenswrapper[7642]: I0319 11:58:01.720687 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 11:58:01.738851 master-0 kubenswrapper[7642]: I0319 11:58:01.738362 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-recovery-controller/0.log"
Mar 19 11:58:01.939689 master-0 kubenswrapper[7642]: I0319 11:58:01.939641 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-dddff6458-hgwgj_bea42722-d7af-4938-9f3a-da57bd056f96/kube-scheduler-operator-container/0.log"
Mar 19 11:58:02.137128 master-0 kubenswrapper[7642]: I0319 11:58:02.137074 7642
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-dddff6458-hgwgj_bea42722-d7af-4938-9f3a-da57bd056f96/kube-scheduler-operator-container/1.log"
Mar 19 11:58:03.053160 master-0 kubenswrapper[7642]: I0319 11:58:03.053104 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-ptjr9_84e5577e-df8b-46a2-aff7-0bf90b5010d9/openshift-apiserver-operator/1.log"
Mar 19 11:58:03.440034 master-0 kubenswrapper[7642]: I0319 11:58:03.439990 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-ptjr9_84e5577e-df8b-46a2-aff7-0bf90b5010d9/openshift-apiserver-operator/2.log"
Mar 19 11:58:03.452316 master-0 kubenswrapper[7642]: I0319 11:58:03.452278 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-66dcf44d9d-46w4r_7af28c70-3ee0-438b-8655-95bc57deaa05/fix-audit-permissions/0.log"
Mar 19 11:58:03.461738 master-0 kubenswrapper[7642]: I0319 11:58:03.461703 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-66dcf44d9d-46w4r_7af28c70-3ee0-438b-8655-95bc57deaa05/openshift-apiserver/0.log"
Mar 19 11:58:03.471273 master-0 kubenswrapper[7642]: I0319 11:58:03.471227 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-66dcf44d9d-46w4r_7af28c70-3ee0-438b-8655-95bc57deaa05/openshift-apiserver-check-endpoints/0.log"
Mar 19 11:58:03.480191 master-0 kubenswrapper[7642]: I0319 11:58:03.480136 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-bj75g_aab47a39-a18b-4350-b141-4d5de2d8e96f/etcd-operator/0.log"
Mar 19 11:58:03.736301 master-0 kubenswrapper[7642]: I0319 11:58:03.736171 7642 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-etcd-operator_etcd-operator-8544cbcf9c-bj75g_aab47a39-a18b-4350-b141-4d5de2d8e96f/etcd-operator/1.log"
Mar 19 11:58:03.752485 master-0 kubenswrapper[7642]: I0319 11:58:03.752429 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-68f85b4d6c-qldxf_6e8c4676-05af-4478-a44b-1bdcbefdea0f/catalog-operator/0.log"
Mar 19 11:58:03.938094 master-0 kubenswrapper[7642]: I0319 11:58:03.938052 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-5c9796789-jsp5b_5b46c6b4-f701-4a7f-bf74-90afe14ad89a/olm-operator/0.log"
Mar 19 11:58:04.249143 master-0 kubenswrapper[7642]: I0319 11:58:04.249079 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 11:58:04.483481 master-0 kubenswrapper[7642]: I0319 11:58:04.483412 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-ftz5b_94fa5656-0561-45ab-9ab1-a6b96b8a47d4/kube-rbac-proxy/0.log"
Mar 19 11:58:05.439405 master-0 kubenswrapper[7642]: I0319 11:58:05.439344 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-ftz5b_94fa5656-0561-45ab-9ab1-a6b96b8a47d4/package-server-manager/0.log"
Mar 19 11:58:05.823774 master-0 kubenswrapper[7642]: I0319 11:58:05.823515 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-86c8b5dddf-j95tg_b69a314d-4884-4f2c-a48e-1cf68c5aaef4/packageserver/0.log"
Mar 19 11:58:10.837052 master-0 kubenswrapper[7642]: I0319 11:58:10.836906 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get
\"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 11:58:10.837052 master-0 kubenswrapper[7642]: I0319 11:58:10.837041 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 11:58:20.837276 master-0 kubenswrapper[7642]: I0319 11:58:20.837164 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 19 11:58:20.837276 master-0 kubenswrapper[7642]: I0319 11:58:20.837233 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 19 11:58:20.838488 master-0 kubenswrapper[7642]: I0319 11:58:20.837300 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 11:58:20.838488 master-0 kubenswrapper[7642]: I0319 11:58:20.838085 7642 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"9b8698d443a661dc8c25e79e56fd0a5be94da1dd2f3e5548ce87e282a4e8855f"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will
be restarted"
Mar 19 11:58:20.838488 master-0 kubenswrapper[7642]: I0319 11:58:20.838219 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager" containerID="cri-o://9b8698d443a661dc8c25e79e56fd0a5be94da1dd2f3e5548ce87e282a4e8855f" gracePeriod=30
Mar 19 11:58:22.098542 master-0 kubenswrapper[7642]: I0319 11:58:22.098446 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"]
Mar 19 11:58:32.107285 master-0 kubenswrapper[7642]: I0319 11:58:32.107180 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=10.107158019 podStartE2EDuration="10.107158019s" podCreationTimestamp="2026-03-19 11:58:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:58:32.104018627 +0000 UTC m=+270.221459444" watchObservedRunningTime="2026-03-19 11:58:32.107158019 +0000 UTC m=+270.224598806"
Mar 19 11:58:51.033431 master-0 kubenswrapper[7642]: I0319 11:58:51.033380 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/kube-controller-manager/0.log"
Mar 19 11:58:51.033431 master-0 kubenswrapper[7642]: I0319 11:58:51.033434 7642 generic.go:334] "Generic (PLEG): container finished" podID="2ed5fb754bce5477588bd28f79b7bee7" containerID="9b8698d443a661dc8c25e79e56fd0a5be94da1dd2f3e5548ce87e282a4e8855f" exitCode=137
Mar 19 11:58:51.034129 master-0 kubenswrapper[7642]: I0319 11:58:51.033464 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
event={"ID":"2ed5fb754bce5477588bd28f79b7bee7","Type":"ContainerDied","Data":"9b8698d443a661dc8c25e79e56fd0a5be94da1dd2f3e5548ce87e282a4e8855f"} Mar 19 11:58:52.043297 master-0 kubenswrapper[7642]: I0319 11:58:52.043221 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/kube-controller-manager/0.log" Mar 19 11:58:52.044092 master-0 kubenswrapper[7642]: I0319 11:58:52.043310 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"2ed5fb754bce5477588bd28f79b7bee7","Type":"ContainerStarted","Data":"0e6c2dae4c28b6bf9d3115747fe713bb797e4eb746fc209c257ace61169519c1"} Mar 19 11:59:00.836336 master-0 kubenswrapper[7642]: I0319 11:59:00.836246 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:59:00.837199 master-0 kubenswrapper[7642]: I0319 11:59:00.836853 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:59:00.841141 master-0 kubenswrapper[7642]: I0319 11:59:00.841082 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:59:01.105710 master-0 kubenswrapper[7642]: I0319 11:59:01.105529 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 11:59:05.125984 master-0 kubenswrapper[7642]: I0319 11:59:05.125872 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/1.log" Mar 19 11:59:05.126789 master-0 kubenswrapper[7642]: I0319 11:59:05.126758 7642 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/0.log" Mar 19 11:59:05.126846 master-0 kubenswrapper[7642]: I0319 11:59:05.126815 7642 generic.go:334] "Generic (PLEG): container finished" podID="f38df702-a256-4f83-90bb-0c0e477a2ce6" containerID="7316631a0f45a2c1ef1105021be26127f5494db22ff552f0eb4f6711db06819f" exitCode=1 Mar 19 11:59:05.126880 master-0 kubenswrapper[7642]: I0319 11:59:05.126851 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerDied","Data":"7316631a0f45a2c1ef1105021be26127f5494db22ff552f0eb4f6711db06819f"} Mar 19 11:59:05.126913 master-0 kubenswrapper[7642]: I0319 11:59:05.126903 7642 scope.go:117] "RemoveContainer" containerID="a33c34d1a46884ff2764e0114249d4b1bcbad80e05aa9b1072bd5b5574041486" Mar 19 11:59:05.127399 master-0 kubenswrapper[7642]: I0319 11:59:05.127359 7642 scope.go:117] "RemoveContainer" containerID="7316631a0f45a2c1ef1105021be26127f5494db22ff552f0eb4f6711db06819f" Mar 19 11:59:05.127612 master-0 kubenswrapper[7642]: E0319 11:59:05.127578 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-59ptg_openshift-ingress-operator(f38df702-a256-4f83-90bb-0c0e477a2ce6)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6" Mar 19 11:59:06.134805 master-0 kubenswrapper[7642]: I0319 11:59:06.134709 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/1.log" Mar 19 11:59:06.248760 master-0 kubenswrapper[7642]: I0319 11:59:06.248720 
7642 scope.go:117] "RemoveContainer" containerID="1dadbe52e08f9953877bf3adbfada05dc57d5eba9ca59d9f0aac897d40194638" Mar 19 11:59:06.266936 master-0 kubenswrapper[7642]: I0319 11:59:06.266878 7642 scope.go:117] "RemoveContainer" containerID="f2ca0a192f7dfe6f6eb1479534201a334d6844184e4ca66097e40894d0ccce20" Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: I0319 11:59:11.657529 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j"] Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: E0319 11:59:11.657889 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" containerName="installer" Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: I0319 11:59:11.657905 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" containerName="installer" Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: E0319 11:59:11.657922 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae2d717-506b-4d8e-803b-689c1cc3557c" containerName="multus-admission-controller" Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: I0319 11:59:11.657931 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae2d717-506b-4d8e-803b-689c1cc3557c" containerName="multus-admission-controller" Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: E0319 11:59:11.657945 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae2d717-506b-4d8e-803b-689c1cc3557c" containerName="kube-rbac-proxy" Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: I0319 11:59:11.657952 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae2d717-506b-4d8e-803b-689c1cc3557c" containerName="kube-rbac-proxy" Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: I0319 11:59:11.658064 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" 
containerName="installer" Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: I0319 11:59:11.658081 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae2d717-506b-4d8e-803b-689c1cc3557c" containerName="multus-admission-controller" Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: I0319 11:59:11.658094 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae2d717-506b-4d8e-803b-689c1cc3557c" containerName="kube-rbac-proxy" Mar 19 11:59:11.659293 master-0 kubenswrapper[7642]: I0319 11:59:11.659076 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:11.665089 master-0 kubenswrapper[7642]: I0319 11:59:11.662511 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 11:59:11.665089 master-0 kubenswrapper[7642]: I0319 11:59:11.662730 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-56gmj" Mar 19 11:59:11.676516 master-0 kubenswrapper[7642]: I0319 11:59:11.676157 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j"] Mar 19 11:59:11.723449 master-0 kubenswrapper[7642]: I0319 11:59:11.722771 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:11.723449 master-0 kubenswrapper[7642]: I0319 11:59:11.722872 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pxt2h\" (UniqueName: \"kubernetes.io/projected/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-kube-api-access-pxt2h\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:11.723449 master-0 kubenswrapper[7642]: I0319 11:59:11.722898 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:11.823827 master-0 kubenswrapper[7642]: I0319 11:59:11.823748 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:11.824104 master-0 kubenswrapper[7642]: I0319 11:59:11.824046 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxt2h\" (UniqueName: \"kubernetes.io/projected/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-kube-api-access-pxt2h\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:11.824192 master-0 kubenswrapper[7642]: I0319 11:59:11.824162 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-proxy-tls\") pod 
\"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:11.824879 master-0 kubenswrapper[7642]: I0319 11:59:11.824846 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:11.827415 master-0 kubenswrapper[7642]: I0319 11:59:11.827371 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:11.860602 master-0 kubenswrapper[7642]: I0319 11:59:11.860552 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxt2h\" (UniqueName: \"kubernetes.io/projected/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-kube-api-access-pxt2h\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:11.984723 master-0 kubenswrapper[7642]: I0319 11:59:11.984552 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 11:59:12.339178 master-0 kubenswrapper[7642]: I0319 11:59:12.338981 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j"] Mar 19 11:59:12.357975 master-0 kubenswrapper[7642]: W0319 11:59:12.357905 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7d0b6f_91bf_4e6c_a090_77705ddc8bf6.slice/crio-ce92bb2deaa819fea113e3e46f2d6ce13e1b77b5e6309af9c4cd6c384874e217 WatchSource:0}: Error finding container ce92bb2deaa819fea113e3e46f2d6ce13e1b77b5e6309af9c4cd6c384874e217: Status 404 returned error can't find the container with id ce92bb2deaa819fea113e3e46f2d6ce13e1b77b5e6309af9c4cd6c384874e217 Mar 19 11:59:12.643722 master-0 kubenswrapper[7642]: I0319 11:59:12.643673 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7dcf5569b5-47tqf"] Mar 19 11:59:12.644616 master-0 kubenswrapper[7642]: I0319 11:59:12.644591 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.647360 master-0 kubenswrapper[7642]: I0319 11:59:12.647322 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp"] Mar 19 11:59:12.648109 master-0 kubenswrapper[7642]: I0319 11:59:12.648083 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" Mar 19 11:59:12.648578 master-0 kubenswrapper[7642]: I0319 11:59:12.648554 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 11:59:12.648766 master-0 kubenswrapper[7642]: I0319 11:59:12.648740 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 11:59:12.648874 master-0 kubenswrapper[7642]: I0319 11:59:12.648855 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 11:59:12.649038 master-0 kubenswrapper[7642]: I0319 11:59:12.648985 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 11:59:12.649038 master-0 kubenswrapper[7642]: I0319 11:59:12.649010 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 11:59:12.649130 master-0 kubenswrapper[7642]: I0319 11:59:12.649096 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 11:59:12.654120 master-0 kubenswrapper[7642]: I0319 11:59:12.654051 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6"] Mar 19 11:59:12.655053 master-0 kubenswrapper[7642]: I0319 11:59:12.655019 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" Mar 19 11:59:12.656517 master-0 kubenswrapper[7642]: I0319 11:59:12.656443 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 11:59:12.666527 master-0 kubenswrapper[7642]: I0319 11:59:12.666473 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tzfl9"] Mar 19 11:59:12.667798 master-0 kubenswrapper[7642]: I0319 11:59:12.667765 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tzfl9" Mar 19 11:59:12.670828 master-0 kubenswrapper[7642]: I0319 11:59:12.670790 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 11:59:12.671078 master-0 kubenswrapper[7642]: I0319 11:59:12.671056 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-kl9xx" Mar 19 11:59:12.672504 master-0 kubenswrapper[7642]: I0319 11:59:12.671319 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 11:59:12.672504 master-0 kubenswrapper[7642]: I0319 11:59:12.671496 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 11:59:12.672504 master-0 kubenswrapper[7642]: I0319 11:59:12.672373 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp"] Mar 19 11:59:12.675430 master-0 kubenswrapper[7642]: I0319 11:59:12.675389 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6"] Mar 19 11:59:12.714704 master-0 kubenswrapper[7642]: I0319 11:59:12.712158 7642 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tzfl9"] Mar 19 11:59:12.736226 master-0 kubenswrapper[7642]: I0319 11:59:12.736154 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-default-certificate\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.736226 master-0 kubenswrapper[7642]: I0319 11:59:12.736235 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46108693-3c9e-4804-beb0-d6b8f8df7625-service-ca-bundle\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.736580 master-0 kubenswrapper[7642]: I0319 11:59:12.736258 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-stats-auth\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.736580 master-0 kubenswrapper[7642]: I0319 11:59:12.736284 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-metrics-certs\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.736580 master-0 kubenswrapper[7642]: I0319 11:59:12.736362 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wgsm4\" (UniqueName: \"kubernetes.io/projected/46108693-3c9e-4804-beb0-d6b8f8df7625-kube-api-access-wgsm4\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.837570 master-0 kubenswrapper[7642]: I0319 11:59:12.837511 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57d19c68-b469-4a6f-abe0-3a0eff32639c-cert\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9" Mar 19 11:59:12.837811 master-0 kubenswrapper[7642]: I0319 11:59:12.837595 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-metrics-certs\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.837811 master-0 kubenswrapper[7642]: I0319 11:59:12.837692 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgsm4\" (UniqueName: \"kubernetes.io/projected/46108693-3c9e-4804-beb0-d6b8f8df7625-kube-api-access-wgsm4\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.840687 master-0 kubenswrapper[7642]: I0319 11:59:12.838259 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8qjv\" (UniqueName: \"kubernetes.io/projected/57d19c68-b469-4a6f-abe0-3a0eff32639c-kube-api-access-v8qjv\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9" Mar 19 11:59:12.840687 master-0 kubenswrapper[7642]: I0319 
11:59:12.838329 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jstrq\" (UniqueName: \"kubernetes.io/projected/f93ee7db-f0ed-470f-a458-86c54e6d064a-kube-api-access-jstrq\") pod \"network-check-source-b4bf74f6-5jjzp\" (UID: \"f93ee7db-f0ed-470f-a458-86c54e6d064a\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" Mar 19 11:59:12.840687 master-0 kubenswrapper[7642]: I0319 11:59:12.838357 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-default-certificate\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.840687 master-0 kubenswrapper[7642]: I0319 11:59:12.839146 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46108693-3c9e-4804-beb0-d6b8f8df7625-service-ca-bundle\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.840687 master-0 kubenswrapper[7642]: I0319 11:59:12.839277 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-stats-auth\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.840687 master-0 kubenswrapper[7642]: I0319 11:59:12.839333 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8117f6b0-0b9f-4a1d-9807-bf73c19f388b-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-69c6b55594-52ww6\" (UID: \"8117f6b0-0b9f-4a1d-9807-bf73c19f388b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" Mar 19 11:59:12.840687 master-0 kubenswrapper[7642]: I0319 11:59:12.840314 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46108693-3c9e-4804-beb0-d6b8f8df7625-service-ca-bundle\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.846443 master-0 kubenswrapper[7642]: I0319 11:59:12.846355 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-stats-auth\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.846443 master-0 kubenswrapper[7642]: I0319 11:59:12.846410 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-metrics-certs\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.850021 master-0 kubenswrapper[7642]: I0319 11:59:12.849949 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-default-certificate\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.854176 master-0 kubenswrapper[7642]: I0319 11:59:12.854139 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgsm4\" 
(UniqueName: \"kubernetes.io/projected/46108693-3c9e-4804-beb0-d6b8f8df7625-kube-api-access-wgsm4\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.942560 master-0 kubenswrapper[7642]: I0319 11:59:12.941510 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jstrq\" (UniqueName: \"kubernetes.io/projected/f93ee7db-f0ed-470f-a458-86c54e6d064a-kube-api-access-jstrq\") pod \"network-check-source-b4bf74f6-5jjzp\" (UID: \"f93ee7db-f0ed-470f-a458-86c54e6d064a\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" Mar 19 11:59:12.943094 master-0 kubenswrapper[7642]: I0319 11:59:12.943039 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8117f6b0-0b9f-4a1d-9807-bf73c19f388b-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-52ww6\" (UID: \"8117f6b0-0b9f-4a1d-9807-bf73c19f388b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" Mar 19 11:59:12.943165 master-0 kubenswrapper[7642]: I0319 11:59:12.943140 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57d19c68-b469-4a6f-abe0-3a0eff32639c-cert\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9" Mar 19 11:59:12.943325 master-0 kubenswrapper[7642]: I0319 11:59:12.943296 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8qjv\" (UniqueName: \"kubernetes.io/projected/57d19c68-b469-4a6f-abe0-3a0eff32639c-kube-api-access-v8qjv\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9" Mar 19 11:59:12.948588 master-0 
kubenswrapper[7642]: I0319 11:59:12.948533 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57d19c68-b469-4a6f-abe0-3a0eff32639c-cert\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9" Mar 19 11:59:12.950412 master-0 kubenswrapper[7642]: I0319 11:59:12.950380 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8117f6b0-0b9f-4a1d-9807-bf73c19f388b-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-52ww6\" (UID: \"8117f6b0-0b9f-4a1d-9807-bf73c19f388b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" Mar 19 11:59:12.958573 master-0 kubenswrapper[7642]: I0319 11:59:12.958525 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstrq\" (UniqueName: \"kubernetes.io/projected/f93ee7db-f0ed-470f-a458-86c54e6d064a-kube-api-access-jstrq\") pod \"network-check-source-b4bf74f6-5jjzp\" (UID: \"f93ee7db-f0ed-470f-a458-86c54e6d064a\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" Mar 19 11:59:12.959332 master-0 kubenswrapper[7642]: I0319 11:59:12.959302 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8qjv\" (UniqueName: \"kubernetes.io/projected/57d19c68-b469-4a6f-abe0-3a0eff32639c-kube-api-access-v8qjv\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9" Mar 19 11:59:12.968601 master-0 kubenswrapper[7642]: I0319 11:59:12.968556 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 11:59:12.982222 master-0 kubenswrapper[7642]: I0319 11:59:12.982173 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" Mar 19 11:59:12.984474 master-0 kubenswrapper[7642]: W0319 11:59:12.984398 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46108693_3c9e_4804_beb0_d6b8f8df7625.slice/crio-e3d494447990668745e14695bfe1a5e74ab5c046360e769658a99bdf18dabcb5 WatchSource:0}: Error finding container e3d494447990668745e14695bfe1a5e74ab5c046360e769658a99bdf18dabcb5: Status 404 returned error can't find the container with id e3d494447990668745e14695bfe1a5e74ab5c046360e769658a99bdf18dabcb5 Mar 19 11:59:12.989872 master-0 kubenswrapper[7642]: I0319 11:59:12.989845 7642 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:59:12.996833 master-0 kubenswrapper[7642]: I0319 11:59:12.996611 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" Mar 19 11:59:13.016999 master-0 kubenswrapper[7642]: I0319 11:59:13.016454 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tzfl9"
Mar 19 11:59:13.186036 master-0 kubenswrapper[7642]: I0319 11:59:13.185940 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" event={"ID":"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6","Type":"ContainerStarted","Data":"ba3633214cc572239baa8cdfe50f5588ac58b20470e03aa47725744142808ca4"}
Mar 19 11:59:13.186277 master-0 kubenswrapper[7642]: I0319 11:59:13.186058 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" event={"ID":"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6","Type":"ContainerStarted","Data":"285371d8dc1391b2aefa34f9aa23ab25dae7de5c3d27157be1b42472a35ec621"}
Mar 19 11:59:13.186277 master-0 kubenswrapper[7642]: I0319 11:59:13.186083 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" event={"ID":"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6","Type":"ContainerStarted","Data":"ce92bb2deaa819fea113e3e46f2d6ce13e1b77b5e6309af9c4cd6c384874e217"}
Mar 19 11:59:13.187915 master-0 kubenswrapper[7642]: I0319 11:59:13.187854 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerStarted","Data":"e3d494447990668745e14695bfe1a5e74ab5c046360e769658a99bdf18dabcb5"}
Mar 19 11:59:13.208046 master-0 kubenswrapper[7642]: I0319 11:59:13.207950 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" podStartSLOduration=2.207926972 podStartE2EDuration="2.207926972s" podCreationTimestamp="2026-03-19 11:59:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:59:13.203911905 +0000 UTC m=+311.321352692" watchObservedRunningTime="2026-03-19 11:59:13.207926972 +0000 UTC m=+311.325367759"
Mar 19 11:59:13.458550 master-0 kubenswrapper[7642]: I0319 11:59:13.458492 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp"]
Mar 19 11:59:13.466893 master-0 kubenswrapper[7642]: W0319 11:59:13.466839 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93ee7db_f0ed_470f_a458_86c54e6d064a.slice/crio-e4eb5ff21caa542685bdf5f202cb97c48c846bd09976e5d64727ae00a75371c6 WatchSource:0}: Error finding container e4eb5ff21caa542685bdf5f202cb97c48c846bd09976e5d64727ae00a75371c6: Status 404 returned error can't find the container with id e4eb5ff21caa542685bdf5f202cb97c48c846bd09976e5d64727ae00a75371c6
Mar 19 11:59:13.540801 master-0 kubenswrapper[7642]: I0319 11:59:13.540731 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tzfl9"]
Mar 19 11:59:13.555515 master-0 kubenswrapper[7642]: W0319 11:59:13.555452 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57d19c68_b469_4a6f_abe0_3a0eff32639c.slice/crio-40169c23a8e39dfd514592b517b187a7da1bb95cbece39d06e03420e9e9f21b3 WatchSource:0}: Error finding container 40169c23a8e39dfd514592b517b187a7da1bb95cbece39d06e03420e9e9f21b3: Status 404 returned error can't find the container with id 40169c23a8e39dfd514592b517b187a7da1bb95cbece39d06e03420e9e9f21b3
Mar 19 11:59:13.558118 master-0 kubenswrapper[7642]: I0319 11:59:13.557933 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6"]
Mar 19 11:59:13.563792 master-0 kubenswrapper[7642]: W0319 11:59:13.563697 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8117f6b0_0b9f_4a1d_9807_bf73c19f388b.slice/crio-444bcf401c29be212027c38d60d3155ce14ee9b37c07f012dc81ebccfc8c5a7d WatchSource:0}: Error finding container 444bcf401c29be212027c38d60d3155ce14ee9b37c07f012dc81ebccfc8c5a7d: Status 404 returned error can't find the container with id 444bcf401c29be212027c38d60d3155ce14ee9b37c07f012dc81ebccfc8c5a7d
Mar 19 11:59:14.167707 master-0 kubenswrapper[7642]: I0319 11:59:14.167619 7642 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 11:59:14.205341 master-0 kubenswrapper[7642]: I0319 11:59:14.205219 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" event={"ID":"8117f6b0-0b9f-4a1d-9807-bf73c19f388b","Type":"ContainerStarted","Data":"444bcf401c29be212027c38d60d3155ce14ee9b37c07f012dc81ebccfc8c5a7d"}
Mar 19 11:59:14.208677 master-0 kubenswrapper[7642]: I0319 11:59:14.207527 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" event={"ID":"f93ee7db-f0ed-470f-a458-86c54e6d064a","Type":"ContainerStarted","Data":"d568f722cd6e7340e98eb715f4b1221131491d92ea5892b562e44ef11c0035d7"}
Mar 19 11:59:14.208677 master-0 kubenswrapper[7642]: I0319 11:59:14.207623 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" event={"ID":"f93ee7db-f0ed-470f-a458-86c54e6d064a","Type":"ContainerStarted","Data":"e4eb5ff21caa542685bdf5f202cb97c48c846bd09976e5d64727ae00a75371c6"}
Mar 19 11:59:14.214365 master-0 kubenswrapper[7642]: I0319 11:59:14.214274 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tzfl9" event={"ID":"57d19c68-b469-4a6f-abe0-3a0eff32639c","Type":"ContainerStarted","Data":"e933a6546f56cf1635a68ba4de166dde3fd5d42b041b6441ff790cd2b34fc43c"}
Mar 19 11:59:14.214365 master-0 kubenswrapper[7642]: I0319 11:59:14.214333 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tzfl9" event={"ID":"57d19c68-b469-4a6f-abe0-3a0eff32639c","Type":"ContainerStarted","Data":"40169c23a8e39dfd514592b517b187a7da1bb95cbece39d06e03420e9e9f21b3"}
Mar 19 11:59:14.245241 master-0 kubenswrapper[7642]: I0319 11:59:14.245151 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" podStartSLOduration=367.24512548 podStartE2EDuration="6m7.24512548s" podCreationTimestamp="2026-03-19 11:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:59:14.242989328 +0000 UTC m=+312.360430115" watchObservedRunningTime="2026-03-19 11:59:14.24512548 +0000 UTC m=+312.362566267"
Mar 19 11:59:14.293958 master-0 kubenswrapper[7642]: I0319 11:59:14.293840 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tzfl9" podStartSLOduration=2.293807188 podStartE2EDuration="2.293807188s" podCreationTimestamp="2026-03-19 11:59:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:59:14.287884704 +0000 UTC m=+312.405325491" watchObservedRunningTime="2026-03-19 11:59:14.293807188 +0000 UTC m=+312.411247985"
Mar 19 11:59:16.237207 master-0 kubenswrapper[7642]: I0319 11:59:16.237040 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerStarted","Data":"12c2dd42c42f12d98f0581174f1fb6e51a5a2951700534c282b1aaa4cb3c5149"}
Mar 19 11:59:16.238957 master-0 kubenswrapper[7642]: I0319 11:59:16.238904 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" event={"ID":"8117f6b0-0b9f-4a1d-9807-bf73c19f388b","Type":"ContainerStarted","Data":"79954f84dfc2314904bbb3f98c0c06dea0bbe35428696cef252929fc38f3689e"}
Mar 19 11:59:16.239211 master-0 kubenswrapper[7642]: I0319 11:59:16.239180 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6"
Mar 19 11:59:16.243830 master-0 kubenswrapper[7642]: I0319 11:59:16.243779 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6"
Mar 19 11:59:16.263072 master-0 kubenswrapper[7642]: I0319 11:59:16.262973 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podStartSLOduration=257.280449979 podStartE2EDuration="4m20.262951746s" podCreationTimestamp="2026-03-19 11:54:56 +0000 UTC" firstStartedPulling="2026-03-19 11:59:12.989786055 +0000 UTC m=+311.107226842" lastFinishedPulling="2026-03-19 11:59:15.972287822 +0000 UTC m=+314.089728609" observedRunningTime="2026-03-19 11:59:16.2603415 +0000 UTC m=+314.377782317" watchObservedRunningTime="2026-03-19 11:59:16.262951746 +0000 UTC m=+314.380392533"
Mar 19 11:59:16.289698 master-0 kubenswrapper[7642]: I0319 11:59:16.283176 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" podStartSLOduration=258.883362222 podStartE2EDuration="4m21.283155489s" podCreationTimestamp="2026-03-19 11:54:55 +0000 UTC" firstStartedPulling="2026-03-19 11:59:13.565415167 +0000 UTC m=+311.682855944" lastFinishedPulling="2026-03-19 11:59:15.965208424 +0000 UTC m=+314.082649211" observedRunningTime="2026-03-19 11:59:16.277964017 +0000 UTC m=+314.395404804" watchObservedRunningTime="2026-03-19 11:59:16.283155489 +0000 UTC m=+314.400596276"
Mar 19 11:59:16.573317 master-0 kubenswrapper[7642]: I0319 11:59:16.573177 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-x66j8"]
Mar 19 11:59:16.574728 master-0 kubenswrapper[7642]: I0319 11:59:16.574707 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.581419 master-0 kubenswrapper[7642]: I0319 11:59:16.577236 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 11:59:16.581419 master-0 kubenswrapper[7642]: I0319 11:59:16.580266 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 11:59:16.581419 master-0 kubenswrapper[7642]: I0319 11:59:16.580331 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-f27d9"
Mar 19 11:59:16.705677 master-0 kubenswrapper[7642]: I0319 11:59:16.705585 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-node-bootstrap-token\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.705951 master-0 kubenswrapper[7642]: I0319 11:59:16.705694 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5z4r\" (UniqueName: \"kubernetes.io/projected/a49a8a5e-23fb-46fb-b184-e09da4817028-kube-api-access-k5z4r\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.705951 master-0 kubenswrapper[7642]: I0319 11:59:16.705737 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-certs\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.807375 master-0 kubenswrapper[7642]: I0319 11:59:16.807300 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-node-bootstrap-token\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.807602 master-0 kubenswrapper[7642]: I0319 11:59:16.807520 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5z4r\" (UniqueName: \"kubernetes.io/projected/a49a8a5e-23fb-46fb-b184-e09da4817028-kube-api-access-k5z4r\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.807602 master-0 kubenswrapper[7642]: I0319 11:59:16.807550 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-certs\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.811235 master-0 kubenswrapper[7642]: I0319 11:59:16.811178 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-certs\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.811731 master-0 kubenswrapper[7642]: I0319 11:59:16.811690 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-node-bootstrap-token\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.825155 master-0 kubenswrapper[7642]: I0319 11:59:16.825044 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5z4r\" (UniqueName: \"kubernetes.io/projected/a49a8a5e-23fb-46fb-b184-e09da4817028-kube-api-access-k5z4r\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.899164 master-0 kubenswrapper[7642]: I0319 11:59:16.899097 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 11:59:16.931053 master-0 kubenswrapper[7642]: W0319 11:59:16.930957 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda49a8a5e_23fb_46fb_b184_e09da4817028.slice/crio-a0c37fe14e6ead412d8c2590ada12be7a2da79d1d1f7b5edc5d8494f1a42a084 WatchSource:0}: Error finding container a0c37fe14e6ead412d8c2590ada12be7a2da79d1d1f7b5edc5d8494f1a42a084: Status 404 returned error can't find the container with id a0c37fe14e6ead412d8c2590ada12be7a2da79d1d1f7b5edc5d8494f1a42a084
Mar 19 11:59:16.969958 master-0 kubenswrapper[7642]: I0319 11:59:16.969886 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 11:59:16.972381 master-0 kubenswrapper[7642]: I0319 11:59:16.972314 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:16.972381 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:16.972381 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:16.972381 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:16.972601 master-0 kubenswrapper[7642]: I0319 11:59:16.972397 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:17.079405 master-0 kubenswrapper[7642]: I0319 11:59:17.079284 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"]
Mar 19 11:59:17.083812 master-0 kubenswrapper[7642]: I0319 11:59:17.083768 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.085620 master-0 kubenswrapper[7642]: I0319 11:59:17.085574 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 11:59:17.085785 master-0 kubenswrapper[7642]: I0319 11:59:17.085749 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-2dl2n"
Mar 19 11:59:17.086763 master-0 kubenswrapper[7642]: I0319 11:59:17.086732 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 19 11:59:17.088107 master-0 kubenswrapper[7642]: I0319 11:59:17.088076 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 19 11:59:17.097750 master-0 kubenswrapper[7642]: I0319 11:59:17.097674 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"]
Mar 19 11:59:17.216571 master-0 kubenswrapper[7642]: I0319 11:59:17.216518 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.216901 master-0 kubenswrapper[7642]: I0319 11:59:17.216878 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b92c4c09-a796-4e00-821a-32fc9c8ca214-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.217051 master-0 kubenswrapper[7642]: I0319 11:59:17.217030 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.217208 master-0 kubenswrapper[7642]: I0319 11:59:17.217189 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h6qg\" (UniqueName: \"kubernetes.io/projected/b92c4c09-a796-4e00-821a-32fc9c8ca214-kube-api-access-6h6qg\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.246056 master-0 kubenswrapper[7642]: I0319 11:59:17.245978 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x66j8" event={"ID":"a49a8a5e-23fb-46fb-b184-e09da4817028","Type":"ContainerStarted","Data":"a8c7d4a5bc868c6b8326143c35e5db20aa58db16a5dbc92a5b41ae595b41f4c0"}
Mar 19 11:59:17.246056 master-0 kubenswrapper[7642]: I0319 11:59:17.246039 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x66j8" event={"ID":"a49a8a5e-23fb-46fb-b184-e09da4817028","Type":"ContainerStarted","Data":"a0c37fe14e6ead412d8c2590ada12be7a2da79d1d1f7b5edc5d8494f1a42a084"}
Mar 19 11:59:17.262024 master-0 kubenswrapper[7642]: I0319 11:59:17.261943 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-x66j8" podStartSLOduration=1.261917613 podStartE2EDuration="1.261917613s" podCreationTimestamp="2026-03-19 11:59:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:59:17.261161251 +0000 UTC m=+315.378602038" watchObservedRunningTime="2026-03-19 11:59:17.261917613 +0000 UTC m=+315.379358410"
Mar 19 11:59:17.319050 master-0 kubenswrapper[7642]: I0319 11:59:17.318923 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h6qg\" (UniqueName: \"kubernetes.io/projected/b92c4c09-a796-4e00-821a-32fc9c8ca214-kube-api-access-6h6qg\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.319050 master-0 kubenswrapper[7642]: I0319 11:59:17.319013 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.319050 master-0 kubenswrapper[7642]: I0319 11:59:17.319051 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b92c4c09-a796-4e00-821a-32fc9c8ca214-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.319609 master-0 kubenswrapper[7642]: I0319 11:59:17.319545 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.320779 master-0 kubenswrapper[7642]: I0319 11:59:17.320702 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b92c4c09-a796-4e00-821a-32fc9c8ca214-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.323061 master-0 kubenswrapper[7642]: I0319 11:59:17.322700 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.328201 master-0 kubenswrapper[7642]: I0319 11:59:17.328168 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.340823 master-0 kubenswrapper[7642]: I0319 11:59:17.340713 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h6qg\" (UniqueName: \"kubernetes.io/projected/b92c4c09-a796-4e00-821a-32fc9c8ca214-kube-api-access-6h6qg\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.403925 master-0 kubenswrapper[7642]: I0319 11:59:17.403857 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 11:59:17.828790 master-0 kubenswrapper[7642]: I0319 11:59:17.828745 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"]
Mar 19 11:59:17.834452 master-0 kubenswrapper[7642]: W0319 11:59:17.834395 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb92c4c09_a796_4e00_821a_32fc9c8ca214.slice/crio-2eac945a64b009901141c6c6e11ad1de23d05d8f601554a1ef6d7d7a8ce9169c WatchSource:0}: Error finding container 2eac945a64b009901141c6c6e11ad1de23d05d8f601554a1ef6d7d7a8ce9169c: Status 404 returned error can't find the container with id 2eac945a64b009901141c6c6e11ad1de23d05d8f601554a1ef6d7d7a8ce9169c
Mar 19 11:59:17.971649 master-0 kubenswrapper[7642]: I0319 11:59:17.971596 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:17.971649 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:17.971649 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:17.971649 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:17.971936 master-0 kubenswrapper[7642]: I0319 11:59:17.971692 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:18.068289 master-0 kubenswrapper[7642]: I0319 11:59:18.068221 7642 scope.go:117] "RemoveContainer" containerID="7316631a0f45a2c1ef1105021be26127f5494db22ff552f0eb4f6711db06819f"
Mar 19 11:59:18.255115 master-0 kubenswrapper[7642]: I0319 11:59:18.255071 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/1.log"
Mar 19 11:59:18.256261 master-0 kubenswrapper[7642]: I0319 11:59:18.256235 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerStarted","Data":"39cbc32af69e30f6839456a476f8b8093ef24ef3e6922a50809c966eae3795ad"}
Mar 19 11:59:18.257620 master-0 kubenswrapper[7642]: I0319 11:59:18.257578 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" event={"ID":"b92c4c09-a796-4e00-821a-32fc9c8ca214","Type":"ContainerStarted","Data":"2eac945a64b009901141c6c6e11ad1de23d05d8f601554a1ef6d7d7a8ce9169c"}
Mar 19 11:59:18.971739 master-0 kubenswrapper[7642]: I0319 11:59:18.971677 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:18.971739 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:18.971739 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:18.971739 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:18.971739 master-0 kubenswrapper[7642]: I0319 11:59:18.971737 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:19.972596 master-0 kubenswrapper[7642]: I0319 11:59:19.972504 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:19.972596 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:19.972596 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:19.972596 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:19.973512 master-0 kubenswrapper[7642]: I0319 11:59:19.972658 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:20.971676 master-0 kubenswrapper[7642]: I0319 11:59:20.971580 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:20.971676 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:20.971676 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:20.971676 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:20.972107 master-0 kubenswrapper[7642]: I0319 11:59:20.971697 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:21.279569 master-0 kubenswrapper[7642]: I0319 11:59:21.279387 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" event={"ID":"b92c4c09-a796-4e00-821a-32fc9c8ca214","Type":"ContainerStarted","Data":"45c789365c80ca55d3226f6deecb6d6efc3f93acf4b17785efdd5266a6618954"}
Mar 19 11:59:21.279569 master-0 kubenswrapper[7642]: I0319 11:59:21.279450 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" event={"ID":"b92c4c09-a796-4e00-821a-32fc9c8ca214","Type":"ContainerStarted","Data":"6a323fe5657fde9c52d45dc374ed9dd1c7bbeca037d1a14be52c107d4d949809"}
Mar 19 11:59:21.305376 master-0 kubenswrapper[7642]: I0319 11:59:21.305263 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" podStartSLOduration=1.812040263 podStartE2EDuration="4.30523277s" podCreationTimestamp="2026-03-19 11:59:17 +0000 UTC" firstStartedPulling="2026-03-19 11:59:17.836905206 +0000 UTC m=+315.954345993" lastFinishedPulling="2026-03-19 11:59:20.330097713 +0000 UTC m=+318.447538500" observedRunningTime="2026-03-19 11:59:21.30148495 +0000 UTC m=+319.418925737" watchObservedRunningTime="2026-03-19 11:59:21.30523277 +0000 UTC m=+319.422673567"
Mar 19 11:59:21.972282 master-0 kubenswrapper[7642]: I0319 11:59:21.972167 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:21.972282 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:21.972282 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:21.972282 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:21.972971 master-0 kubenswrapper[7642]: I0319 11:59:21.972323 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:22.969402 master-0 kubenswrapper[7642]: I0319 11:59:22.969298 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 11:59:22.973850 master-0 kubenswrapper[7642]: I0319 11:59:22.973786 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:22.973850 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:22.973850 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:22.973850 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:22.974096 master-0 kubenswrapper[7642]: I0319 11:59:22.973894 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:23.436539 master-0 kubenswrapper[7642]: I0319 11:59:23.436465 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-d48mb"]
Mar 19 11:59:23.438007 master-0 kubenswrapper[7642]: I0319 11:59:23.437973 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 11:59:23.448878 master-0 kubenswrapper[7642]: I0319 11:59:23.448812 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 19 11:59:23.449200 master-0 kubenswrapper[7642]: I0319 11:59:23.449073 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-nghrc"
Mar 19 11:59:23.449200 master-0 kubenswrapper[7642]: I0319 11:59:23.449129 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 19 11:59:23.463921 master-0 kubenswrapper[7642]: I0319 11:59:23.463846 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj"]
Mar 19 11:59:23.470158 master-0 kubenswrapper[7642]: I0319 11:59:23.470107 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj"
Mar 19 11:59:23.472154 master-0 kubenswrapper[7642]: I0319 11:59:23.472093 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-nkt4p"
Mar 19 11:59:23.472414 master-0 kubenswrapper[7642]: I0319 11:59:23.472370 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 19 11:59:23.472533 master-0 kubenswrapper[7642]: I0319 11:59:23.472499 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 19 11:59:23.514542 master-0 kubenswrapper[7642]: I0319 11:59:23.508169 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj"]
Mar 19 11:59:23.550665 master-0 kubenswrapper[7642]: I0319 11:59:23.536839 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x"]
Mar 19 11:59:23.550665 master-0 kubenswrapper[7642]: I0319 11:59:23.549589 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x"]
Mar 19 11:59:23.550665 master-0 kubenswrapper[7642]: I0319 11:59:23.549794 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x"
Mar 19 11:59:23.583666 master-0 kubenswrapper[7642]: I0319 11:59:23.581105 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 19 11:59:23.583666 master-0 kubenswrapper[7642]: I0319 11:59:23.581434 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 19 11:59:23.583666 master-0 kubenswrapper[7642]: I0319 11:59:23.581526 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-wn2jn"
Mar 19 11:59:23.602224 master-0 kubenswrapper[7642]: I0319 11:59:23.591865 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.616646 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-sys\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.616726 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09282e98-3140-41c9-863f-67e0e9f942cc-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj"
Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.616777 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj"
Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.616810 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f2d1d03-7427-45a0-a917-173bf9df9902-metrics-client-ca\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.616948 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-root\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.616969 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-wtmp\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.617005 7642
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7cz7\" (UniqueName: \"kubernetes.io/projected/5f2d1d03-7427-45a0-a917-173bf9df9902-kube-api-access-z7cz7\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.617023 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.617045 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.617070 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.617093 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-textfile\") 
pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.623669 master-0 kubenswrapper[7642]: I0319 11:59:23.617114 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9hnq\" (UniqueName: \"kubernetes.io/projected/09282e98-3140-41c9-863f-67e0e9f942cc-kube-api-access-x9hnq\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.719017 master-0 kubenswrapper[7642]: I0319 11:59:23.718859 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.719238 master-0 kubenswrapper[7642]: I0319 11:59:23.719187 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f2d1d03-7427-45a0-a917-173bf9df9902-metrics-client-ca\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.719389 master-0 kubenswrapper[7642]: I0319 11:59:23.719359 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 
11:59:23.719460 master-0 kubenswrapper[7642]: I0319 11:59:23.719440 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-root\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.719542 master-0 kubenswrapper[7642]: I0319 11:59:23.719521 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-root\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.719618 master-0 kubenswrapper[7642]: I0319 11:59:23.719569 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-wtmp\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.719697 master-0 kubenswrapper[7642]: I0319 11:59:23.719651 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.719750 master-0 kubenswrapper[7642]: I0319 11:59:23.719713 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4ca74427-9df7-44ca-ae91-0162f402644b-volume-directive-shadow\") pod 
\"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.719750 master-0 kubenswrapper[7642]: I0319 11:59:23.719740 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.720251 master-0 kubenswrapper[7642]: I0319 11:59:23.719861 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7cz7\" (UniqueName: \"kubernetes.io/projected/5f2d1d03-7427-45a0-a917-173bf9df9902-kube-api-access-z7cz7\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.720251 master-0 kubenswrapper[7642]: I0319 11:59:23.719930 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.720251 master-0 kubenswrapper[7642]: I0319 11:59:23.719919 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-wtmp\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.720251 master-0 kubenswrapper[7642]: I0319 11:59:23.719956 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.720251 master-0 kubenswrapper[7642]: I0319 11:59:23.720063 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f2d1d03-7427-45a0-a917-173bf9df9902-metrics-client-ca\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.720251 master-0 kubenswrapper[7642]: E0319 11:59:23.720067 7642 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Mar 19 11:59:23.720251 master-0 kubenswrapper[7642]: I0319 11:59:23.720071 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.720251 master-0 kubenswrapper[7642]: E0319 11:59:23.720165 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls podName:5f2d1d03-7427-45a0-a917-173bf9df9902 nodeName:}" failed. No retries permitted until 2026-03-19 11:59:24.220142212 +0000 UTC m=+322.337582999 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls") pod "node-exporter-d48mb" (UID: "5f2d1d03-7427-45a0-a917-173bf9df9902") : secret "node-exporter-tls" not found Mar 19 11:59:23.720251 master-0 kubenswrapper[7642]: I0319 11:59:23.720210 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.720251 master-0 kubenswrapper[7642]: I0319 11:59:23.720265 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-textfile\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.720756 master-0 kubenswrapper[7642]: I0319 11:59:23.720315 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9hnq\" (UniqueName: \"kubernetes.io/projected/09282e98-3140-41c9-863f-67e0e9f942cc-kube-api-access-x9hnq\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.720756 master-0 kubenswrapper[7642]: I0319 11:59:23.720350 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69pp5\" (UniqueName: \"kubernetes.io/projected/4ca74427-9df7-44ca-ae91-0162f402644b-kube-api-access-69pp5\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " 
pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.720756 master-0 kubenswrapper[7642]: I0319 11:59:23.720439 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-sys\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.720756 master-0 kubenswrapper[7642]: I0319 11:59:23.720465 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09282e98-3140-41c9-863f-67e0e9f942cc-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.720756 master-0 kubenswrapper[7642]: I0319 11:59:23.720652 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-textfile\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.720971 master-0 kubenswrapper[7642]: I0319 11:59:23.720781 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-sys\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.724154 master-0 kubenswrapper[7642]: I0319 11:59:23.721454 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09282e98-3140-41c9-863f-67e0e9f942cc-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: 
\"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.724651 master-0 kubenswrapper[7642]: I0319 11:59:23.724582 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.729660 master-0 kubenswrapper[7642]: I0319 11:59:23.728489 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.733653 master-0 kubenswrapper[7642]: I0319 11:59:23.731083 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.744512 master-0 kubenswrapper[7642]: I0319 11:59:23.744441 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7cz7\" (UniqueName: \"kubernetes.io/projected/5f2d1d03-7427-45a0-a917-173bf9df9902-kube-api-access-z7cz7\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:23.749663 master-0 kubenswrapper[7642]: I0319 11:59:23.747161 7642 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x9hnq\" (UniqueName: \"kubernetes.io/projected/09282e98-3140-41c9-863f-67e0e9f942cc-kube-api-access-x9hnq\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.791209 master-0 kubenswrapper[7642]: I0319 11:59:23.791152 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 11:59:23.822046 master-0 kubenswrapper[7642]: I0319 11:59:23.821955 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.822155 master-0 kubenswrapper[7642]: I0319 11:59:23.822064 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.822155 master-0 kubenswrapper[7642]: I0319 11:59:23.822110 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4ca74427-9df7-44ca-ae91-0162f402644b-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.822155 
master-0 kubenswrapper[7642]: I0319 11:59:23.822147 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.822251 master-0 kubenswrapper[7642]: I0319 11:59:23.822206 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.822290 master-0 kubenswrapper[7642]: I0319 11:59:23.822257 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69pp5\" (UniqueName: \"kubernetes.io/projected/4ca74427-9df7-44ca-ae91-0162f402644b-kube-api-access-69pp5\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.823378 master-0 kubenswrapper[7642]: I0319 11:59:23.823343 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4ca74427-9df7-44ca-ae91-0162f402644b-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.825139 master-0 kubenswrapper[7642]: I0319 11:59:23.825074 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-metrics-client-ca\") pod 
\"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.826082 master-0 kubenswrapper[7642]: I0319 11:59:23.826054 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.828347 master-0 kubenswrapper[7642]: I0319 11:59:23.828300 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.829370 master-0 kubenswrapper[7642]: I0319 11:59:23.829295 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.848284 master-0 kubenswrapper[7642]: I0319 11:59:23.848218 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69pp5\" (UniqueName: \"kubernetes.io/projected/4ca74427-9df7-44ca-ae91-0162f402644b-kube-api-access-69pp5\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.948918 
master-0 kubenswrapper[7642]: I0319 11:59:23.948385 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 11:59:23.971251 master-0 kubenswrapper[7642]: I0319 11:59:23.971130 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:23.971251 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:23.971251 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:23.971251 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:23.971251 master-0 kubenswrapper[7642]: I0319 11:59:23.971227 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:24.232335 master-0 kubenswrapper[7642]: I0319 11:59:24.232190 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 11:59:24.232864 master-0 kubenswrapper[7642]: E0319 11:59:24.232715 7642 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Mar 19 11:59:24.232864 master-0 kubenswrapper[7642]: E0319 11:59:24.232837 7642 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls podName:5f2d1d03-7427-45a0-a917-173bf9df9902 nodeName:}" failed. 
No retries permitted until 2026-03-19 11:59:25.232812497 +0000 UTC m=+323.350253284 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls") pod "node-exporter-d48mb" (UID: "5f2d1d03-7427-45a0-a917-173bf9df9902") : secret "node-exporter-tls" not found Mar 19 11:59:24.522807 master-0 kubenswrapper[7642]: I0319 11:59:24.522329 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj"] Mar 19 11:59:24.841999 master-0 kubenswrapper[7642]: I0319 11:59:24.841936 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x"] Mar 19 11:59:24.849940 master-0 kubenswrapper[7642]: W0319 11:59:24.849881 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ca74427_9df7_44ca_ae91_0162f402644b.slice/crio-9f7852cce265f4731f8e5a50d5da937e705ba1159369968578b0f921316baf94 WatchSource:0}: Error finding container 9f7852cce265f4731f8e5a50d5da937e705ba1159369968578b0f921316baf94: Status 404 returned error can't find the container with id 9f7852cce265f4731f8e5a50d5da937e705ba1159369968578b0f921316baf94 Mar 19 11:59:24.971822 master-0 kubenswrapper[7642]: I0319 11:59:24.971754 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:24.971822 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:24.971822 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:24.971822 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:24.972457 master-0 kubenswrapper[7642]: I0319 11:59:24.971841 7642 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:25.248013 master-0 kubenswrapper[7642]: I0319 11:59:25.247919 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 11:59:25.252093 master-0 kubenswrapper[7642]: I0319 11:59:25.252024 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 11:59:25.266914 master-0 kubenswrapper[7642]: I0319 11:59:25.266861 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 11:59:25.293944 master-0 kubenswrapper[7642]: W0319 11:59:25.293856 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f2d1d03_7427_45a0_a917_173bf9df9902.slice/crio-64f8ff5be7bb724395a941290a38553d4d3a994e6e3322cf64e60c9441115924 WatchSource:0}: Error finding container 64f8ff5be7bb724395a941290a38553d4d3a994e6e3322cf64e60c9441115924: Status 404 returned error can't find the container with id 64f8ff5be7bb724395a941290a38553d4d3a994e6e3322cf64e60c9441115924
Mar 19 11:59:25.306504 master-0 kubenswrapper[7642]: I0319 11:59:25.306448 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d48mb" event={"ID":"5f2d1d03-7427-45a0-a917-173bf9df9902","Type":"ContainerStarted","Data":"64f8ff5be7bb724395a941290a38553d4d3a994e6e3322cf64e60c9441115924"}
Mar 19 11:59:25.307994 master-0 kubenswrapper[7642]: I0319 11:59:25.307962 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" event={"ID":"4ca74427-9df7-44ca-ae91-0162f402644b","Type":"ContainerStarted","Data":"9f7852cce265f4731f8e5a50d5da937e705ba1159369968578b0f921316baf94"}
Mar 19 11:59:25.309751 master-0 kubenswrapper[7642]: I0319 11:59:25.309714 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" event={"ID":"09282e98-3140-41c9-863f-67e0e9f942cc","Type":"ContainerStarted","Data":"cd74682d52422c519cea30b441c3a0e7c9f2dc07a0f3f2728bf32cc48f37ec2b"}
Mar 19 11:59:25.309842 master-0 kubenswrapper[7642]: I0319 11:59:25.309751 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" event={"ID":"09282e98-3140-41c9-863f-67e0e9f942cc","Type":"ContainerStarted","Data":"9d99fa05c7a5c6fa4b07e9ced3544d4c3af285d39eefa7a5647c21477748c4fd"}
Mar 19 11:59:25.309842 master-0 kubenswrapper[7642]: I0319 11:59:25.309766 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" event={"ID":"09282e98-3140-41c9-863f-67e0e9f942cc","Type":"ContainerStarted","Data":"c30aed6efb2687e8c9c0787c5b85c3ecc5ac073733a122b5233bd897825e7a16"}
Mar 19 11:59:25.971285 master-0 kubenswrapper[7642]: I0319 11:59:25.971229 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:25.971285 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:25.971285 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:25.971285 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:25.971597 master-0 kubenswrapper[7642]: I0319 11:59:25.971309 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:26.971732 master-0 kubenswrapper[7642]: I0319 11:59:26.971660 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:26.971732 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:26.971732 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:26.971732 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:26.972446 master-0 kubenswrapper[7642]: I0319 11:59:26.971744 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:27.973459 master-0 kubenswrapper[7642]: I0319 11:59:27.973200 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:27.973459 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:27.973459 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:27.973459 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:27.973459 master-0 kubenswrapper[7642]: I0319 11:59:27.973305 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:28.685657 master-0 kubenswrapper[7642]: I0319 11:59:28.685125 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"]
Mar 19 11:59:28.686859 master-0 kubenswrapper[7642]: I0319 11:59:28.686836 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:28.689373 master-0 kubenswrapper[7642]: I0319 11:59:28.689331 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-w4vjc"
Mar 19 11:59:28.689578 master-0 kubenswrapper[7642]: I0319 11:59:28.689555 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 19 11:59:28.696619 master-0 kubenswrapper[7642]: I0319 11:59:28.695863 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"]
Mar 19 11:59:28.819597 master-0 kubenswrapper[7642]: I0319 11:59:28.819529 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec99218-1644-461e-abb2-14d0c5369e96-kube-api-access\") pod \"installer-4-master-0\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:28.819597 master-0 kubenswrapper[7642]: I0319 11:59:28.819604 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-var-lock\") pod \"installer-4-master-0\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:28.819983 master-0 kubenswrapper[7642]: I0319 11:59:28.819684 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:28.839449 master-0 kubenswrapper[7642]: I0319 11:59:28.839371 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"]
Mar 19 11:59:28.840750 master-0 kubenswrapper[7642]: I0319 11:59:28.840716 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:28.848179 master-0 kubenswrapper[7642]: I0319 11:59:28.848112 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 19 11:59:28.848453 master-0 kubenswrapper[7642]: I0319 11:59:28.848388 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-4rhqw"
Mar 19 11:59:28.848735 master-0 kubenswrapper[7642]: I0319 11:59:28.848706 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 19 11:59:28.861799 master-0 kubenswrapper[7642]: I0319 11:59:28.861686 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 19 11:59:28.862013 master-0 kubenswrapper[7642]: I0319 11:59:28.861971 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 19 11:59:28.886789 master-0 kubenswrapper[7642]: I0319 11:59:28.886719 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 19 11:59:28.912037 master-0 kubenswrapper[7642]: I0319 11:59:28.911966 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 19 11:59:28.927187 master-0 kubenswrapper[7642]: I0319 11:59:28.926991 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"]
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.945781 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-telemeter-client-tls\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.945853 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-serving-certs-ca-bundle\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.945898 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-federate-client-tls\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.945920 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.945958 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.946022 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.946047 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-metrics-client-ca\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.946076 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec99218-1644-461e-abb2-14d0c5369e96-kube-api-access\") pod \"installer-4-master-0\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.946139 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-var-lock\") pod \"installer-4-master-0\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.946164 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.946196 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwx2w\" (UniqueName: \"kubernetes.io/projected/596a0930-a082-4374-acc9-b722c4970443-kube-api-access-xwx2w\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.946409 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:28.947673 master-0 kubenswrapper[7642]: I0319 11:59:28.946794 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-var-lock\") pod \"installer-4-master-0\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:28.980674 master-0 kubenswrapper[7642]: I0319 11:59:28.978109 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:28.980674 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:28.980674 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:28.980674 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:28.980674 master-0 kubenswrapper[7642]: I0319 11:59:28.978195 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:28.984594 master-0 kubenswrapper[7642]: I0319 11:59:28.984535 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec99218-1644-461e-abb2-14d0c5369e96-kube-api-access\") pod \"installer-4-master-0\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:29.041089 master-0 kubenswrapper[7642]: I0319 11:59:29.041006 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-dc85487cf-gsmlh"]
Mar 19 11:59:29.042011 master-0 kubenswrapper[7642]: I0319 11:59:29.041962 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.049092 master-0 kubenswrapper[7642]: I0319 11:59:29.045986 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 19 11:59:29.049092 master-0 kubenswrapper[7642]: I0319 11:59:29.046323 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 19 11:59:29.049092 master-0 kubenswrapper[7642]: I0319 11:59:29.046452 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 19 11:59:29.049092 master-0 kubenswrapper[7642]: I0319 11:59:29.046598 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-atris4eemsst"
Mar 19 11:59:29.049092 master-0 kubenswrapper[7642]: I0319 11:59:29.046764 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 19 11:59:29.049092 master-0 kubenswrapper[7642]: I0319 11:59:29.046894 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-g78hg"
Mar 19 11:59:29.049092 master-0 kubenswrapper[7642]: I0319 11:59:29.048979 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.049092 master-0 kubenswrapper[7642]: I0319 11:59:29.049020 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-metrics-client-ca\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.051367 master-0 kubenswrapper[7642]: I0319 11:59:29.049946 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.051367 master-0 kubenswrapper[7642]: I0319 11:59:29.050031 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwx2w\" (UniqueName: \"kubernetes.io/projected/596a0930-a082-4374-acc9-b722c4970443-kube-api-access-xwx2w\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.051367 master-0 kubenswrapper[7642]: I0319 11:59:29.050107 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-telemeter-client-tls\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.051367 master-0 kubenswrapper[7642]: I0319 11:59:29.050144 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-serving-certs-ca-bundle\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.051367 master-0 kubenswrapper[7642]: I0319 11:59:29.050214 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-federate-client-tls\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.051367 master-0 kubenswrapper[7642]: I0319 11:59:29.050243 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.051778 master-0 kubenswrapper[7642]: I0319 11:59:29.051402 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.051778 master-0 kubenswrapper[7642]: I0319 11:59:29.051540 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-serving-certs-ca-bundle\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.051778 master-0 kubenswrapper[7642]: I0319 11:59:29.051737 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-metrics-client-ca\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.053196 master-0 kubenswrapper[7642]: I0319 11:59:29.053151 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.058125 master-0 kubenswrapper[7642]: I0319 11:59:29.058076 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-federate-client-tls\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.058377 master-0 kubenswrapper[7642]: I0319 11:59:29.058341 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-telemeter-client-tls\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.058741 master-0 kubenswrapper[7642]: I0319 11:59:29.058703 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.061432 master-0 kubenswrapper[7642]: I0319 11:59:29.061371 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-dc85487cf-gsmlh"]
Mar 19 11:59:29.076876 master-0 kubenswrapper[7642]: I0319 11:59:29.076810 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 11:59:29.082133 master-0 kubenswrapper[7642]: I0319 11:59:29.081472 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwx2w\" (UniqueName: \"kubernetes.io/projected/596a0930-a082-4374-acc9-b722c4970443-kube-api-access-xwx2w\") pod \"telemeter-client-5b685db9f8-rjdjl\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.151933 master-0 kubenswrapper[7642]: I0319 11:59:29.151526 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-audit-log\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.151933 master-0 kubenswrapper[7642]: I0319 11:59:29.151587 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.151933 master-0 kubenswrapper[7642]: I0319 11:59:29.151652 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.151933 master-0 kubenswrapper[7642]: I0319 11:59:29.151730 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.158519 master-0 kubenswrapper[7642]: I0319 11:59:29.152415 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpdhv\" (UniqueName: \"kubernetes.io/projected/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-kube-api-access-hpdhv\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.158519 master-0 kubenswrapper[7642]: I0319 11:59:29.152508 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.158519 master-0 kubenswrapper[7642]: I0319 11:59:29.152690 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.182522 master-0 kubenswrapper[7642]: I0319 11:59:29.177577 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 11:59:29.254749 master-0 kubenswrapper[7642]: I0319 11:59:29.254682 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpdhv\" (UniqueName: \"kubernetes.io/projected/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-kube-api-access-hpdhv\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.254749 master-0 kubenswrapper[7642]: I0319 11:59:29.254741 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.255018 master-0 kubenswrapper[7642]: I0319 11:59:29.254789 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.255052 master-0 kubenswrapper[7642]: I0319 11:59:29.254999 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-audit-log\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.255365 master-0 kubenswrapper[7642]: I0319 11:59:29.255306 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.255579 master-0 kubenswrapper[7642]: I0319 11:59:29.255559 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.255754 master-0 kubenswrapper[7642]: I0319 11:59:29.255728 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-audit-log\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.255791 master-0 kubenswrapper[7642]: I0319 11:59:29.255734 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.256153 master-0 kubenswrapper[7642]: I0319 11:59:29.256121 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.256517 master-0 kubenswrapper[7642]: I0319 11:59:29.256483 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.259243 master-0 kubenswrapper[7642]: I0319 11:59:29.259209 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.259415 master-0 kubenswrapper[7642]: I0319 11:59:29.259375 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.260013 master-0 kubenswrapper[7642]: I0319 11:59:29.259983 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.277434 master-0 kubenswrapper[7642]: I0319 11:59:29.277389 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpdhv\" (UniqueName: \"kubernetes.io/projected/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-kube-api-access-hpdhv\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.346248 master-0 kubenswrapper[7642]: I0319 11:59:29.346158 7642 generic.go:334] "Generic (PLEG): container finished" podID="5f2d1d03-7427-45a0-a917-173bf9df9902" containerID="3b0c232c84654528ea64bdd18d2136082e46bb45cce0608d6a300d9ff0fe9984" exitCode=0
Mar 19 11:59:29.346248 master-0 kubenswrapper[7642]: I0319 11:59:29.346243 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d48mb" event={"ID":"5f2d1d03-7427-45a0-a917-173bf9df9902","Type":"ContainerDied","Data":"3b0c232c84654528ea64bdd18d2136082e46bb45cce0608d6a300d9ff0fe9984"}
Mar 19 11:59:29.350757 master-0 kubenswrapper[7642]: I0319 11:59:29.350687 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" event={"ID":"4ca74427-9df7-44ca-ae91-0162f402644b","Type":"ContainerStarted","Data":"33626a3a03e494d53eb0b45e3f0e9d0fade66c1dbdbd8409a2239da74bbd981e"}
Mar 19 11:59:29.350837 master-0 kubenswrapper[7642]: I0319 11:59:29.350764 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" event={"ID":"4ca74427-9df7-44ca-ae91-0162f402644b","Type":"ContainerStarted","Data":"f723a5b08f2ef8557d64dea29d9c9752cf6c0d407586477022828b2cd1778ed9"}
Mar 19 11:59:29.350837 master-0 kubenswrapper[7642]: I0319 11:59:29.350779 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" event={"ID":"4ca74427-9df7-44ca-ae91-0162f402644b","Type":"ContainerStarted","Data":"bc3aceab17b3c160a457b9b51da1c86ee3e1e7b578b1ef933673c5b54f7b7534"}
Mar 19 11:59:29.355954 master-0 kubenswrapper[7642]: I0319 11:59:29.353148 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" event={"ID":"09282e98-3140-41c9-863f-67e0e9f942cc","Type":"ContainerStarted","Data":"7b2fe012b6b7f7dc2d751bb678215da259af72548d5ad40a959bc935d0c78ee7"}
Mar 19 11:59:29.365665 master-0 kubenswrapper[7642]: I0319 11:59:29.365567 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:29.407241 master-0 kubenswrapper[7642]: I0319 11:59:29.406865 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" podStartSLOduration=2.811810463 podStartE2EDuration="6.406832253s" podCreationTimestamp="2026-03-19 11:59:23 +0000 UTC" firstStartedPulling="2026-03-19 11:59:24.853212581 +0000 UTC m=+322.970653368" lastFinishedPulling="2026-03-19 11:59:28.448234371 +0000 UTC m=+326.565675158" observedRunningTime="2026-03-19 11:59:29.396409468 +0000 UTC m=+327.513850275" watchObservedRunningTime="2026-03-19 11:59:29.406832253 +0000 UTC m=+327.524273040"
Mar 19 11:59:29.435472 master-0 kubenswrapper[7642]: I0319 11:59:29.434874 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" podStartSLOduration=2.837163127 podStartE2EDuration="6.434848655s" podCreationTimestamp="2026-03-19 11:59:23 +0000 UTC" firstStartedPulling="2026-03-19 11:59:24.854498189 +0000 UTC m=+322.971938976" lastFinishedPulling="2026-03-19 11:59:28.452183717 +0000 UTC m=+326.569624504" observedRunningTime="2026-03-19 11:59:29.432902598 +0000 UTC m=+327.550343405" watchObservedRunningTime="2026-03-19 11:59:29.434848655 +0000 UTC m=+327.552289442"
Mar 19 11:59:29.497165 master-0 kubenswrapper[7642]: I0319 11:59:29.497110 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"]
Mar 19 11:59:29.497937 master-0 kubenswrapper[7642]: W0319 11:59:29.497892 7642 manager.go:1169] Failed to process watch event
{EventType:0 Name:/kubepods.slice/kubepods-pod1ec99218_1644_461e_abb2_14d0c5369e96.slice/crio-f8905320489e881a81a42f634618f6b131a28e4ffcb6733be10fc6117bce6479 WatchSource:0}: Error finding container f8905320489e881a81a42f634618f6b131a28e4ffcb6733be10fc6117bce6479: Status 404 returned error can't find the container with id f8905320489e881a81a42f634618f6b131a28e4ffcb6733be10fc6117bce6479 Mar 19 11:59:29.651367 master-0 kubenswrapper[7642]: I0319 11:59:29.651301 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"] Mar 19 11:59:29.654649 master-0 kubenswrapper[7642]: W0319 11:59:29.654560 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod596a0930_a082_4374_acc9_b722c4970443.slice/crio-504567c8e122d9fd1893a9f58d1661f6e869dcbb3088f4af75c6080ae6904a38 WatchSource:0}: Error finding container 504567c8e122d9fd1893a9f58d1661f6e869dcbb3088f4af75c6080ae6904a38: Status 404 returned error can't find the container with id 504567c8e122d9fd1893a9f58d1661f6e869dcbb3088f4af75c6080ae6904a38 Mar 19 11:59:29.896422 master-0 kubenswrapper[7642]: I0319 11:59:29.896358 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-dc85487cf-gsmlh"] Mar 19 11:59:29.908923 master-0 kubenswrapper[7642]: W0319 11:59:29.908815 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ac8d080_b6a9_45f0_97ba_d6fa93e58c5c.slice/crio-4daf44a10b1f8c76d5c5e205f3a48dbc84e86cc93251b0495b68e80ef60f9d35 WatchSource:0}: Error finding container 4daf44a10b1f8c76d5c5e205f3a48dbc84e86cc93251b0495b68e80ef60f9d35: Status 404 returned error can't find the container with id 4daf44a10b1f8c76d5c5e205f3a48dbc84e86cc93251b0495b68e80ef60f9d35 Mar 19 11:59:29.971713 master-0 kubenswrapper[7642]: I0319 11:59:29.971491 7642 patch_prober.go:28] interesting 
pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:29.971713 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:29.971713 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:29.971713 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:29.971713 master-0 kubenswrapper[7642]: I0319 11:59:29.971621 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:30.369976 master-0 kubenswrapper[7642]: I0319 11:59:30.369821 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" event={"ID":"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c","Type":"ContainerStarted","Data":"4daf44a10b1f8c76d5c5e205f3a48dbc84e86cc93251b0495b68e80ef60f9d35"} Mar 19 11:59:30.375790 master-0 kubenswrapper[7642]: I0319 11:59:30.375746 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d48mb" event={"ID":"5f2d1d03-7427-45a0-a917-173bf9df9902","Type":"ContainerStarted","Data":"20c63945e832497a7fd0819ea7c26a11a69792158beab08ae38dda25ad0ee6af"} Mar 19 11:59:30.375912 master-0 kubenswrapper[7642]: I0319 11:59:30.375793 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d48mb" event={"ID":"5f2d1d03-7427-45a0-a917-173bf9df9902","Type":"ContainerStarted","Data":"9c450dc363933ea47014a9ef4addea98317470b78816983a716d19f7d723e91f"} Mar 19 11:59:30.378650 master-0 kubenswrapper[7642]: I0319 11:59:30.378563 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" 
event={"ID":"1ec99218-1644-461e-abb2-14d0c5369e96","Type":"ContainerStarted","Data":"16da51ad34fc7222c4ea68656e0145a3d1eba549a2e561b21ed00a07283c919f"} Mar 19 11:59:30.380129 master-0 kubenswrapper[7642]: I0319 11:59:30.378698 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"1ec99218-1644-461e-abb2-14d0c5369e96","Type":"ContainerStarted","Data":"f8905320489e881a81a42f634618f6b131a28e4ffcb6733be10fc6117bce6479"} Mar 19 11:59:30.380289 master-0 kubenswrapper[7642]: I0319 11:59:30.380255 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" event={"ID":"596a0930-a082-4374-acc9-b722c4970443","Type":"ContainerStarted","Data":"504567c8e122d9fd1893a9f58d1661f6e869dcbb3088f4af75c6080ae6904a38"} Mar 19 11:59:30.405748 master-0 kubenswrapper[7642]: I0319 11:59:30.404713 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-d48mb" podStartSLOduration=4.253252255 podStartE2EDuration="7.404685747s" podCreationTimestamp="2026-03-19 11:59:23 +0000 UTC" firstStartedPulling="2026-03-19 11:59:25.296374507 +0000 UTC m=+323.413815284" lastFinishedPulling="2026-03-19 11:59:28.447807989 +0000 UTC m=+326.565248776" observedRunningTime="2026-03-19 11:59:30.402251406 +0000 UTC m=+328.519692193" watchObservedRunningTime="2026-03-19 11:59:30.404685747 +0000 UTC m=+328.522126534" Mar 19 11:59:30.431155 master-0 kubenswrapper[7642]: I0319 11:59:30.431026 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=2.431000939 podStartE2EDuration="2.431000939s" podCreationTimestamp="2026-03-19 11:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:59:30.426863658 +0000 UTC m=+328.544304445" 
watchObservedRunningTime="2026-03-19 11:59:30.431000939 +0000 UTC m=+328.548441726" Mar 19 11:59:30.971776 master-0 kubenswrapper[7642]: I0319 11:59:30.971562 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:30.971776 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:30.971776 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:30.971776 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:30.971776 master-0 kubenswrapper[7642]: I0319 11:59:30.971654 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:31.972122 master-0 kubenswrapper[7642]: I0319 11:59:31.972042 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:31.972122 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:31.972122 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:31.972122 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:31.973125 master-0 kubenswrapper[7642]: I0319 11:59:31.972183 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:32.971078 master-0 kubenswrapper[7642]: I0319 11:59:32.971012 7642 patch_prober.go:28] interesting 
pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:32.971078 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:32.971078 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:32.971078 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:32.971341 master-0 kubenswrapper[7642]: I0319 11:59:32.971141 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:33.413013 master-0 kubenswrapper[7642]: I0319 11:59:33.412936 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" event={"ID":"596a0930-a082-4374-acc9-b722c4970443","Type":"ContainerStarted","Data":"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8"} Mar 19 11:59:33.414302 master-0 kubenswrapper[7642]: I0319 11:59:33.414262 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" event={"ID":"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c","Type":"ContainerStarted","Data":"5b15f71769cb166bbe1854fa275ab6231d488a761919449830f9713c4bfd28ed"} Mar 19 11:59:33.786238 master-0 kubenswrapper[7642]: I0319 11:59:33.786064 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" podStartSLOduration=2.889095225 podStartE2EDuration="5.786041842s" podCreationTimestamp="2026-03-19 11:59:28 +0000 UTC" firstStartedPulling="2026-03-19 11:59:29.910574887 +0000 UTC m=+328.028015674" lastFinishedPulling="2026-03-19 11:59:32.807521504 +0000 UTC m=+330.924962291" observedRunningTime="2026-03-19 11:59:33.782403485 
+0000 UTC m=+331.899844302" watchObservedRunningTime="2026-03-19 11:59:33.786041842 +0000 UTC m=+331.903482619" Mar 19 11:59:33.972920 master-0 kubenswrapper[7642]: I0319 11:59:33.972850 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:33.972920 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:33.972920 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:33.972920 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:33.973254 master-0 kubenswrapper[7642]: I0319 11:59:33.972952 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:34.971601 master-0 kubenswrapper[7642]: I0319 11:59:34.971529 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:34.971601 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:34.971601 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:34.971601 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:34.972332 master-0 kubenswrapper[7642]: I0319 11:59:34.971610 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:35.971259 master-0 kubenswrapper[7642]: I0319 11:59:35.971174 7642 patch_prober.go:28] 
interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:35.971259 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:35.971259 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:35.971259 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:35.972196 master-0 kubenswrapper[7642]: I0319 11:59:35.971255 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:36.444288 master-0 kubenswrapper[7642]: I0319 11:59:36.444240 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" event={"ID":"596a0930-a082-4374-acc9-b722c4970443","Type":"ContainerStarted","Data":"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788"} Mar 19 11:59:36.971247 master-0 kubenswrapper[7642]: I0319 11:59:36.971172 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:36.971247 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:36.971247 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:36.971247 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:36.971824 master-0 kubenswrapper[7642]: I0319 11:59:36.971786 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Mar 19 11:59:37.455959 master-0 kubenswrapper[7642]: I0319 11:59:37.455857 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" event={"ID":"596a0930-a082-4374-acc9-b722c4970443","Type":"ContainerStarted","Data":"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc"} Mar 19 11:59:37.490838 master-0 kubenswrapper[7642]: I0319 11:59:37.490745 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" podStartSLOduration=3.003769297 podStartE2EDuration="9.490719158s" podCreationTimestamp="2026-03-19 11:59:28 +0000 UTC" firstStartedPulling="2026-03-19 11:59:29.65872021 +0000 UTC m=+327.776160997" lastFinishedPulling="2026-03-19 11:59:36.145670071 +0000 UTC m=+334.263110858" observedRunningTime="2026-03-19 11:59:37.487227295 +0000 UTC m=+335.604668132" watchObservedRunningTime="2026-03-19 11:59:37.490719158 +0000 UTC m=+335.608159945" Mar 19 11:59:37.971443 master-0 kubenswrapper[7642]: I0319 11:59:37.971388 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:37.971443 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:37.971443 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:37.971443 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:37.971894 master-0 kubenswrapper[7642]: I0319 11:59:37.971854 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:38.971543 master-0 kubenswrapper[7642]: I0319 11:59:38.971432 7642 
patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:38.971543 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:38.971543 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:38.971543 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:38.972252 master-0 kubenswrapper[7642]: I0319 11:59:38.971560 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:39.972688 master-0 kubenswrapper[7642]: I0319 11:59:39.972575 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:39.972688 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:39.972688 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:39.972688 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:39.973566 master-0 kubenswrapper[7642]: I0319 11:59:39.972730 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:40.972324 master-0 kubenswrapper[7642]: I0319 11:59:40.972271 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:40.972324 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:40.972324 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:40.972324 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:40.972640 master-0 kubenswrapper[7642]: I0319 11:59:40.972345 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:41.971892 master-0 kubenswrapper[7642]: I0319 11:59:41.971836 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:41.971892 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:41.971892 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:41.971892 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:41.972627 master-0 kubenswrapper[7642]: I0319 11:59:41.971910 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:42.972093 master-0 kubenswrapper[7642]: I0319 11:59:42.971977 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:42.972093 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:42.972093 master-0 kubenswrapper[7642]: [+]process-running ok 
Mar 19 11:59:42.972093 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:42.972831 master-0 kubenswrapper[7642]: I0319 11:59:42.972084 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:43.972245 master-0 kubenswrapper[7642]: I0319 11:59:43.972138 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:43.972245 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:43.972245 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:43.972245 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:43.972245 master-0 kubenswrapper[7642]: I0319 11:59:43.972232 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:44.971282 master-0 kubenswrapper[7642]: I0319 11:59:44.971207 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:44.971282 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:44.971282 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:44.971282 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:44.971573 master-0 kubenswrapper[7642]: I0319 11:59:44.971301 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:45.971813 master-0 kubenswrapper[7642]: I0319 11:59:45.971735 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:45.971813 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:45.971813 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:45.971813 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:45.972349 master-0 kubenswrapper[7642]: I0319 11:59:45.971879 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:46.972688 master-0 kubenswrapper[7642]: I0319 11:59:46.972531 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:46.972688 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:46.972688 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:46.972688 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:46.974486 master-0 kubenswrapper[7642]: I0319 11:59:46.972688 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:47.971825 
master-0 kubenswrapper[7642]: I0319 11:59:47.971756 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:47.971825 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:47.971825 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:47.971825 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:47.972196 master-0 kubenswrapper[7642]: I0319 11:59:47.971889 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:48.972134 master-0 kubenswrapper[7642]: I0319 11:59:48.972055 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 11:59:48.972134 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 11:59:48.972134 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 11:59:48.972134 master-0 kubenswrapper[7642]: healthz check failed Mar 19 11:59:48.972887 master-0 kubenswrapper[7642]: I0319 11:59:48.972236 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 11:59:49.366286 master-0 kubenswrapper[7642]: I0319 11:59:49.366113 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 11:59:49.366617 master-0 
kubenswrapper[7642]: I0319 11:59:49.366417 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 11:59:49.974184 master-0 kubenswrapper[7642]: I0319 11:59:49.974067 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:49.974184 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:49.974184 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:49.974184 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:49.974184 master-0 kubenswrapper[7642]: I0319 11:59:49.974188 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:50.971702 master-0 kubenswrapper[7642]: I0319 11:59:50.971553 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:50.971702 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:50.971702 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:50.971702 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:50.971994 master-0 kubenswrapper[7642]: I0319 11:59:50.971713 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:51.971442 master-0 kubenswrapper[7642]: I0319 11:59:51.971349 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:51.971442 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:51.971442 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:51.971442 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:51.971442 master-0 kubenswrapper[7642]: I0319 11:59:51.971435 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:52.972430 master-0 kubenswrapper[7642]: I0319 11:59:52.972368 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:52.972430 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:52.972430 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:52.972430 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:52.972985 master-0 kubenswrapper[7642]: I0319 11:59:52.972430 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:53.973124 master-0 kubenswrapper[7642]: I0319 11:59:53.973009 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:53.973124 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:53.973124 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:53.973124 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:53.973124 master-0 kubenswrapper[7642]: I0319 11:59:53.973120 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:54.971613 master-0 kubenswrapper[7642]: I0319 11:59:54.971549 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:54.971613 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:54.971613 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:54.971613 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:54.971613 master-0 kubenswrapper[7642]: I0319 11:59:54.971626 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:55.972844 master-0 kubenswrapper[7642]: I0319 11:59:55.972726 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:55.972844 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:55.972844 master-0 kubenswrapper[7642]:
[+]process-running ok
Mar 19 11:59:55.972844 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:55.972844 master-0 kubenswrapper[7642]: I0319 11:59:55.972844 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:56.971619 master-0 kubenswrapper[7642]: I0319 11:59:56.971563 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:56.971619 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:56.971619 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:56.971619 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:56.971892 master-0 kubenswrapper[7642]: I0319 11:59:56.971646 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:57.972379 master-0 kubenswrapper[7642]: I0319 11:59:57.972299 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:57.972379 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:57.972379 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:57.972379 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:57.973240 master-0 kubenswrapper[7642]: I0319 11:59:57.972389 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:58.972029 master-0 kubenswrapper[7642]: I0319 11:59:58.971939 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:58.972029 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:58.972029 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:58.972029 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:58.973260 master-0 kubenswrapper[7642]: I0319 11:59:58.972082 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 11:59:59.971991 master-0 kubenswrapper[7642]: I0319 11:59:59.971846 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 11:59:59.971991 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 11:59:59.971991 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 11:59:59.971991 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 11:59:59.972284 master-0 kubenswrapper[7642]: I0319 11:59:59.972013 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:00.971675 master-0 kubenswrapper[7642]: I0319 12:00:00.971596 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:00.971675 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:00.971675 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:00.971675 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:00.972953 master-0 kubenswrapper[7642]: I0319 12:00:00.971680 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:01.972378 master-0 kubenswrapper[7642]: I0319 12:00:01.972272 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:01.972378 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:01.972378 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:01.972378 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:01.972946 master-0 kubenswrapper[7642]: I0319 12:00:01.972418 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:02.694472 master-0 kubenswrapper[7642]: I0319 12:00:02.694385 7642 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19
12:00:02.695122 master-0 kubenswrapper[7642]: I0319 12:00:02.695025 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="cluster-policy-controller" containerID="cri-o://203765ba18c88a4bb80613cce48098f24f5378802947c4073ee2b353a34707ac" gracePeriod=30
Mar 19 12:00:02.695171 master-0 kubenswrapper[7642]: I0319 12:00:02.695080 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager" containerID="cri-o://0e6c2dae4c28b6bf9d3115747fe713bb797e4eb746fc209c257ace61169519c1" gracePeriod=30
Mar 19 12:00:02.695171 master-0 kubenswrapper[7642]: I0319 12:00:02.695114 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://1d8090aa91142bfe2d9ec6028d840d4af95fbe15328004054e0a0a8e1a71dfd6" gracePeriod=30
Mar 19 12:00:02.695230 master-0 kubenswrapper[7642]: I0319 12:00:02.695122 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://c67f49e1211ff300919ba996552c5270a8c7964fe689dec11a1c7473a2f29c42" gracePeriod=30
Mar 19 12:00:02.697357 master-0 kubenswrapper[7642]: I0319 12:00:02.697303 7642 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Mar 19 12:00:02.697751 master-0 kubenswrapper[7642]: E0319 12:00:02.697724 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager-recovery-controller"
Mar 19 12:00:02.697809 master-0 kubenswrapper[7642]: I0319 12:00:02.697756 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager-recovery-controller"
Mar 19 12:00:02.697809 master-0 kubenswrapper[7642]: E0319 12:00:02.697783 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager"
Mar 19 12:00:02.697809 master-0 kubenswrapper[7642]: I0319 12:00:02.697797 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager"
Mar 19 12:00:02.697902 master-0 kubenswrapper[7642]: E0319 12:00:02.697822 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="cluster-policy-controller"
Mar 19 12:00:02.697902 master-0 kubenswrapper[7642]: I0319 12:00:02.697834 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="cluster-policy-controller"
Mar 19 12:00:02.697902 master-0 kubenswrapper[7642]: E0319 12:00:02.697873 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager-cert-syncer"
Mar 19 12:00:02.697902 master-0 kubenswrapper[7642]: I0319 12:00:02.697886 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager-cert-syncer"
Mar 19 12:00:02.698259 master-0 kubenswrapper[7642]: I0319 12:00:02.698220 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager"
Mar 19 12:00:02.698259 master-0 kubenswrapper[7642]: I0319 12:00:02.698253 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="cluster-policy-controller"
Mar 19 12:00:02.698338 master-0 kubenswrapper[7642]: I0319 12:00:02.698272 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager-recovery-controller"
Mar 19 12:00:02.698338 master-0 kubenswrapper[7642]: I0319 12:00:02.698305 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager-cert-syncer"
Mar 19 12:00:02.698533 master-0 kubenswrapper[7642]: E0319 12:00:02.698513 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager"
Mar 19 12:00:02.698567 master-0 kubenswrapper[7642]: I0319 12:00:02.698534 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager"
Mar 19 12:00:02.698770 master-0 kubenswrapper[7642]: I0319 12:00:02.698744 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed5fb754bce5477588bd28f79b7bee7" containerName="kube-controller-manager"
Mar 19 12:00:02.796460 master-0 kubenswrapper[7642]: I0319 12:00:02.796347 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:00:02.796460 master-0 kubenswrapper[7642]: I0319 12:00:02.796448 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-cert-dir\") pod \"kube-controller-manager-master-0\" (UID:
\"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:00:02.865923 master-0 kubenswrapper[7642]: I0319 12:00:02.865868 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/kube-controller-manager-cert-syncer/0.log"
Mar 19 12:00:02.866605 master-0 kubenswrapper[7642]: I0319 12:00:02.866578 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/kube-controller-manager/0.log"
Mar 19 12:00:02.866707 master-0 kubenswrapper[7642]: I0319 12:00:02.866692 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:00:02.871130 master-0 kubenswrapper[7642]: I0319 12:00:02.871020 7642 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="2ed5fb754bce5477588bd28f79b7bee7" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:00:02.897732 master-0 kubenswrapper[7642]: I0319 12:00:02.897677 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:00:02.897732 master-0 kubenswrapper[7642]: I0319 12:00:02.897739 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:00:02.897937 master-0 kubenswrapper[7642]: I0319 12:00:02.897850 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:00:02.897937 master-0 kubenswrapper[7642]: I0319 12:00:02.897891 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:00:02.971207 master-0 kubenswrapper[7642]: I0319 12:00:02.971042 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:02.971207 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:02.971207 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:02.971207 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:02.971207 master-0 kubenswrapper[7642]: I0319 12:00:02.971123 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:02.998620 master-0 kubenswrapper[7642]: I0319 12:00:02.998357 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-resource-dir\") pod \"2ed5fb754bce5477588bd28f79b7bee7\" (UID: \"2ed5fb754bce5477588bd28f79b7bee7\") "
Mar 19 12:00:02.998620 master-0 kubenswrapper[7642]: I0319 12:00:02.998451 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "2ed5fb754bce5477588bd28f79b7bee7" (UID: "2ed5fb754bce5477588bd28f79b7bee7"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:00:02.998620 master-0 kubenswrapper[7642]: I0319 12:00:02.998476 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-cert-dir\") pod \"2ed5fb754bce5477588bd28f79b7bee7\" (UID: \"2ed5fb754bce5477588bd28f79b7bee7\") "
Mar 19 12:00:02.998620 master-0 kubenswrapper[7642]: I0319 12:00:02.998549 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "2ed5fb754bce5477588bd28f79b7bee7" (UID: "2ed5fb754bce5477588bd28f79b7bee7"). InnerVolumeSpecName "cert-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:00:03.000095 master-0 kubenswrapper[7642]: I0319 12:00:02.999521 7642 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:00:03.000095 master-0 kubenswrapper[7642]: I0319 12:00:02.999561 7642 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2ed5fb754bce5477588bd28f79b7bee7-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:00:03.640402 master-0 kubenswrapper[7642]: I0319 12:00:03.640318 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/kube-controller-manager-cert-syncer/0.log"
Mar 19 12:00:03.641802 master-0 kubenswrapper[7642]: I0319 12:00:03.641747 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/kube-controller-manager/0.log"
Mar 19 12:00:03.641977 master-0 kubenswrapper[7642]: I0319 12:00:03.641841 7642 generic.go:334] "Generic (PLEG): container finished" podID="2ed5fb754bce5477588bd28f79b7bee7" containerID="0e6c2dae4c28b6bf9d3115747fe713bb797e4eb746fc209c257ace61169519c1" exitCode=0
Mar 19 12:00:03.641977 master-0 kubenswrapper[7642]: I0319 12:00:03.641902 7642 generic.go:334] "Generic (PLEG): container finished" podID="2ed5fb754bce5477588bd28f79b7bee7" containerID="c67f49e1211ff300919ba996552c5270a8c7964fe689dec11a1c7473a2f29c42" exitCode=0
Mar 19 12:00:03.641977 master-0 kubenswrapper[7642]: I0319 12:00:03.641924 7642 generic.go:334] "Generic (PLEG): container finished" podID="2ed5fb754bce5477588bd28f79b7bee7" containerID="1d8090aa91142bfe2d9ec6028d840d4af95fbe15328004054e0a0a8e1a71dfd6" exitCode=2
Mar 19 12:00:03.641977 master-0 kubenswrapper[7642]: I0319 12:00:03.641943 7642 generic.go:334] "Generic (PLEG): container finished" podID="2ed5fb754bce5477588bd28f79b7bee7" containerID="203765ba18c88a4bb80613cce48098f24f5378802947c4073ee2b353a34707ac" exitCode=0
Mar 19 12:00:03.642374 master-0 kubenswrapper[7642]: I0319 12:00:03.642050 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:00:03.642374 master-0 kubenswrapper[7642]: I0319 12:00:03.642132 7642 scope.go:117] "RemoveContainer" containerID="9b8698d443a661dc8c25e79e56fd0a5be94da1dd2f3e5548ce87e282a4e8855f"
Mar 19 12:00:03.642374 master-0 kubenswrapper[7642]: I0319 12:00:03.642080 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7146690142f28d4686e2cc67fa293c2eaa3b3b7192da93637dc526fe5b56f361"
Mar 19 12:00:03.645363 master-0 kubenswrapper[7642]: I0319 12:00:03.645184 7642 generic.go:334] "Generic (PLEG): container finished" podID="1ec99218-1644-461e-abb2-14d0c5369e96" containerID="16da51ad34fc7222c4ea68656e0145a3d1eba549a2e561b21ed00a07283c919f" exitCode=0
Mar 19 12:00:03.645532 master-0 kubenswrapper[7642]: I0319 12:00:03.645308 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"1ec99218-1644-461e-abb2-14d0c5369e96","Type":"ContainerDied","Data":"16da51ad34fc7222c4ea68656e0145a3d1eba549a2e561b21ed00a07283c919f"}
Mar 19 12:00:03.646581 master-0 kubenswrapper[7642]: I0319 12:00:03.646502 7642 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="2ed5fb754bce5477588bd28f79b7bee7" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:00:03.678723 master-0 kubenswrapper[7642]: I0319 12:00:03.678663 7642 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="2ed5fb754bce5477588bd28f79b7bee7" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:00:03.972300 master-0 kubenswrapper[7642]: I0319 12:00:03.972219 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:03.972300 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:03.972300 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:03.972300 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:03.972805 master-0 kubenswrapper[7642]: I0319 12:00:03.972341 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:04.082100 master-0 kubenswrapper[7642]: I0319 12:00:04.081994 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed5fb754bce5477588bd28f79b7bee7" path="/var/lib/kubelet/pods/2ed5fb754bce5477588bd28f79b7bee7/volumes"
Mar 19 12:00:04.659934 master-0 kubenswrapper[7642]: I0319 12:00:04.659877 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_2ed5fb754bce5477588bd28f79b7bee7/kube-controller-manager-cert-syncer/0.log"
Mar 19 12:00:04.971008 master-0 kubenswrapper[7642]: I0319 12:00:04.970716 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:04.971008 master-0 kubenswrapper[7642]: [-]has-synced failed: reason
withheld
Mar 19 12:00:04.971008 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:04.971008 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:04.971008 master-0 kubenswrapper[7642]: I0319 12:00:04.970813 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:05.009592 master-0 kubenswrapper[7642]: I0319 12:00:05.009540 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 12:00:05.132033 master-0 kubenswrapper[7642]: I0319 12:00:05.131970 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-kubelet-dir\") pod \"1ec99218-1644-461e-abb2-14d0c5369e96\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") "
Mar 19 12:00:05.132686 master-0 kubenswrapper[7642]: I0319 12:00:05.132140 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ec99218-1644-461e-abb2-14d0c5369e96-kube-api-access\") pod \"1ec99218-1644-461e-abb2-14d0c5369e96\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") "
Mar 19 12:00:05.132686 master-0 kubenswrapper[7642]: I0319 12:00:05.132212 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-var-lock\") pod \"1ec99218-1644-461e-abb2-14d0c5369e96\" (UID: \"1ec99218-1644-461e-abb2-14d0c5369e96\") "
Mar 19 12:00:05.132686 master-0 kubenswrapper[7642]: I0319 12:00:05.132252 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1ec99218-1644-461e-abb2-14d0c5369e96" (UID: "1ec99218-1644-461e-abb2-14d0c5369e96"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:00:05.132869 master-0 kubenswrapper[7642]: I0319 12:00:05.132689 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:00:05.132869 master-0 kubenswrapper[7642]: I0319 12:00:05.132757 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ec99218-1644-461e-abb2-14d0c5369e96-var-lock" (OuterVolumeSpecName: "var-lock") pod "1ec99218-1644-461e-abb2-14d0c5369e96" (UID: "1ec99218-1644-461e-abb2-14d0c5369e96"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:00:05.137369 master-0 kubenswrapper[7642]: I0319 12:00:05.137321 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec99218-1644-461e-abb2-14d0c5369e96-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1ec99218-1644-461e-abb2-14d0c5369e96" (UID: "1ec99218-1644-461e-abb2-14d0c5369e96"). InnerVolumeSpecName "kube-api-access".
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 12:00:05.972444 master-0 kubenswrapper[7642]: I0319 12:00:05.972290 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:05.972444 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:05.972444 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:05.972444 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:05.972762 master-0 kubenswrapper[7642]: I0319 12:00:05.972468 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:06.971863 master-0 kubenswrapper[7642]: I0319 12:00:06.971771 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:06.971863 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:06.971863 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:06.971863 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:06.973398 master-0 kubenswrapper[7642]: I0319 12:00:06.971896 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:07.972201 master-0 kubenswrapper[7642]: I0319 12:00:07.972125 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:07.972201 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:07.972201 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:07.972201 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:07.973425 master-0 kubenswrapper[7642]: I0319 12:00:07.972803 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:08.971809 master-0 kubenswrapper[7642]: I0319 12:00:08.971738 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:08.971809 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:08.971809 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:08.971809 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:08.971809 master-0 kubenswrapper[7642]: I0319 12:00:08.971802 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:09.373750 master-0 kubenswrapper[7642]: I0319 12:00:09.373590 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:00:09.378023 master-0 kubenswrapper[7642]: I0319 12:00:09.377984 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:00:09.972180 master-0 kubenswrapper[7642]: I0319 12:00:09.972100 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:09.972180 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:09.972180 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:09.972180 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:09.973223 master-0 kubenswrapper[7642]: I0319 12:00:09.972184 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:10.972340 master-0 kubenswrapper[7642]: I0319 12:00:10.972274 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:10.972340 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:10.972340 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:10.972340 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:10.972964 master-0 kubenswrapper[7642]: I0319 12:00:10.972366 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:11.971065 master-0 kubenswrapper[7642]: I0319 12:00:11.970985 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:11.971065 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:11.971065 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:11.971065 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:11.971351 master-0 kubenswrapper[7642]: I0319 12:00:11.971070 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:12.973110 master-0 kubenswrapper[7642]: I0319 12:00:12.973016 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:12.973110 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:12.973110 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:12.973110 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:12.973959 master-0 kubenswrapper[7642]: I0319 12:00:12.973118 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:13.972219 master-0 kubenswrapper[7642]: I0319 12:00:13.972149 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19
12:00:13.972219 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:13.972219 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:13.972219 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:13.972560 master-0 kubenswrapper[7642]: I0319 12:00:13.972242 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:14.971860 master-0 kubenswrapper[7642]: I0319 12:00:14.971781 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:14.971860 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:14.971860 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:14.971860 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:14.985194 master-0 kubenswrapper[7642]: I0319 12:00:14.971883 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:15.067241 master-0 kubenswrapper[7642]: I0319 12:00:15.067140 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:15.098977 master-0 kubenswrapper[7642]: I0319 12:00:15.098906 7642 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="6df74668-b47a-4fbb-91c0-9d2784af5027" Mar 19 12:00:15.098977 master-0 kubenswrapper[7642]: I0319 12:00:15.098967 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="6df74668-b47a-4fbb-91c0-9d2784af5027" Mar 19 12:00:15.120319 master-0 kubenswrapper[7642]: I0319 12:00:15.120173 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:00:15.120858 master-0 kubenswrapper[7642]: I0319 12:00:15.120819 7642 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:15.129162 master-0 kubenswrapper[7642]: I0319 12:00:15.129056 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:00:15.134380 master-0 kubenswrapper[7642]: I0319 12:00:15.134315 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:15.137216 master-0 kubenswrapper[7642]: I0319 12:00:15.137170 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:00:15.173504 master-0 kubenswrapper[7642]: W0319 12:00:15.173454 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91ced7d9a1e8a2019ac2b6a262f0d19e.slice/crio-088cbe88ab52727a6e61abb80a2ff6d59de2b9b87fe822e87c7612a6a67923bc WatchSource:0}: Error finding container 088cbe88ab52727a6e61abb80a2ff6d59de2b9b87fe822e87c7612a6a67923bc: Status 404 returned error can't find the container with id 088cbe88ab52727a6e61abb80a2ff6d59de2b9b87fe822e87c7612a6a67923bc Mar 19 12:00:15.743743 master-0 kubenswrapper[7642]: I0319 12:00:15.743680 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"413966a39cd5b87df5d6bdc132b164ee7acc47dac3064d44c6d8a65871b50be7"} Mar 19 12:00:15.743743 master-0 kubenswrapper[7642]: I0319 12:00:15.743739 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"4d4b4168db53bd2fbee264c121d01b11e0ff09bed322f776f7a2db8af96a040f"} Mar 19 12:00:15.743743 master-0 kubenswrapper[7642]: I0319 12:00:15.743757 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"088cbe88ab52727a6e61abb80a2ff6d59de2b9b87fe822e87c7612a6a67923bc"} Mar 19 12:00:15.972497 master-0 kubenswrapper[7642]: I0319 12:00:15.972425 7642 patch_prober.go:28] 
interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:15.972497 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:15.972497 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:15.972497 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:15.972497 master-0 kubenswrapper[7642]: I0319 12:00:15.972496 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:16.771121 master-0 kubenswrapper[7642]: I0319 12:00:16.771045 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"61ee084e983f277942e90341d08d9836dacf57a63fdca8d267638b871e818f19"} Mar 19 12:00:16.771121 master-0 kubenswrapper[7642]: I0319 12:00:16.771118 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b"} Mar 19 12:00:16.817986 master-0 kubenswrapper[7642]: I0319 12:00:16.817869 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=1.817846246 podStartE2EDuration="1.817846246s" podCreationTimestamp="2026-03-19 12:00:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:00:16.815268661 +0000 UTC 
m=+374.932709448" watchObservedRunningTime="2026-03-19 12:00:16.817846246 +0000 UTC m=+374.935287043" Mar 19 12:00:16.971574 master-0 kubenswrapper[7642]: I0319 12:00:16.971466 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:16.971574 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:16.971574 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:16.971574 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:16.971574 master-0 kubenswrapper[7642]: I0319 12:00:16.971557 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:17.972032 master-0 kubenswrapper[7642]: I0319 12:00:17.971943 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:17.972032 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:17.972032 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:17.972032 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:17.973328 master-0 kubenswrapper[7642]: I0319 12:00:17.972066 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:18.972185 master-0 kubenswrapper[7642]: I0319 12:00:18.972076 7642 patch_prober.go:28] interesting 
pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:18.972185 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:18.972185 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:18.972185 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:18.972185 master-0 kubenswrapper[7642]: I0319 12:00:18.972178 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:19.972540 master-0 kubenswrapper[7642]: I0319 12:00:19.972488 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:19.972540 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:19.972540 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:19.972540 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:19.972985 master-0 kubenswrapper[7642]: I0319 12:00:19.972573 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:20.971663 master-0 kubenswrapper[7642]: I0319 12:00:20.971584 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 
12:00:20.971663 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:20.971663 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:20.971663 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:20.972007 master-0 kubenswrapper[7642]: I0319 12:00:20.971687 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:21.971698 master-0 kubenswrapper[7642]: I0319 12:00:21.971606 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:21.971698 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:21.971698 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:21.971698 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:21.972502 master-0 kubenswrapper[7642]: I0319 12:00:21.971728 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:22.971053 master-0 kubenswrapper[7642]: I0319 12:00:22.970968 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:22.971053 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:22.971053 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:22.971053 master-0 kubenswrapper[7642]: healthz 
check failed Mar 19 12:00:22.971053 master-0 kubenswrapper[7642]: I0319 12:00:22.971033 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:23.971443 master-0 kubenswrapper[7642]: I0319 12:00:23.971361 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:23.971443 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:23.971443 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:23.971443 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:23.972885 master-0 kubenswrapper[7642]: I0319 12:00:23.971474 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:24.972546 master-0 kubenswrapper[7642]: I0319 12:00:24.972488 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:24.972546 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:24.972546 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:24.972546 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:24.973359 master-0 kubenswrapper[7642]: I0319 12:00:24.973322 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:25.134849 master-0 kubenswrapper[7642]: I0319 12:00:25.134742 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:25.134849 master-0 kubenswrapper[7642]: I0319 12:00:25.134833 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:25.135209 master-0 kubenswrapper[7642]: I0319 12:00:25.134921 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:25.135209 master-0 kubenswrapper[7642]: I0319 12:00:25.135135 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:25.139362 master-0 kubenswrapper[7642]: I0319 12:00:25.139301 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:25.139804 master-0 kubenswrapper[7642]: I0319 12:00:25.139781 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:25.901405 master-0 kubenswrapper[7642]: I0319 12:00:25.901296 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:25.901816 master-0 kubenswrapper[7642]: I0319 12:00:25.901764 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:00:25.972893 master-0 kubenswrapper[7642]: I0319 12:00:25.972793 7642 
patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:25.972893 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:25.972893 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:25.972893 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:25.973754 master-0 kubenswrapper[7642]: I0319 12:00:25.972901 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:26.970924 master-0 kubenswrapper[7642]: I0319 12:00:26.970829 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:26.970924 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:26.970924 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:26.970924 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:26.971371 master-0 kubenswrapper[7642]: I0319 12:00:26.970957 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:27.971691 master-0 kubenswrapper[7642]: I0319 12:00:27.971598 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:27.971691 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:27.971691 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:27.971691 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:27.972472 master-0 kubenswrapper[7642]: I0319 12:00:27.971714 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:28.971619 master-0 kubenswrapper[7642]: I0319 12:00:28.971526 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:28.971619 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:28.971619 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:28.971619 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:28.972277 master-0 kubenswrapper[7642]: I0319 12:00:28.971648 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:29.972867 master-0 kubenswrapper[7642]: I0319 12:00:29.972790 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:29.972867 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:29.972867 master-0 kubenswrapper[7642]: [+]process-running ok 
Mar 19 12:00:29.972867 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:29.973801 master-0 kubenswrapper[7642]: I0319 12:00:29.972894 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:30.971840 master-0 kubenswrapper[7642]: I0319 12:00:30.971758 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:30.971840 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:30.971840 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:30.971840 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:30.972265 master-0 kubenswrapper[7642]: I0319 12:00:30.971849 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:31.975949 master-0 kubenswrapper[7642]: I0319 12:00:31.975849 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:31.975949 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:31.975949 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:31.975949 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:31.975949 master-0 kubenswrapper[7642]: I0319 12:00:31.975931 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:32.975385 master-0 kubenswrapper[7642]: I0319 12:00:32.975323 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:32.975385 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:32.975385 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:32.975385 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:32.975876 master-0 kubenswrapper[7642]: I0319 12:00:32.975417 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:33.972528 master-0 kubenswrapper[7642]: I0319 12:00:33.972480 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:00:33.972528 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:00:33.972528 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:00:33.972528 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:00:33.973150 master-0 kubenswrapper[7642]: I0319 12:00:33.972548 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:00:34.972547 
master-0 kubenswrapper[7642]: I0319 12:00:34.972450 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:34.972547 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:34.972547 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:34.972547 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:34.973176 master-0 kubenswrapper[7642]: I0319 12:00:34.972740 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:35.971719 master-0 kubenswrapper[7642]: I0319 12:00:35.971626 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:35.971719 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:35.971719 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:35.971719 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:35.971719 master-0 kubenswrapper[7642]: I0319 12:00:35.971712 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:36.971437 master-0 kubenswrapper[7642]: I0319 12:00:36.971364 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:36.971437 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:36.971437 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:36.971437 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:36.972170 master-0 kubenswrapper[7642]: I0319 12:00:36.971449 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:37.972064 master-0 kubenswrapper[7642]: I0319 12:00:37.971988 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:37.972064 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:37.972064 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:37.972064 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:37.972799 master-0 kubenswrapper[7642]: I0319 12:00:37.972084 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:38.972046 master-0 kubenswrapper[7642]: I0319 12:00:38.971969 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:38.972046 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:38.972046 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:38.972046 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:38.972764 master-0 kubenswrapper[7642]: I0319 12:00:38.972074 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:39.972383 master-0 kubenswrapper[7642]: I0319 12:00:39.972011 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:39.972383 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:39.972383 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:39.972383 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:39.972383 master-0 kubenswrapper[7642]: I0319 12:00:39.972101 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:40.973088 master-0 kubenswrapper[7642]: I0319 12:00:40.972987 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:40.973088 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:40.973088 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:40.973088 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:40.973873 master-0 kubenswrapper[7642]: I0319 12:00:40.973096 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:41.973091 master-0 kubenswrapper[7642]: I0319 12:00:41.972984 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:41.973091 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:41.973091 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:41.973091 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:41.974018 master-0 kubenswrapper[7642]: I0319 12:00:41.973130 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:42.971761 master-0 kubenswrapper[7642]: I0319 12:00:42.971699 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:42.971761 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:42.971761 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:42.971761 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:42.972050 master-0 kubenswrapper[7642]: I0319 12:00:42.971790 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:43.972125 master-0 kubenswrapper[7642]: I0319 12:00:43.972065 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:43.972125 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:43.972125 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:43.972125 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:43.972941 master-0 kubenswrapper[7642]: I0319 12:00:43.972909 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:44.971022 master-0 kubenswrapper[7642]: I0319 12:00:44.970977 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:44.971022 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:44.971022 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:44.971022 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:44.971417 master-0 kubenswrapper[7642]: I0319 12:00:44.971036 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:45.971510 master-0 kubenswrapper[7642]: I0319 12:00:45.971398 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:45.971510 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:45.971510 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:45.971510 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:45.971510 master-0 kubenswrapper[7642]: I0319 12:00:45.971497 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:46.971698 master-0 kubenswrapper[7642]: I0319 12:00:46.971557 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:46.971698 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:46.971698 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:46.971698 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:46.972356 master-0 kubenswrapper[7642]: I0319 12:00:46.971785 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:47.972883 master-0 kubenswrapper[7642]: I0319 12:00:47.972800 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:47.972883 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:47.972883 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:47.972883 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:47.973794 master-0 kubenswrapper[7642]: I0319 12:00:47.972991 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:48.971896 master-0 kubenswrapper[7642]: I0319 12:00:48.971826 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:48.971896 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:48.971896 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:48.971896 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:48.972233 master-0 kubenswrapper[7642]: I0319 12:00:48.971913 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:49.971484 master-0 kubenswrapper[7642]: I0319 12:00:49.971386 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:49.971484 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:49.971484 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:49.971484 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:49.972056 master-0 kubenswrapper[7642]: I0319 12:00:49.971525 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:50.972737 master-0 kubenswrapper[7642]: I0319 12:00:50.972676 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:50.972737 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:50.972737 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:50.972737 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:50.973871 master-0 kubenswrapper[7642]: I0319 12:00:50.972747 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:51.971994 master-0 kubenswrapper[7642]: I0319 12:00:51.971894 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:51.971994 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:51.971994 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:51.971994 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:51.972526 master-0 kubenswrapper[7642]: I0319 12:00:51.972009 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:52.971384 master-0 kubenswrapper[7642]: I0319 12:00:52.971318 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:52.971384 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:52.971384 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:52.971384 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:52.972030 master-0 kubenswrapper[7642]: I0319 12:00:52.971400 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:53.972174 master-0 kubenswrapper[7642]: I0319 12:00:53.972120 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:53.972174 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:53.972174 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:53.972174 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:53.973056 master-0 kubenswrapper[7642]: I0319 12:00:53.973025 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:54.971734 master-0 kubenswrapper[7642]: I0319 12:00:54.971668 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:54.971734 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:54.971734 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:54.971734 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:54.972031 master-0 kubenswrapper[7642]: I0319 12:00:54.971762 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:55.972319 master-0 kubenswrapper[7642]: I0319 12:00:55.972243 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:55.972319 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:55.972319 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:55.972319 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:55.973373 master-0 kubenswrapper[7642]: I0319 12:00:55.972367 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:56.971935 master-0 kubenswrapper[7642]: I0319 12:00:56.971863 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:56.971935 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:56.971935 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:56.971935 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:56.971935 master-0 kubenswrapper[7642]: I0319 12:00:56.971953 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:57.975788 master-0 kubenswrapper[7642]: I0319 12:00:57.975718 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:57.975788 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:57.975788 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:57.975788 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:57.976617 master-0 kubenswrapper[7642]: I0319 12:00:57.975822 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:58.972222 master-0 kubenswrapper[7642]: I0319 12:00:58.972133 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:58.972222 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:58.972222 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:58.972222 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:58.972545 master-0 kubenswrapper[7642]: I0319 12:00:58.972244 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:00:59.973665 master-0 kubenswrapper[7642]: I0319 12:00:59.972855 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:00:59.973665 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:00:59.973665 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:00:59.973665 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:00:59.974984 master-0 kubenswrapper[7642]: I0319 12:00:59.973711 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:00.971414 master-0 kubenswrapper[7642]: I0319 12:01:00.971332 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:00.971414 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:00.971414 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:00.971414 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:00.971778 master-0 kubenswrapper[7642]: I0319 12:01:00.971439 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:01.970808 master-0 kubenswrapper[7642]: I0319 12:01:01.970743 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:01.970808 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:01.970808 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:01.970808 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:01.971441 master-0 kubenswrapper[7642]: I0319 12:01:01.970830 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:02.971942 master-0 kubenswrapper[7642]: I0319 12:01:02.971890 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:02.971942 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:02.971942 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:02.971942 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:02.972582 master-0 kubenswrapper[7642]: I0319 12:01:02.972555 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:03.971783 master-0 kubenswrapper[7642]: I0319 12:01:03.971714 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:03.971783 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:03.971783 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:03.971783 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:03.972536 master-0 kubenswrapper[7642]: I0319 12:01:03.971789 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:04.972039 master-0 kubenswrapper[7642]: I0319 12:01:04.971954 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:04.972039 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:04.972039 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:04.972039 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:04.972834 master-0 kubenswrapper[7642]: I0319 12:01:04.972072 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:05.972173 master-0 kubenswrapper[7642]: I0319 12:01:05.972075 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:05.972173 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:05.972173 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:05.972173 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:05.973266 master-0 kubenswrapper[7642]: I0319 12:01:05.972190 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:06.327266 master-0 kubenswrapper[7642]: I0319 12:01:06.327133 7642 scope.go:117] "RemoveContainer" containerID="7df327773d76aaacf805979906c8e4aa2f22ce6b1887fb5d42df0fd26e17a8b7"
Mar 19 12:01:06.348775 master-0 kubenswrapper[7642]: I0319 12:01:06.348701 7642 scope.go:117] "RemoveContainer" containerID="a6e72fa780304dc2b33dbdfd17546cbc3d37632700728a439fd482435edf60d1"
Mar 19 12:01:06.972698 master-0 kubenswrapper[7642]: I0319 12:01:06.972602 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:06.972698 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:06.972698 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:06.972698 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:06.973811 master-0 kubenswrapper[7642]: I0319 12:01:06.972732 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:07.971737 master-0 kubenswrapper[7642]: I0319 12:01:07.971662 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:07.971737 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:07.971737 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:07.971737 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:07.972161 master-0 kubenswrapper[7642]: I0319 12:01:07.971764 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:08.973066 master-0 kubenswrapper[7642]: I0319 12:01:08.972937 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:08.973066 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:08.973066 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:08.973066 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:08.973066 master-0 kubenswrapper[7642]: I0319 12:01:08.973091 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:09.973781 master-0 kubenswrapper[7642]: I0319 12:01:09.973659 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:09.973781 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:09.973781 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:09.973781 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:09.975140 master-0 kubenswrapper[7642]: I0319 12:01:09.973830 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:10.972029 master-0 kubenswrapper[7642]: I0319 12:01:10.971924 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:10.972029 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:10.972029 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:10.972029 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:10.972510 master-0 kubenswrapper[7642]: I0319 12:01:10.972040 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:11.972448 master-0 kubenswrapper[7642]: I0319 12:01:11.972358 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:11.972448 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:11.972448 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:11.972448 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:11.973425 master-0 kubenswrapper[7642]: I0319 12:01:11.972470 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:12.971923 master-0 kubenswrapper[7642]: I0319 12:01:12.971847 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:12.971923 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:12.971923 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:12.971923 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:12.972294 master-0 kubenswrapper[7642]: I0319 12:01:12.971925 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:13.972092 master-0 kubenswrapper[7642]: I0319 12:01:13.972028 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:13.972092 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:13.972092 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:13.972092 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:13.972692 master-0 kubenswrapper[7642]: I0319 12:01:13.972115 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:14.972942 master-0 kubenswrapper[7642]: I0319 12:01:14.972780 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:14.972942 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:14.972942 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:14.972942 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:14.972942 master-0 kubenswrapper[7642]: I0319 12:01:14.972866 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:15.971987 master-0 kubenswrapper[7642]: I0319 12:01:15.971895 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:01:15.971987 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:01:15.971987 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:01:15.971987 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:01:15.971987 master-0 kubenswrapper[7642]: I0319 12:01:15.971992 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:01:15.972479 master-0 kubenswrapper[7642]: I0319 12:01:15.972065 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:01:15.973078 master-0 kubenswrapper[7642]: I0319 12:01:15.973025 7642 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"12c2dd42c42f12d98f0581174f1fb6e51a5a2951700534c282b1aaa4cb3c5149"} pod="openshift-ingress/router-default-7dcf5569b5-47tqf" containerMessage="Container router failed startup probe, will be restarted"
Mar 19 12:01:15.973562 master-0 kubenswrapper[7642]: I0319 12:01:15.973099 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" containerID="cri-o://12c2dd42c42f12d98f0581174f1fb6e51a5a2951700534c282b1aaa4cb3c5149" gracePeriod=3600
Mar 19 12:01:19.282909 master-0 kubenswrapper[7642]: I0319 12:01:19.282806 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/2.log"
Mar 19 12:01:19.284044 master-0 kubenswrapper[7642]: I0319 12:01:19.284017 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/1.log"
Mar 19 12:01:19.284429 master-0 kubenswrapper[7642]: I0319 12:01:19.284403 7642 generic.go:334] "Generic (PLEG): container finished" podID="f38df702-a256-4f83-90bb-0c0e477a2ce6" containerID="39cbc32af69e30f6839456a476f8b8093ef24ef3e6922a50809c966eae3795ad" exitCode=1
Mar 19 12:01:19.284540 master-0 kubenswrapper[7642]: I0319 12:01:19.284443 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerDied","Data":"39cbc32af69e30f6839456a476f8b8093ef24ef3e6922a50809c966eae3795ad"}
Mar 19 12:01:19.284668 master-0 kubenswrapper[7642]: I0319 12:01:19.284654 7642 scope.go:117] "RemoveContainer" containerID="7316631a0f45a2c1ef1105021be26127f5494db22ff552f0eb4f6711db06819f"
Mar 19 12:01:19.286727 master-0 kubenswrapper[7642]: I0319 12:01:19.286490 7642 scope.go:117] "RemoveContainer" containerID="39cbc32af69e30f6839456a476f8b8093ef24ef3e6922a50809c966eae3795ad"
Mar 19 12:01:19.286819 master-0 kubenswrapper[7642]: E0319 12:01:19.286796 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-59ptg_openshift-ingress-operator(f38df702-a256-4f83-90bb-0c0e477a2ce6)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6"
Mar 19 12:01:20.293345 master-0 kubenswrapper[7642]: I0319 12:01:20.293307 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/2.log"
Mar 19 12:01:34.068292 master-0 kubenswrapper[7642]: I0319 12:01:34.068223 7642 scope.go:117] "RemoveContainer" containerID="39cbc32af69e30f6839456a476f8b8093ef24ef3e6922a50809c966eae3795ad"
Mar 19 12:01:34.069327 master-0 kubenswrapper[7642]: E0319 12:01:34.068489 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-59ptg_openshift-ingress-operator(f38df702-a256-4f83-90bb-0c0e477a2ce6)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6"
Mar 19 12:01:48.068832 master-0
kubenswrapper[7642]: I0319 12:01:48.068758 7642 scope.go:117] "RemoveContainer" containerID="39cbc32af69e30f6839456a476f8b8093ef24ef3e6922a50809c966eae3795ad" Mar 19 12:01:48.838950 master-0 kubenswrapper[7642]: I0319 12:01:48.838858 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/2.log" Mar 19 12:01:48.839403 master-0 kubenswrapper[7642]: I0319 12:01:48.839365 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerStarted","Data":"266d45496ee879b60eaf981de6871924a55070f988e506147e462e03b0fc4b42"} Mar 19 12:01:51.043490 master-0 kubenswrapper[7642]: I0319 12:01:51.043403 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 19 12:01:51.044270 master-0 kubenswrapper[7642]: E0319 12:01:51.043712 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec99218-1644-461e-abb2-14d0c5369e96" containerName="installer" Mar 19 12:01:51.044270 master-0 kubenswrapper[7642]: I0319 12:01:51.043729 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec99218-1644-461e-abb2-14d0c5369e96" containerName="installer" Mar 19 12:01:51.044270 master-0 kubenswrapper[7642]: I0319 12:01:51.043856 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec99218-1644-461e-abb2-14d0c5369e96" containerName="installer" Mar 19 12:01:51.044362 master-0 kubenswrapper[7642]: I0319 12:01:51.044312 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.046674 master-0 kubenswrapper[7642]: I0319 12:01:51.046612 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-fbfh9" Mar 19 12:01:51.047157 master-0 kubenswrapper[7642]: I0319 12:01:51.047134 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 19 12:01:51.062187 master-0 kubenswrapper[7642]: I0319 12:01:51.062114 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 19 12:01:51.238019 master-0 kubenswrapper[7642]: I0319 12:01:51.237941 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.238388 master-0 kubenswrapper[7642]: I0319 12:01:51.238074 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kube-api-access\") pod \"installer-5-master-0\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.238388 master-0 kubenswrapper[7642]: I0319 12:01:51.238102 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-var-lock\") pod \"installer-5-master-0\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.339296 master-0 kubenswrapper[7642]: I0319 12:01:51.339081 7642 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.339296 master-0 kubenswrapper[7642]: I0319 12:01:51.339206 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kube-api-access\") pod \"installer-5-master-0\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.339296 master-0 kubenswrapper[7642]: I0319 12:01:51.339239 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-var-lock\") pod \"installer-5-master-0\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.339892 master-0 kubenswrapper[7642]: I0319 12:01:51.339262 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.339892 master-0 kubenswrapper[7642]: I0319 12:01:51.339428 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-var-lock\") pod \"installer-5-master-0\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.361619 master-0 kubenswrapper[7642]: I0319 12:01:51.361545 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kube-api-access\") pod \"installer-5-master-0\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.390716 master-0 kubenswrapper[7642]: I0319 12:01:51.390622 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:01:51.865256 master-0 kubenswrapper[7642]: I0319 12:01:51.865188 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 19 12:01:51.873618 master-0 kubenswrapper[7642]: W0319 12:01:51.873555 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod72c2ed5a_4b8d_4528_a230_fe608ea55ad3.slice/crio-7ebabbcc152c2590dc658fb1f46bd235f46220d5cc0dd8b01757f71ab4189d17 WatchSource:0}: Error finding container 7ebabbcc152c2590dc658fb1f46bd235f46220d5cc0dd8b01757f71ab4189d17: Status 404 returned error can't find the container with id 7ebabbcc152c2590dc658fb1f46bd235f46220d5cc0dd8b01757f71ab4189d17 Mar 19 12:01:52.867805 master-0 kubenswrapper[7642]: I0319 12:01:52.867735 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"72c2ed5a-4b8d-4528-a230-fe608ea55ad3","Type":"ContainerStarted","Data":"f7e4a7cf437941fedc829554051c6d66a149aed6e58ff2aaeede0de923f84868"} Mar 19 12:01:52.867805 master-0 kubenswrapper[7642]: I0319 12:01:52.867815 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"72c2ed5a-4b8d-4528-a230-fe608ea55ad3","Type":"ContainerStarted","Data":"7ebabbcc152c2590dc658fb1f46bd235f46220d5cc0dd8b01757f71ab4189d17"} Mar 19 12:01:52.895632 master-0 kubenswrapper[7642]: I0319 12:01:52.895567 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=1.895549202 podStartE2EDuration="1.895549202s" podCreationTimestamp="2026-03-19 12:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:01:52.892416462 +0000 UTC m=+471.009857249" watchObservedRunningTime="2026-03-19 12:01:52.895549202 +0000 UTC m=+471.012989989" Mar 19 12:02:02.941249 master-0 kubenswrapper[7642]: I0319 12:02:02.941169 7642 generic.go:334] "Generic (PLEG): container finished" podID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerID="12c2dd42c42f12d98f0581174f1fb6e51a5a2951700534c282b1aaa4cb3c5149" exitCode=0 Mar 19 12:02:02.941249 master-0 kubenswrapper[7642]: I0319 12:02:02.941233 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerDied","Data":"12c2dd42c42f12d98f0581174f1fb6e51a5a2951700534c282b1aaa4cb3c5149"} Mar 19 12:02:02.941249 master-0 kubenswrapper[7642]: I0319 12:02:02.941272 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerStarted","Data":"4d73ea74c9f238cd74823cfff5517f40d001e3ed453b6b3cd60056853897aa07"} Mar 19 12:02:02.968932 master-0 kubenswrapper[7642]: I0319 12:02:02.968853 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:02:02.968932 master-0 kubenswrapper[7642]: I0319 12:02:02.968930 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:02:02.971874 master-0 kubenswrapper[7642]: I0319 12:02:02.971815 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:02.971874 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:02.971874 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:02.971874 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:02.972141 master-0 kubenswrapper[7642]: I0319 12:02:02.971920 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:03.971163 master-0 kubenswrapper[7642]: I0319 12:02:03.971095 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:03.971163 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:03.971163 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:03.971163 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:03.971888 master-0 kubenswrapper[7642]: I0319 12:02:03.971181 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:04.971507 master-0 kubenswrapper[7642]: I0319 12:02:04.971404 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:04.971507 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 
12:02:04.971507 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:04.971507 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:04.971507 master-0 kubenswrapper[7642]: I0319 12:02:04.971501 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:05.971973 master-0 kubenswrapper[7642]: I0319 12:02:05.971872 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:05.971973 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:05.971973 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:05.971973 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:05.972686 master-0 kubenswrapper[7642]: I0319 12:02:05.971987 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:06.394912 master-0 kubenswrapper[7642]: I0319 12:02:06.394723 7642 scope.go:117] "RemoveContainer" containerID="9704f6068bbc3b7071c181549835a51a1db8ec178675b82fbcbeb90ed4c62c0b" Mar 19 12:02:06.976076 master-0 kubenswrapper[7642]: I0319 12:02:06.975999 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:06.976076 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:06.976076 master-0 
kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:06.976076 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:06.976890 master-0 kubenswrapper[7642]: I0319 12:02:06.976090 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:07.972331 master-0 kubenswrapper[7642]: I0319 12:02:07.972237 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:07.972331 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:07.972331 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:07.972331 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:07.973374 master-0 kubenswrapper[7642]: I0319 12:02:07.972384 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:08.973356 master-0 kubenswrapper[7642]: I0319 12:02:08.973288 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:08.973356 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:08.973356 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:08.973356 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:08.975607 master-0 kubenswrapper[7642]: I0319 12:02:08.974046 7642 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:09.971621 master-0 kubenswrapper[7642]: I0319 12:02:09.971557 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:09.971621 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:09.971621 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:09.971621 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:09.972063 master-0 kubenswrapper[7642]: I0319 12:02:09.971686 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:10.971520 master-0 kubenswrapper[7642]: I0319 12:02:10.971438 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:10.971520 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:10.971520 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:10.971520 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:10.972286 master-0 kubenswrapper[7642]: I0319 12:02:10.971581 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Mar 19 12:02:11.972134 master-0 kubenswrapper[7642]: I0319 12:02:11.972054 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:11.972134 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:11.972134 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:11.972134 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:11.972917 master-0 kubenswrapper[7642]: I0319 12:02:11.972169 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:12.974396 master-0 kubenswrapper[7642]: I0319 12:02:12.974277 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:12.974396 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:12.974396 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:12.974396 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:12.974396 master-0 kubenswrapper[7642]: I0319 12:02:12.974388 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:13.972626 master-0 kubenswrapper[7642]: I0319 12:02:13.972539 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:13.972626 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:13.972626 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:13.972626 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:13.973189 master-0 kubenswrapper[7642]: I0319 12:02:13.972695 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:14.971941 master-0 kubenswrapper[7642]: I0319 12:02:14.971869 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:14.971941 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:14.971941 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:14.971941 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:14.972626 master-0 kubenswrapper[7642]: I0319 12:02:14.971973 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:15.971404 master-0 kubenswrapper[7642]: I0319 12:02:15.971310 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:15.971404 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 
12:02:15.971404 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:15.971404 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:15.971836 master-0 kubenswrapper[7642]: I0319 12:02:15.971416 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:16.971773 master-0 kubenswrapper[7642]: I0319 12:02:16.971661 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:16.971773 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:16.971773 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:16.971773 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:16.972807 master-0 kubenswrapper[7642]: I0319 12:02:16.971795 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:17.189902 master-0 kubenswrapper[7642]: I0319 12:02:17.189810 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 12:02:17.190851 master-0 kubenswrapper[7642]: I0319 12:02:17.190819 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.193538 master-0 kubenswrapper[7642]: I0319 12:02:17.193488 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-n86v2" Mar 19 12:02:17.193807 master-0 kubenswrapper[7642]: I0319 12:02:17.193781 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 19 12:02:17.204215 master-0 kubenswrapper[7642]: I0319 12:02:17.204162 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 12:02:17.268845 master-0 kubenswrapper[7642]: I0319 12:02:17.268566 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-var-lock\") pod \"installer-2-master-0\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.268845 master-0 kubenswrapper[7642]: I0319 12:02:17.268704 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.269200 master-0 kubenswrapper[7642]: I0319 12:02:17.268869 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kube-api-access\") pod \"installer-2-master-0\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.369877 master-0 kubenswrapper[7642]: I0319 12:02:17.369795 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kube-api-access\") pod \"installer-2-master-0\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.370129 master-0 kubenswrapper[7642]: I0319 12:02:17.370044 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-var-lock\") pod \"installer-2-master-0\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.370129 master-0 kubenswrapper[7642]: I0319 12:02:17.370091 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.370231 master-0 kubenswrapper[7642]: I0319 12:02:17.370193 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-var-lock\") pod \"installer-2-master-0\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.370272 master-0 kubenswrapper[7642]: I0319 12:02:17.370206 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.388183 master-0 kubenswrapper[7642]: I0319 12:02:17.388111 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kube-api-access\") pod 
\"installer-2-master-0\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") " pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.511282 master-0 kubenswrapper[7642]: I0319 12:02:17.511217 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 19 12:02:17.932084 master-0 kubenswrapper[7642]: I0319 12:02:17.932023 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 19 12:02:17.941988 master-0 kubenswrapper[7642]: W0319 12:02:17.941957 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod44d258c9_f8b3_4233_847f_bc594a2aa4f9.slice/crio-31ec56fb2316131201c360b4c07233afd130ca3089ceb18077f9033121d6ab17 WatchSource:0}: Error finding container 31ec56fb2316131201c360b4c07233afd130ca3089ceb18077f9033121d6ab17: Status 404 returned error can't find the container with id 31ec56fb2316131201c360b4c07233afd130ca3089ceb18077f9033121d6ab17 Mar 19 12:02:17.971297 master-0 kubenswrapper[7642]: I0319 12:02:17.971214 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:17.971297 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:17.971297 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:17.971297 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:17.971530 master-0 kubenswrapper[7642]: I0319 12:02:17.971352 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:18.061309 master-0 kubenswrapper[7642]: I0319 12:02:18.061238 7642 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"44d258c9-f8b3-4233-847f-bc594a2aa4f9","Type":"ContainerStarted","Data":"31ec56fb2316131201c360b4c07233afd130ca3089ceb18077f9033121d6ab17"} Mar 19 12:02:18.971223 master-0 kubenswrapper[7642]: I0319 12:02:18.971151 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:18.971223 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:18.971223 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:18.971223 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:18.971556 master-0 kubenswrapper[7642]: I0319 12:02:18.971228 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:19.068339 master-0 kubenswrapper[7642]: I0319 12:02:19.068249 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"44d258c9-f8b3-4233-847f-bc594a2aa4f9","Type":"ContainerStarted","Data":"64567b7ee5dbb1259b892f9e1f4fd87cb14b22e497b596ff7c977ffde93bc8bd"} Mar 19 12:02:19.088834 master-0 kubenswrapper[7642]: I0319 12:02:19.088728 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=2.088701258 podStartE2EDuration="2.088701258s" podCreationTimestamp="2026-03-19 12:02:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:02:19.087769481 +0000 UTC m=+497.205210308" watchObservedRunningTime="2026-03-19 12:02:19.088701258 +0000 UTC 
m=+497.206142075" Mar 19 12:02:19.972157 master-0 kubenswrapper[7642]: I0319 12:02:19.972054 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:19.972157 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:19.972157 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:19.972157 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:19.972816 master-0 kubenswrapper[7642]: I0319 12:02:19.972161 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:20.971786 master-0 kubenswrapper[7642]: I0319 12:02:20.971708 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:20.971786 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:20.971786 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:20.971786 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:20.972458 master-0 kubenswrapper[7642]: I0319 12:02:20.971805 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:21.971413 master-0 kubenswrapper[7642]: I0319 12:02:21.971339 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:21.971413 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:21.971413 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:21.971413 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:21.972659 master-0 kubenswrapper[7642]: I0319 12:02:21.971428 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:22.972772 master-0 kubenswrapper[7642]: I0319 12:02:22.972669 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:22.972772 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:22.972772 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:22.972772 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:22.973582 master-0 kubenswrapper[7642]: I0319 12:02:22.972778 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:23.270368 master-0 kubenswrapper[7642]: I0319 12:02:23.270157 7642 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 12:02:23.270922 master-0 kubenswrapper[7642]: I0319 12:02:23.270852 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" containerID="cri-o://2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836" gracePeriod=30 Mar 19 12:02:23.271040 master-0 kubenswrapper[7642]: I0319 12:02:23.270938 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" containerID="cri-o://2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1" gracePeriod=30 Mar 19 12:02:23.271137 master-0 kubenswrapper[7642]: I0319 12:02:23.270933 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller" containerID="cri-o://39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4" gracePeriod=30 Mar 19 12:02:23.272412 master-0 kubenswrapper[7642]: I0319 12:02:23.272382 7642 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 12:02:23.272719 master-0 kubenswrapper[7642]: E0319 12:02:23.272695 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller" Mar 19 12:02:23.272719 master-0 kubenswrapper[7642]: I0319 12:02:23.272717 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller" Mar 19 12:02:23.272812 master-0 kubenswrapper[7642]: E0319 12:02:23.272733 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" Mar 19 12:02:23.272812 master-0 kubenswrapper[7642]: I0319 12:02:23.272741 7642 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" Mar 19 12:02:23.272812 master-0 kubenswrapper[7642]: E0319 12:02:23.272753 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" Mar 19 12:02:23.272812 master-0 kubenswrapper[7642]: I0319 12:02:23.272759 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" Mar 19 12:02:23.272812 master-0 kubenswrapper[7642]: E0319 12:02:23.272788 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="wait-for-host-port" Mar 19 12:02:23.272812 master-0 kubenswrapper[7642]: I0319 12:02:23.272794 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="wait-for-host-port" Mar 19 12:02:23.273029 master-0 kubenswrapper[7642]: I0319 12:02:23.272922 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler" Mar 19 12:02:23.273029 master-0 kubenswrapper[7642]: I0319 12:02:23.272938 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-recovery-controller" Mar 19 12:02:23.273029 master-0 kubenswrapper[7642]: I0319 12:02:23.272954 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="8413125cf444e5c95f023c5dd9c6151e" containerName="kube-scheduler-cert-syncer" Mar 19 12:02:23.433689 master-0 kubenswrapper[7642]: I0319 12:02:23.433599 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/0.log" Mar 19 12:02:23.435058 master-0 kubenswrapper[7642]: I0319 12:02:23.435014 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:23.439319 master-0 kubenswrapper[7642]: I0319 12:02:23.439274 7642 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" podUID="8e27b7d086edf5d2cf47b703574641d8" Mar 19 12:02:23.468884 master-0 kubenswrapper[7642]: I0319 12:02:23.468730 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") pod \"8413125cf444e5c95f023c5dd9c6151e\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " Mar 19 12:02:23.468884 master-0 kubenswrapper[7642]: I0319 12:02:23.468872 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8413125cf444e5c95f023c5dd9c6151e" (UID: "8413125cf444e5c95f023c5dd9c6151e"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:02:23.469313 master-0 kubenswrapper[7642]: I0319 12:02:23.468945 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") pod \"8413125cf444e5c95f023c5dd9c6151e\" (UID: \"8413125cf444e5c95f023c5dd9c6151e\") " Mar 19 12:02:23.469313 master-0 kubenswrapper[7642]: I0319 12:02:23.469071 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:23.469313 master-0 kubenswrapper[7642]: I0319 12:02:23.469106 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:23.469313 master-0 kubenswrapper[7642]: I0319 12:02:23.469163 7642 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:02:23.469313 master-0 kubenswrapper[7642]: I0319 12:02:23.469193 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8413125cf444e5c95f023c5dd9c6151e" (UID: "8413125cf444e5c95f023c5dd9c6151e"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:02:23.570883 master-0 kubenswrapper[7642]: I0319 12:02:23.570681 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:23.570883 master-0 kubenswrapper[7642]: I0319 12:02:23.570761 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:23.570883 master-0 kubenswrapper[7642]: I0319 12:02:23.570868 7642 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8413125cf444e5c95f023c5dd9c6151e-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:02:23.571263 master-0 kubenswrapper[7642]: I0319 12:02:23.570872 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:23.571263 master-0 kubenswrapper[7642]: I0319 12:02:23.570980 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:23.972894 master-0 
kubenswrapper[7642]: I0319 12:02:23.972809 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:23.972894 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:23.972894 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:23.972894 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:23.973565 master-0 kubenswrapper[7642]: I0319 12:02:23.972912 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:24.079779 master-0 kubenswrapper[7642]: I0319 12:02:24.079693 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8413125cf444e5c95f023c5dd9c6151e" path="/var/lib/kubelet/pods/8413125cf444e5c95f023c5dd9c6151e/volumes" Mar 19 12:02:24.102509 master-0 kubenswrapper[7642]: I0319 12:02:24.102449 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8413125cf444e5c95f023c5dd9c6151e/kube-scheduler-cert-syncer/0.log" Mar 19 12:02:24.103560 master-0 kubenswrapper[7642]: I0319 12:02:24.103511 7642 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4" exitCode=0 Mar 19 12:02:24.103560 master-0 kubenswrapper[7642]: I0319 12:02:24.103549 7642 generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1" exitCode=2 Mar 19 12:02:24.103560 master-0 kubenswrapper[7642]: I0319 12:02:24.103562 7642 
generic.go:334] "Generic (PLEG): container finished" podID="8413125cf444e5c95f023c5dd9c6151e" containerID="2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836" exitCode=0 Mar 19 12:02:24.103714 master-0 kubenswrapper[7642]: I0319 12:02:24.103669 7642 scope.go:117] "RemoveContainer" containerID="39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4" Mar 19 12:02:24.103714 master-0 kubenswrapper[7642]: I0319 12:02:24.103676 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:24.108039 master-0 kubenswrapper[7642]: I0319 12:02:24.107993 7642 generic.go:334] "Generic (PLEG): container finished" podID="72c2ed5a-4b8d-4528-a230-fe608ea55ad3" containerID="f7e4a7cf437941fedc829554051c6d66a149aed6e58ff2aaeede0de923f84868" exitCode=0 Mar 19 12:02:24.108174 master-0 kubenswrapper[7642]: I0319 12:02:24.108088 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"72c2ed5a-4b8d-4528-a230-fe608ea55ad3","Type":"ContainerDied","Data":"f7e4a7cf437941fedc829554051c6d66a149aed6e58ff2aaeede0de923f84868"} Mar 19 12:02:24.108626 master-0 kubenswrapper[7642]: I0319 12:02:24.108592 7642 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" podUID="8e27b7d086edf5d2cf47b703574641d8" Mar 19 12:02:24.129303 master-0 kubenswrapper[7642]: I0319 12:02:24.128154 7642 scope.go:117] "RemoveContainer" containerID="2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1" Mar 19 12:02:24.149157 master-0 kubenswrapper[7642]: I0319 12:02:24.149059 7642 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="8413125cf444e5c95f023c5dd9c6151e" 
podUID="8e27b7d086edf5d2cf47b703574641d8" Mar 19 12:02:24.155887 master-0 kubenswrapper[7642]: I0319 12:02:24.155820 7642 scope.go:117] "RemoveContainer" containerID="2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836" Mar 19 12:02:24.174644 master-0 kubenswrapper[7642]: I0319 12:02:24.174592 7642 scope.go:117] "RemoveContainer" containerID="2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75" Mar 19 12:02:24.194973 master-0 kubenswrapper[7642]: I0319 12:02:24.194913 7642 scope.go:117] "RemoveContainer" containerID="39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4" Mar 19 12:02:24.195577 master-0 kubenswrapper[7642]: E0319 12:02:24.195529 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4\": container with ID starting with 39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4 not found: ID does not exist" containerID="39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4" Mar 19 12:02:24.195641 master-0 kubenswrapper[7642]: I0319 12:02:24.195580 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4"} err="failed to get container status \"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4\": rpc error: code = NotFound desc = could not find container \"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4\": container with ID starting with 39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4 not found: ID does not exist" Mar 19 12:02:24.195641 master-0 kubenswrapper[7642]: I0319 12:02:24.195611 7642 scope.go:117] "RemoveContainer" containerID="2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1" Mar 19 12:02:24.196159 master-0 kubenswrapper[7642]: E0319 12:02:24.196105 7642 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1\": container with ID starting with 2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1 not found: ID does not exist" containerID="2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1" Mar 19 12:02:24.196159 master-0 kubenswrapper[7642]: I0319 12:02:24.196143 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1"} err="failed to get container status \"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1\": rpc error: code = NotFound desc = could not find container \"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1\": container with ID starting with 2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1 not found: ID does not exist" Mar 19 12:02:24.196159 master-0 kubenswrapper[7642]: I0319 12:02:24.196165 7642 scope.go:117] "RemoveContainer" containerID="2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836" Mar 19 12:02:24.196801 master-0 kubenswrapper[7642]: E0319 12:02:24.196765 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836\": container with ID starting with 2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836 not found: ID does not exist" containerID="2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836" Mar 19 12:02:24.196801 master-0 kubenswrapper[7642]: I0319 12:02:24.196788 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836"} err="failed to get container status \"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836\": rpc 
error: code = NotFound desc = could not find container \"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836\": container with ID starting with 2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836 not found: ID does not exist" Mar 19 12:02:24.196909 master-0 kubenswrapper[7642]: I0319 12:02:24.196802 7642 scope.go:117] "RemoveContainer" containerID="2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75" Mar 19 12:02:24.197326 master-0 kubenswrapper[7642]: E0319 12:02:24.197281 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75\": container with ID starting with 2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75 not found: ID does not exist" containerID="2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75" Mar 19 12:02:24.197399 master-0 kubenswrapper[7642]: I0319 12:02:24.197345 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75"} err="failed to get container status \"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75\": rpc error: code = NotFound desc = could not find container \"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75\": container with ID starting with 2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75 not found: ID does not exist" Mar 19 12:02:24.197399 master-0 kubenswrapper[7642]: I0319 12:02:24.197390 7642 scope.go:117] "RemoveContainer" containerID="39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4" Mar 19 12:02:24.197907 master-0 kubenswrapper[7642]: I0319 12:02:24.197868 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4"} err="failed to get container status 
\"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4\": rpc error: code = NotFound desc = could not find container \"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4\": container with ID starting with 39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4 not found: ID does not exist" Mar 19 12:02:24.197971 master-0 kubenswrapper[7642]: I0319 12:02:24.197912 7642 scope.go:117] "RemoveContainer" containerID="2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1" Mar 19 12:02:24.198265 master-0 kubenswrapper[7642]: I0319 12:02:24.198234 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1"} err="failed to get container status \"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1\": rpc error: code = NotFound desc = could not find container \"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1\": container with ID starting with 2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1 not found: ID does not exist" Mar 19 12:02:24.198265 master-0 kubenswrapper[7642]: I0319 12:02:24.198256 7642 scope.go:117] "RemoveContainer" containerID="2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836" Mar 19 12:02:24.198518 master-0 kubenswrapper[7642]: I0319 12:02:24.198489 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836"} err="failed to get container status \"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836\": rpc error: code = NotFound desc = could not find container \"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836\": container with ID starting with 2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836 not found: ID does not exist" Mar 19 12:02:24.198518 master-0 kubenswrapper[7642]: I0319 12:02:24.198512 7642 
scope.go:117] "RemoveContainer" containerID="2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75" Mar 19 12:02:24.198839 master-0 kubenswrapper[7642]: I0319 12:02:24.198808 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75"} err="failed to get container status \"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75\": rpc error: code = NotFound desc = could not find container \"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75\": container with ID starting with 2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75 not found: ID does not exist" Mar 19 12:02:24.198839 master-0 kubenswrapper[7642]: I0319 12:02:24.198828 7642 scope.go:117] "RemoveContainer" containerID="39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4" Mar 19 12:02:24.199131 master-0 kubenswrapper[7642]: I0319 12:02:24.199092 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4"} err="failed to get container status \"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4\": rpc error: code = NotFound desc = could not find container \"39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4\": container with ID starting with 39da4807ddbd885e6dfea872998ad3ba9b4c2f581d0171b5a5497e6d13ec82d4 not found: ID does not exist" Mar 19 12:02:24.199131 master-0 kubenswrapper[7642]: I0319 12:02:24.199120 7642 scope.go:117] "RemoveContainer" containerID="2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1" Mar 19 12:02:24.199377 master-0 kubenswrapper[7642]: I0319 12:02:24.199328 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1"} err="failed to get container status 
\"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1\": rpc error: code = NotFound desc = could not find container \"2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1\": container with ID starting with 2cfc174d7f47c682f1bdcffd97c5118afb43b422106553ed00a33019317d05b1 not found: ID does not exist" Mar 19 12:02:24.199377 master-0 kubenswrapper[7642]: I0319 12:02:24.199368 7642 scope.go:117] "RemoveContainer" containerID="2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836" Mar 19 12:02:24.199676 master-0 kubenswrapper[7642]: I0319 12:02:24.199629 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836"} err="failed to get container status \"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836\": rpc error: code = NotFound desc = could not find container \"2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836\": container with ID starting with 2f8e75b8b24ed097ee80eef279cc718daf2bb409ad72e81f9edcdf82111c3836 not found: ID does not exist" Mar 19 12:02:24.199676 master-0 kubenswrapper[7642]: I0319 12:02:24.199672 7642 scope.go:117] "RemoveContainer" containerID="2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75" Mar 19 12:02:24.199996 master-0 kubenswrapper[7642]: I0319 12:02:24.199961 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75"} err="failed to get container status \"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75\": rpc error: code = NotFound desc = could not find container \"2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75\": container with ID starting with 2823ec7cd09e00428c22451b6970db550a6218ad890b00230fb99df3b69fcf75 not found: ID does not exist" Mar 19 12:02:24.972300 master-0 kubenswrapper[7642]: I0319 12:02:24.972232 7642 
patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:24.972300 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:24.972300 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:24.972300 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:24.972746 master-0 kubenswrapper[7642]: I0319 12:02:24.972318 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:25.387986 master-0 kubenswrapper[7642]: I0319 12:02:25.387930 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:02:25.499212 master-0 kubenswrapper[7642]: I0319 12:02:25.499130 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-var-lock\") pod \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " Mar 19 12:02:25.499935 master-0 kubenswrapper[7642]: I0319 12:02:25.499887 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-var-lock" (OuterVolumeSpecName: "var-lock") pod "72c2ed5a-4b8d-4528-a230-fe608ea55ad3" (UID: "72c2ed5a-4b8d-4528-a230-fe608ea55ad3"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:02:25.500715 master-0 kubenswrapper[7642]: I0319 12:02:25.500676 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kube-api-access\") pod \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " Mar 19 12:02:25.501096 master-0 kubenswrapper[7642]: I0319 12:02:25.501058 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kubelet-dir\") pod \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\" (UID: \"72c2ed5a-4b8d-4528-a230-fe608ea55ad3\") " Mar 19 12:02:25.501201 master-0 kubenswrapper[7642]: I0319 12:02:25.501180 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "72c2ed5a-4b8d-4528-a230-fe608ea55ad3" (UID: "72c2ed5a-4b8d-4528-a230-fe608ea55ad3"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:02:25.501954 master-0 kubenswrapper[7642]: I0319 12:02:25.501897 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:02:25.501954 master-0 kubenswrapper[7642]: I0319 12:02:25.501930 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:02:25.503244 master-0 kubenswrapper[7642]: I0319 12:02:25.503194 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "72c2ed5a-4b8d-4528-a230-fe608ea55ad3" (UID: "72c2ed5a-4b8d-4528-a230-fe608ea55ad3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:02:25.605627 master-0 kubenswrapper[7642]: I0319 12:02:25.605043 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72c2ed5a-4b8d-4528-a230-fe608ea55ad3-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:02:25.971682 master-0 kubenswrapper[7642]: I0319 12:02:25.971604 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:25.971682 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:25.971682 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:25.971682 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:25.972023 master-0 kubenswrapper[7642]: I0319 12:02:25.971696 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:26.132181 master-0 kubenswrapper[7642]: I0319 12:02:26.132101 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"72c2ed5a-4b8d-4528-a230-fe608ea55ad3","Type":"ContainerDied","Data":"7ebabbcc152c2590dc658fb1f46bd235f46220d5cc0dd8b01757f71ab4189d17"} Mar 19 12:02:26.132181 master-0 kubenswrapper[7642]: I0319 12:02:26.132172 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ebabbcc152c2590dc658fb1f46bd235f46220d5cc0dd8b01757f71ab4189d17" Mar 19 12:02:26.132519 master-0 kubenswrapper[7642]: I0319 12:02:26.132272 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 19 12:02:26.972203 master-0 kubenswrapper[7642]: I0319 12:02:26.972119 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:26.972203 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:26.972203 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:26.972203 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:26.972823 master-0 kubenswrapper[7642]: I0319 12:02:26.972225 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:27.972320 master-0 kubenswrapper[7642]: I0319 12:02:27.972228 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:27.972320 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:27.972320 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:27.972320 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:27.972320 master-0 kubenswrapper[7642]: I0319 12:02:27.972317 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:28.972387 master-0 kubenswrapper[7642]: I0319 12:02:28.972290 7642 patch_prober.go:28] interesting 
pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:28.972387 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:28.972387 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:28.972387 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:28.973558 master-0 kubenswrapper[7642]: I0319 12:02:28.972391 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:29.972009 master-0 kubenswrapper[7642]: I0319 12:02:29.971936 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:29.972009 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:29.972009 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:29.972009 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:29.972416 master-0 kubenswrapper[7642]: I0319 12:02:29.972017 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:30.971396 master-0 kubenswrapper[7642]: I0319 12:02:30.971317 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 
12:02:30.971396 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:30.971396 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:30.971396 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:30.971396 master-0 kubenswrapper[7642]: I0319 12:02:30.971402 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:31.970854 master-0 kubenswrapper[7642]: I0319 12:02:31.970771 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:31.970854 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:31.970854 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:31.970854 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:31.970854 master-0 kubenswrapper[7642]: I0319 12:02:31.970855 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:32.972023 master-0 kubenswrapper[7642]: I0319 12:02:32.971917 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:32.972023 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:32.972023 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:32.972023 master-0 kubenswrapper[7642]: healthz 
check failed Mar 19 12:02:32.972872 master-0 kubenswrapper[7642]: I0319 12:02:32.972051 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:33.972300 master-0 kubenswrapper[7642]: I0319 12:02:33.972178 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:33.972300 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:33.972300 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:33.972300 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:33.972896 master-0 kubenswrapper[7642]: I0319 12:02:33.972368 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:34.971380 master-0 kubenswrapper[7642]: I0319 12:02:34.971333 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:34.971380 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:34.971380 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:34.971380 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:34.971758 master-0 kubenswrapper[7642]: I0319 12:02:34.971393 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:35.067740 master-0 kubenswrapper[7642]: I0319 12:02:35.067656 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:35.089113 master-0 kubenswrapper[7642]: I0319 12:02:35.089042 7642 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="3abdd7cf-4ac8-4bca-a386-0765fd3ba7ea" Mar 19 12:02:35.089113 master-0 kubenswrapper[7642]: I0319 12:02:35.089099 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="3abdd7cf-4ac8-4bca-a386-0765fd3ba7ea" Mar 19 12:02:35.107463 master-0 kubenswrapper[7642]: I0319 12:02:35.107394 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 12:02:35.122095 master-0 kubenswrapper[7642]: I0319 12:02:35.122036 7642 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:35.127529 master-0 kubenswrapper[7642]: I0319 12:02:35.127446 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 12:02:35.138006 master-0 kubenswrapper[7642]: I0319 12:02:35.137956 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:35.146025 master-0 kubenswrapper[7642]: I0319 12:02:35.145967 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 19 12:02:35.165871 master-0 kubenswrapper[7642]: W0319 12:02:35.165821 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e27b7d086edf5d2cf47b703574641d8.slice/crio-d9bb5c5486d0d4e5249742ebeb70024222a47fabc60cb40b1224fe9ecd95ab7a WatchSource:0}: Error finding container d9bb5c5486d0d4e5249742ebeb70024222a47fabc60cb40b1224fe9ecd95ab7a: Status 404 returned error can't find the container with id d9bb5c5486d0d4e5249742ebeb70024222a47fabc60cb40b1224fe9ecd95ab7a Mar 19 12:02:35.199653 master-0 kubenswrapper[7642]: I0319 12:02:35.199580 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"d9bb5c5486d0d4e5249742ebeb70024222a47fabc60cb40b1224fe9ecd95ab7a"} Mar 19 12:02:35.972409 master-0 kubenswrapper[7642]: I0319 12:02:35.972319 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:35.972409 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:35.972409 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:35.972409 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:35.972409 master-0 kubenswrapper[7642]: I0319 12:02:35.972425 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:36.207471 master-0 kubenswrapper[7642]: I0319 12:02:36.207383 7642 generic.go:334] "Generic (PLEG): container finished" podID="8e27b7d086edf5d2cf47b703574641d8" containerID="820218d0107a773549d9d05e3670e6a7a1f29fee796eba888ebf85af0aa9baa7" exitCode=0 Mar 19 12:02:36.208545 master-0 kubenswrapper[7642]: I0319 12:02:36.207471 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerDied","Data":"820218d0107a773549d9d05e3670e6a7a1f29fee796eba888ebf85af0aa9baa7"} Mar 19 12:02:36.971162 master-0 kubenswrapper[7642]: I0319 12:02:36.971081 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:36.971162 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:36.971162 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:36.971162 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:36.971469 master-0 kubenswrapper[7642]: I0319 12:02:36.971199 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:37.217538 master-0 kubenswrapper[7642]: I0319 12:02:37.217476 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"31cfcd5f067b44ea0b87c0a5aa2636c319389034426644966114a2ca89dbdeac"} Mar 19 12:02:37.217538 master-0 kubenswrapper[7642]: I0319 12:02:37.217535 7642 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"27f0975af44b0e0c4b5a0b780624b9d03d718a16509a17ac2eda6856f398d1f3"} Mar 19 12:02:37.218172 master-0 kubenswrapper[7642]: I0319 12:02:37.217551 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"f043145314a3afe735f976ccbd7c9f8a4f3dcf66cb61565878dcf3c6a12fe880"} Mar 19 12:02:37.218172 master-0 kubenswrapper[7642]: I0319 12:02:37.217680 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:02:37.243657 master-0 kubenswrapper[7642]: I0319 12:02:37.243558 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.243537504 podStartE2EDuration="2.243537504s" podCreationTimestamp="2026-03-19 12:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:02:37.23955652 +0000 UTC m=+515.356997307" watchObservedRunningTime="2026-03-19 12:02:37.243537504 +0000 UTC m=+515.360978291" Mar 19 12:02:37.971401 master-0 kubenswrapper[7642]: I0319 12:02:37.971325 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:37.971401 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:37.971401 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:37.971401 master-0 kubenswrapper[7642]: healthz check failed Mar 19 
12:02:37.971401 master-0 kubenswrapper[7642]: I0319 12:02:37.971390 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:38.971327 master-0 kubenswrapper[7642]: I0319 12:02:38.971256 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:38.971327 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:38.971327 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:38.971327 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:38.971941 master-0 kubenswrapper[7642]: I0319 12:02:38.971348 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:39.971923 master-0 kubenswrapper[7642]: I0319 12:02:39.971855 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:39.971923 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:39.971923 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:39.971923 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:39.972508 master-0 kubenswrapper[7642]: I0319 12:02:39.971930 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:40.972062 master-0 kubenswrapper[7642]: I0319 12:02:40.971982 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:40.972062 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:40.972062 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:40.972062 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:40.973053 master-0 kubenswrapper[7642]: I0319 12:02:40.972092 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:41.972763 master-0 kubenswrapper[7642]: I0319 12:02:41.972665 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:41.972763 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:41.972763 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:41.972763 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:41.972763 master-0 kubenswrapper[7642]: I0319 12:02:41.972743 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:42.972166 master-0 kubenswrapper[7642]: I0319 12:02:42.972048 7642 
patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:42.972166 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:42.972166 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:42.972166 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:42.972616 master-0 kubenswrapper[7642]: I0319 12:02:42.972198 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:43.971601 master-0 kubenswrapper[7642]: I0319 12:02:43.971522 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:43.971601 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:43.971601 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:43.971601 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:43.972210 master-0 kubenswrapper[7642]: I0319 12:02:43.971613 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:44.972805 master-0 kubenswrapper[7642]: I0319 12:02:44.972706 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:44.972805 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:44.972805 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:44.972805 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:44.973809 master-0 kubenswrapper[7642]: I0319 12:02:44.972812 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:45.972596 master-0 kubenswrapper[7642]: I0319 12:02:45.972503 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:45.972596 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:45.972596 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:45.972596 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:45.973755 master-0 kubenswrapper[7642]: I0319 12:02:45.972608 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:46.972111 master-0 kubenswrapper[7642]: I0319 12:02:46.972007 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:46.972111 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:46.972111 master-0 kubenswrapper[7642]: [+]process-running ok 
Mar 19 12:02:46.972111 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:46.972581 master-0 kubenswrapper[7642]: I0319 12:02:46.972153 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:47.972091 master-0 kubenswrapper[7642]: I0319 12:02:47.972019 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:47.972091 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:47.972091 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:47.972091 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:47.973572 master-0 kubenswrapper[7642]: I0319 12:02:47.973521 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:02:48.971397 master-0 kubenswrapper[7642]: I0319 12:02:48.971343 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:02:48.971397 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:02:48.971397 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:02:48.971397 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:02:48.971703 master-0 kubenswrapper[7642]: I0319 12:02:48.971421 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:49.350988 master-0 kubenswrapper[7642]: I0319 12:02:49.350819 7642 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 19 12:02:49.351655 master-0 kubenswrapper[7642]: I0319 12:02:49.351333 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl" containerID="cri-o://e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1" gracePeriod=30
Mar 19 12:02:49.351655 master-0 kubenswrapper[7642]: I0319 12:02:49.351553 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev" containerID="cri-o://5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802" gracePeriod=30
Mar 19 12:02:49.351655 master-0 kubenswrapper[7642]: I0319 12:02:49.351618 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz" containerID="cri-o://4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574" gracePeriod=30
Mar 19 12:02:49.351811 master-0 kubenswrapper[7642]: I0319 12:02:49.351699 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics" containerID="cri-o://7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad" gracePeriod=30
Mar 19 12:02:49.351811 master-0 kubenswrapper[7642]: I0319 12:02:49.351756 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd" containerID="cri-o://fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b" gracePeriod=30
Mar 19 12:02:49.355190 master-0 kubenswrapper[7642]: I0319 12:02:49.355164 7642 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"]
Mar 19 12:02:49.355610 master-0 kubenswrapper[7642]: E0319 12:02:49.355590 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 12:02:49.355757 master-0 kubenswrapper[7642]: I0319 12:02:49.355740 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 12:02:49.355862 master-0 kubenswrapper[7642]: E0319 12:02:49.355848 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 12:02:49.355946 master-0 kubenswrapper[7642]: I0319 12:02:49.355933 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 12:02:49.356034 master-0 kubenswrapper[7642]: E0319 12:02:49.356021 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 12:02:49.356149 master-0 kubenswrapper[7642]: I0319 12:02:49.356131 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 12:02:49.356266 master-0 kubenswrapper[7642]: E0319 12:02:49.356247 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c2ed5a-4b8d-4528-a230-fe608ea55ad3" containerName="installer"
Mar 19 12:02:49.361102 master-0 kubenswrapper[7642]: I0319 12:02:49.361070 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c2ed5a-4b8d-4528-a230-fe608ea55ad3" containerName="installer"
Mar 19 12:02:49.361341 master-0 kubenswrapper[7642]: E0319 12:02:49.361318 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 12:02:49.361468 master-0 kubenswrapper[7642]: I0319 12:02:49.361449 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 12:02:49.361584 master-0 kubenswrapper[7642]: E0319 12:02:49.361566 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 12:02:49.364959 master-0 kubenswrapper[7642]: I0319 12:02:49.364929 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-ensure-env-vars"
Mar 19 12:02:49.365191 master-0 kubenswrapper[7642]: E0319 12:02:49.365162 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 12:02:49.365311 master-0 kubenswrapper[7642]: I0319 12:02:49.365292 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 12:02:49.365440 master-0 kubenswrapper[7642]: E0319 12:02:49.365420 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 12:02:49.365568 master-0 kubenswrapper[7642]: I0319 12:02:49.365548 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-resources-copy"
Mar 19 12:02:49.365710 master-0 kubenswrapper[7642]: E0319 12:02:49.365689 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 12:02:49.365829 master-0 kubenswrapper[7642]: I0319 12:02:49.365810 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="setup"
Mar 19 12:02:49.366259 master-0 kubenswrapper[7642]: I0319 12:02:49.366232 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-metrics"
Mar 19 12:02:49.366375 master-0 kubenswrapper[7642]: I0319 12:02:49.366361 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-rev"
Mar 19 12:02:49.366586 master-0 kubenswrapper[7642]: I0319 12:02:49.366570 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcdctl"
Mar 19 12:02:49.366719 master-0 kubenswrapper[7642]: I0319 12:02:49.366702 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd"
Mar 19 12:02:49.366842 master-0 kubenswrapper[7642]: I0319 12:02:49.366821 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="72c2ed5a-4b8d-4528-a230-fe608ea55ad3" containerName="installer"
Mar 19 12:02:49.366965 master-0 kubenswrapper[7642]: I0319 12:02:49.366945 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b4ed170d527099878cb5fdd508a2fb" containerName="etcd-readyz"
Mar 19 12:02:49.548609 master-0 kubenswrapper[7642]: I0319 12:02:49.548528 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.548954 master-0 kubenswrapper[7642]: I0319 12:02:49.548699 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.548954 master-0 kubenswrapper[7642]: I0319 12:02:49.548876 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.549237 master-0 kubenswrapper[7642]: I0319 12:02:49.548952 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.549237 master-0 kubenswrapper[7642]: I0319 12:02:49.549028 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.549237 master-0 kubenswrapper[7642]: I0319 12:02:49.549132 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.650463 master-0 kubenswrapper[7642]: I0319 12:02:49.650403 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.650463 master-0 kubenswrapper[7642]: I0319 12:02:49.650467 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.650793 master-0 kubenswrapper[7642]: I0319 12:02:49.650497 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.650793 master-0 kubenswrapper[7642]: I0319 12:02:49.650551 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.650793 master-0 kubenswrapper[7642]: I0319 12:02:49.650596 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.650793 master-0 kubenswrapper[7642]: I0319 12:02:49.650619 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.650793 master-0 kubenswrapper[7642]: I0319 12:02:49.650703 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.650793 master-0 kubenswrapper[7642]: I0319 12:02:49.650749 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.651211 master-0 kubenswrapper[7642]: I0319 12:02:49.650803 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.651211 master-0 kubenswrapper[7642]: I0319 12:02:49.650785 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.651211 master-0 kubenswrapper[7642]: I0319 12:02:49.650768 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.651211 master-0 kubenswrapper[7642]: I0319 12:02:49.650932 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:02:49.971709 master-0 kubenswrapper[7642]: I0319 12:02:49.971528 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:49.971709 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:49.971709 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:49.971709 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:49.972146 master-0 kubenswrapper[7642]: I0319 12:02:49.972106 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:50.320014 master-0 kubenswrapper[7642]: I0319 12:02:50.319886 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log"
Mar 19 12:02:50.321150 master-0 kubenswrapper[7642]: I0319 12:02:50.321114 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log"
Mar 19 12:02:50.324499 master-0 kubenswrapper[7642]: I0319 12:02:50.324450 7642 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802" exitCode=2
Mar 19 12:02:50.324994 master-0 kubenswrapper[7642]: I0319 12:02:50.324968 7642 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574" exitCode=0
Mar 19 12:02:50.325146 master-0 kubenswrapper[7642]: I0319 12:02:50.325124 7642 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad" exitCode=2
Mar 19 12:02:50.971171 master-0 kubenswrapper[7642]: I0319 12:02:50.971118 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:50.971171 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:50.971171 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:50.971171 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:50.972327 master-0 kubenswrapper[7642]: I0319 12:02:50.971178 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:51.971027 master-0 kubenswrapper[7642]: I0319 12:02:51.970927 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:51.971027 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:51.971027 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:51.971027 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:51.971741 master-0 kubenswrapper[7642]: I0319 12:02:51.971061 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:52.972712 master-0 kubenswrapper[7642]: I0319 12:02:52.972589 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:52.972712 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:52.972712 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:52.972712 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:52.973935 master-0 kubenswrapper[7642]: I0319 12:02:52.972711 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:53.971356 master-0 kubenswrapper[7642]: I0319 12:02:53.971282 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:53.971356 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:53.971356 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:53.971356 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:53.971861 master-0 kubenswrapper[7642]: I0319 12:02:53.971356 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:54.971462 master-0 kubenswrapper[7642]: I0319 12:02:54.971399 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:54.971462 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:54.971462 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:54.971462 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:54.972060 master-0 kubenswrapper[7642]: I0319 12:02:54.971481 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:55.972347 master-0 kubenswrapper[7642]: I0319 12:02:55.972268 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:55.972347 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:55.972347 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:55.972347 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:55.972347 master-0 kubenswrapper[7642]: I0319 12:02:55.972348 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:56.971098 master-0 kubenswrapper[7642]: I0319 12:02:56.971015 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:56.971098 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:56.971098 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:56.971098 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:56.971098 master-0 kubenswrapper[7642]: I0319 12:02:56.971090 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:57.971955 master-0 kubenswrapper[7642]: I0319 12:02:57.971882 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:57.971955 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:57.971955 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:57.971955 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:57.972566 master-0 kubenswrapper[7642]: I0319 12:02:57.971992 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:58.972534 master-0 kubenswrapper[7642]: I0319 12:02:58.972435 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:58.972534 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:58.972534 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:58.972534 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:58.973425 master-0 kubenswrapper[7642]: I0319 12:02:58.972551 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:02:59.972203 master-0 kubenswrapper[7642]: I0319 12:02:59.972136 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:02:59.972203 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:02:59.972203 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:02:59.972203 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:02:59.972602 master-0 kubenswrapper[7642]: I0319 12:02:59.972209 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:00.974775 master-0 kubenswrapper[7642]: I0319 12:03:00.974684 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:00.974775 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:00.974775 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:00.974775 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:00.975353 master-0 kubenswrapper[7642]: I0319 12:03:00.974830 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:01.972515 master-0 kubenswrapper[7642]: I0319 12:03:01.972451 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:01.972515 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:01.972515 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:01.972515 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:01.972931 master-0 kubenswrapper[7642]: I0319 12:03:01.972529 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:02.414349 master-0 kubenswrapper[7642]: I0319 12:03:02.414284 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log"
Mar 19 12:03:02.414936 master-0 kubenswrapper[7642]: I0319 12:03:02.414365 7642 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="4d4b4168db53bd2fbee264c121d01b11e0ff09bed322f776f7a2db8af96a040f" exitCode=1
Mar 19 12:03:02.414936 master-0 kubenswrapper[7642]: I0319 12:03:02.414408 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"4d4b4168db53bd2fbee264c121d01b11e0ff09bed322f776f7a2db8af96a040f"}
Mar 19 12:03:02.415183 master-0 kubenswrapper[7642]: I0319 12:03:02.415126 7642 scope.go:117] "RemoveContainer" containerID="4d4b4168db53bd2fbee264c121d01b11e0ff09bed322f776f7a2db8af96a040f"
Mar 19 12:03:02.683672 master-0 kubenswrapper[7642]: E0319 12:03:02.683597 7642 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:03:02.972025 master-0 kubenswrapper[7642]: I0319 12:03:02.971984 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:02.972025 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:02.972025 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:02.972025 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:02.972433 master-0 kubenswrapper[7642]: I0319 12:03:02.972404 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:03.423244 master-0 kubenswrapper[7642]: I0319 12:03:03.423187 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log"
Mar 19 12:03:03.423244 master-0 kubenswrapper[7642]: I0319 12:03:03.423254 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"dfc697cd77881315e28c4429b102a58e90448d7960e79b2437cb6fdd1f387a62"}
Mar 19 12:03:03.972293 master-0 kubenswrapper[7642]: I0319 12:03:03.972157 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:03.972293 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:03.972293 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:03.972293 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:03.972601 master-0 kubenswrapper[7642]: I0319 12:03:03.972307 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:04.431607 master-0 kubenswrapper[7642]: I0319 12:03:04.431539 7642 generic.go:334] "Generic (PLEG): container finished" podID="44d258c9-f8b3-4233-847f-bc594a2aa4f9" containerID="64567b7ee5dbb1259b892f9e1f4fd87cb14b22e497b596ff7c977ffde93bc8bd" exitCode=0
Mar 19 12:03:04.431607 master-0 kubenswrapper[7642]: I0319 12:03:04.431605 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"44d258c9-f8b3-4233-847f-bc594a2aa4f9","Type":"ContainerDied","Data":"64567b7ee5dbb1259b892f9e1f4fd87cb14b22e497b596ff7c977ffde93bc8bd"}
Mar 19 12:03:04.971515 master-0 kubenswrapper[7642]: I0319 12:03:04.971427 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:04.971515 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:04.971515 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:04.971515 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:04.971898 master-0 kubenswrapper[7642]: I0319 12:03:04.971540 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:05.135044 master-0 kubenswrapper[7642]: I0319 12:03:05.134943 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:03:05.135044 master-0 kubenswrapper[7642]: I0319 12:03:05.135027 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:03:05.143870 master-0 kubenswrapper[7642]: I0319 12:03:05.143795 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:03:05.709735 master-0 kubenswrapper[7642]: I0319 12:03:05.709605 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 19 12:03:05.889479 master-0 kubenswrapper[7642]: I0319 12:03:05.889404 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kube-api-access\") pod \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") "
Mar 19 12:03:05.889807 master-0 kubenswrapper[7642]: I0319 12:03:05.889600 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-var-lock\") pod \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") "
Mar 19 12:03:05.889807 master-0 kubenswrapper[7642]: I0319 12:03:05.889625 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kubelet-dir\") pod \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\" (UID: \"44d258c9-f8b3-4233-847f-bc594a2aa4f9\") "
Mar 19 12:03:05.889807 master-0 kubenswrapper[7642]: I0319 12:03:05.889736 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-var-lock" (OuterVolumeSpecName: "var-lock") pod "44d258c9-f8b3-4233-847f-bc594a2aa4f9" (UID: "44d258c9-f8b3-4233-847f-bc594a2aa4f9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:03:05.889964 master-0 kubenswrapper[7642]: I0319 12:03:05.889816 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "44d258c9-f8b3-4233-847f-bc594a2aa4f9" (UID: "44d258c9-f8b3-4233-847f-bc594a2aa4f9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:03:05.890018 master-0 kubenswrapper[7642]: I0319 12:03:05.889966 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 12:03:05.890018 master-0 kubenswrapper[7642]: I0319 12:03:05.889980 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:03:05.893058 master-0 kubenswrapper[7642]: I0319 12:03:05.892966 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "44d258c9-f8b3-4233-847f-bc594a2aa4f9" (UID: "44d258c9-f8b3-4233-847f-bc594a2aa4f9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:03:05.972961 master-0 kubenswrapper[7642]: I0319 12:03:05.972726 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:05.972961 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:05.972961 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:05.972961 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:05.972961 master-0 kubenswrapper[7642]: I0319 12:03:05.972824 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:05.991571 master-0 kubenswrapper[7642]: I0319 12:03:05.991480 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44d258c9-f8b3-4233-847f-bc594a2aa4f9-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 12:03:06.435260 master-0 kubenswrapper[7642]: I0319 12:03:06.435200 7642 scope.go:117] "RemoveContainer" containerID="78b8de87e8a5c898566cfde9569af5051ed935aae3421d721d98339159c91a61"
Mar 19 12:03:06.447971 master-0 kubenswrapper[7642]: I0319 12:03:06.447898 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"44d258c9-f8b3-4233-847f-bc594a2aa4f9","Type":"ContainerDied","Data":"31ec56fb2316131201c360b4c07233afd130ca3089ceb18077f9033121d6ab17"}
Mar 19 12:03:06.447971 master-0 kubenswrapper[7642]: I0319 12:03:06.447957 7642 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ec56fb2316131201c360b4c07233afd130ca3089ceb18077f9033121d6ab17"
Mar 19 12:03:06.448492 master-0 kubenswrapper[7642]: I0319 12:03:06.448450 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 19 12:03:06.974748 master-0 kubenswrapper[7642]: I0319 12:03:06.974686 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:06.974748 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:06.974748 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:06.974748 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:06.975437 master-0 kubenswrapper[7642]: I0319 12:03:06.974780 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:07.972103 master-0 kubenswrapper[7642]: I0319 12:03:07.972034 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:07.972103 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:07.972103 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:07.972103 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:07.972461 master-0 kubenswrapper[7642]: I0319 12:03:07.972158 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:08.972060 master-0 kubenswrapper[7642]: I0319 12:03:08.971978 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:08.972060 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:08.972060 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:08.972060 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:08.972816 master-0 kubenswrapper[7642]: I0319 12:03:08.972080 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:09.972973 master-0 kubenswrapper[7642]: I0319 12:03:09.972912 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:09.972973 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:09.972973 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:09.972973 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:09.973870 master-0 kubenswrapper[7642]: I0319 12:03:09.972983 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:10.972008 master-0 kubenswrapper[7642]: I0319 12:03:10.971883 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed
with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:10.972008 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:10.972008 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:10.972008 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:10.972611 master-0 kubenswrapper[7642]: I0319 12:03:10.972018 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:11.972733 master-0 kubenswrapper[7642]: I0319 12:03:11.972618 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:11.972733 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:11.972733 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:11.972733 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:11.973566 master-0 kubenswrapper[7642]: I0319 12:03:11.972738 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:12.684011 master-0 kubenswrapper[7642]: E0319 12:03:12.683930 7642 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:03:12.971564 master-0 kubenswrapper[7642]: I0319 12:03:12.971437 7642 patch_prober.go:28] interesting 
pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:12.971564 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:12.971564 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:12.971564 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:12.971564 master-0 kubenswrapper[7642]: I0319 12:03:12.971516 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:13.971792 master-0 kubenswrapper[7642]: I0319 12:03:13.971700 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:13.971792 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:13.971792 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:13.971792 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:13.972588 master-0 kubenswrapper[7642]: I0319 12:03:13.971796 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:14.971147 master-0 kubenswrapper[7642]: I0319 12:03:14.971071 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 
12:03:14.971147 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:14.971147 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:14.971147 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:14.971546 master-0 kubenswrapper[7642]: I0319 12:03:14.971147 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:15.139416 master-0 kubenswrapper[7642]: I0319 12:03:15.139341 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:03:15.971353 master-0 kubenswrapper[7642]: I0319 12:03:15.971270 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:15.971353 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:15.971353 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:15.971353 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:15.971685 master-0 kubenswrapper[7642]: I0319 12:03:15.971363 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:16.972702 master-0 kubenswrapper[7642]: I0319 12:03:16.972582 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 
19 12:03:16.972702 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:16.972702 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:16.972702 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:16.973534 master-0 kubenswrapper[7642]: I0319 12:03:16.972753 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:17.971353 master-0 kubenswrapper[7642]: I0319 12:03:17.971248 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:17.971353 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:17.971353 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:17.971353 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:17.971353 master-0 kubenswrapper[7642]: I0319 12:03:17.971323 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:18.971711 master-0 kubenswrapper[7642]: I0319 12:03:18.971616 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:18.971711 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:18.971711 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:18.971711 master-0 kubenswrapper[7642]: 
healthz check failed Mar 19 12:03:18.973020 master-0 kubenswrapper[7642]: I0319 12:03:18.972871 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:19.489358 master-0 kubenswrapper[7642]: I0319 12:03:19.489287 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log" Mar 19 12:03:19.490832 master-0 kubenswrapper[7642]: I0319 12:03:19.490780 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log" Mar 19 12:03:19.491840 master-0 kubenswrapper[7642]: I0319 12:03:19.491787 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log" Mar 19 12:03:19.493215 master-0 kubenswrapper[7642]: I0319 12:03:19.493162 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 12:03:19.547007 master-0 kubenswrapper[7642]: I0319 12:03:19.546915 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-rev/0.log" Mar 19 12:03:19.548760 master-0 kubenswrapper[7642]: I0319 12:03:19.548714 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcd-metrics/0.log" Mar 19 12:03:19.550070 master-0 kubenswrapper[7642]: I0319 12:03:19.550015 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_24b4ed170d527099878cb5fdd508a2fb/etcdctl/0.log" Mar 19 12:03:19.552014 master-0 kubenswrapper[7642]: I0319 12:03:19.551941 7642 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b" exitCode=0 Mar 19 12:03:19.552108 master-0 kubenswrapper[7642]: I0319 12:03:19.552014 7642 scope.go:117] "RemoveContainer" containerID="5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802" Mar 19 12:03:19.552108 master-0 kubenswrapper[7642]: I0319 12:03:19.552039 7642 generic.go:334] "Generic (PLEG): container finished" podID="24b4ed170d527099878cb5fdd508a2fb" containerID="e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1" exitCode=137 Mar 19 12:03:19.552108 master-0 kubenswrapper[7642]: I0319 12:03:19.552049 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 12:03:19.569377 master-0 kubenswrapper[7642]: I0319 12:03:19.569299 7642 scope.go:117] "RemoveContainer" containerID="4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574" Mar 19 12:03:19.589959 master-0 kubenswrapper[7642]: I0319 12:03:19.589891 7642 scope.go:117] "RemoveContainer" containerID="7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad" Mar 19 12:03:19.607357 master-0 kubenswrapper[7642]: I0319 12:03:19.607306 7642 scope.go:117] "RemoveContainer" containerID="fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b" Mar 19 12:03:19.610943 master-0 kubenswrapper[7642]: I0319 12:03:19.610892 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:03:19.611052 master-0 kubenswrapper[7642]: I0319 12:03:19.610949 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:03:19.611052 master-0 kubenswrapper[7642]: I0319 12:03:19.610977 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:03:19.611052 master-0 kubenswrapper[7642]: I0319 12:03:19.611003 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: 
\"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:03:19.611052 master-0 kubenswrapper[7642]: I0319 12:03:19.611019 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:03:19.611328 master-0 kubenswrapper[7642]: I0319 12:03:19.611074 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:03:19.611328 master-0 kubenswrapper[7642]: I0319 12:03:19.611102 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:03:19.611328 master-0 kubenswrapper[7642]: I0319 12:03:19.611133 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir" (OuterVolumeSpecName: "data-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:03:19.611328 master-0 kubenswrapper[7642]: I0319 12:03:19.611135 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir" (OuterVolumeSpecName: "log-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:03:19.611328 master-0 kubenswrapper[7642]: I0319 12:03:19.611188 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") pod \"24b4ed170d527099878cb5fdd508a2fb\" (UID: \"24b4ed170d527099878cb5fdd508a2fb\") " Mar 19 12:03:19.611328 master-0 kubenswrapper[7642]: I0319 12:03:19.611208 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:03:19.611328 master-0 kubenswrapper[7642]: I0319 12:03:19.611304 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "24b4ed170d527099878cb5fdd508a2fb" (UID: "24b4ed170d527099878cb5fdd508a2fb"). InnerVolumeSpecName "static-pod-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:03:19.611854 master-0 kubenswrapper[7642]: I0319 12:03:19.611495 7642 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:19.611854 master-0 kubenswrapper[7642]: I0319 12:03:19.611514 7642 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:19.611854 master-0 kubenswrapper[7642]: I0319 12:03:19.611528 7642 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-log-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:19.611854 master-0 kubenswrapper[7642]: I0319 12:03:19.611539 7642 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:19.611854 master-0 kubenswrapper[7642]: I0319 12:03:19.611550 7642 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:19.611854 master-0 kubenswrapper[7642]: I0319 12:03:19.611562 7642 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/24b4ed170d527099878cb5fdd508a2fb-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Mar 19 12:03:19.623594 master-0 kubenswrapper[7642]: I0319 12:03:19.623504 7642 scope.go:117] "RemoveContainer" containerID="e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1" Mar 19 12:03:19.636794 master-0 kubenswrapper[7642]: I0319 12:03:19.636763 7642 scope.go:117] 
"RemoveContainer" containerID="26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e" Mar 19 12:03:19.652305 master-0 kubenswrapper[7642]: I0319 12:03:19.652213 7642 scope.go:117] "RemoveContainer" containerID="8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6" Mar 19 12:03:19.669251 master-0 kubenswrapper[7642]: I0319 12:03:19.669194 7642 scope.go:117] "RemoveContainer" containerID="a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac" Mar 19 12:03:19.687484 master-0 kubenswrapper[7642]: I0319 12:03:19.687395 7642 scope.go:117] "RemoveContainer" containerID="5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802" Mar 19 12:03:19.688069 master-0 kubenswrapper[7642]: E0319 12:03:19.687997 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802\": container with ID starting with 5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802 not found: ID does not exist" containerID="5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802" Mar 19 12:03:19.688145 master-0 kubenswrapper[7642]: I0319 12:03:19.688080 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802"} err="failed to get container status \"5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802\": rpc error: code = NotFound desc = could not find container \"5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802\": container with ID starting with 5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802 not found: ID does not exist" Mar 19 12:03:19.688206 master-0 kubenswrapper[7642]: I0319 12:03:19.688149 7642 scope.go:117] "RemoveContainer" containerID="4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574" Mar 19 12:03:19.688972 master-0 kubenswrapper[7642]: 
E0319 12:03:19.688922 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574\": container with ID starting with 4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574 not found: ID does not exist" containerID="4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574" Mar 19 12:03:19.689045 master-0 kubenswrapper[7642]: I0319 12:03:19.688969 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574"} err="failed to get container status \"4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574\": rpc error: code = NotFound desc = could not find container \"4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574\": container with ID starting with 4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574 not found: ID does not exist" Mar 19 12:03:19.689045 master-0 kubenswrapper[7642]: I0319 12:03:19.688997 7642 scope.go:117] "RemoveContainer" containerID="7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad" Mar 19 12:03:19.689445 master-0 kubenswrapper[7642]: E0319 12:03:19.689403 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad\": container with ID starting with 7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad not found: ID does not exist" containerID="7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad" Mar 19 12:03:19.689445 master-0 kubenswrapper[7642]: I0319 12:03:19.689436 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad"} err="failed to get container status 
\"7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad\": rpc error: code = NotFound desc = could not find container \"7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad\": container with ID starting with 7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad not found: ID does not exist" Mar 19 12:03:19.689532 master-0 kubenswrapper[7642]: I0319 12:03:19.689452 7642 scope.go:117] "RemoveContainer" containerID="fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b" Mar 19 12:03:19.689869 master-0 kubenswrapper[7642]: E0319 12:03:19.689819 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b\": container with ID starting with fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b not found: ID does not exist" containerID="fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b" Mar 19 12:03:19.689968 master-0 kubenswrapper[7642]: I0319 12:03:19.689866 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b"} err="failed to get container status \"fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b\": rpc error: code = NotFound desc = could not find container \"fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b\": container with ID starting with fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b not found: ID does not exist" Mar 19 12:03:19.689968 master-0 kubenswrapper[7642]: I0319 12:03:19.689900 7642 scope.go:117] "RemoveContainer" containerID="e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1" Mar 19 12:03:19.691055 master-0 kubenswrapper[7642]: E0319 12:03:19.690145 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1\": container with ID starting with e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1 not found: ID does not exist" containerID="e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1" Mar 19 12:03:19.691286 master-0 kubenswrapper[7642]: I0319 12:03:19.691202 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1"} err="failed to get container status \"e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1\": rpc error: code = NotFound desc = could not find container \"e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1\": container with ID starting with e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1 not found: ID does not exist" Mar 19 12:03:19.691286 master-0 kubenswrapper[7642]: I0319 12:03:19.691272 7642 scope.go:117] "RemoveContainer" containerID="26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e" Mar 19 12:03:19.691769 master-0 kubenswrapper[7642]: E0319 12:03:19.691723 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e\": container with ID starting with 26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e not found: ID does not exist" containerID="26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e" Mar 19 12:03:19.691834 master-0 kubenswrapper[7642]: I0319 12:03:19.691774 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e"} err="failed to get container status \"26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e\": rpc error: code = NotFound desc = could not find container 
\"26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e\": container with ID starting with 26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e not found: ID does not exist" Mar 19 12:03:19.691834 master-0 kubenswrapper[7642]: I0319 12:03:19.691810 7642 scope.go:117] "RemoveContainer" containerID="8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6" Mar 19 12:03:19.692246 master-0 kubenswrapper[7642]: E0319 12:03:19.692203 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6\": container with ID starting with 8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6 not found: ID does not exist" containerID="8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6" Mar 19 12:03:19.692246 master-0 kubenswrapper[7642]: I0319 12:03:19.692233 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6"} err="failed to get container status \"8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6\": rpc error: code = NotFound desc = could not find container \"8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6\": container with ID starting with 8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6 not found: ID does not exist" Mar 19 12:03:19.692246 master-0 kubenswrapper[7642]: I0319 12:03:19.692249 7642 scope.go:117] "RemoveContainer" containerID="a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac" Mar 19 12:03:19.692592 master-0 kubenswrapper[7642]: E0319 12:03:19.692530 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac\": container with ID starting with 
a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac not found: ID does not exist" containerID="a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac" Mar 19 12:03:19.692592 master-0 kubenswrapper[7642]: I0319 12:03:19.692581 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac"} err="failed to get container status \"a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac\": rpc error: code = NotFound desc = could not find container \"a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac\": container with ID starting with a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac not found: ID does not exist" Mar 19 12:03:19.692592 master-0 kubenswrapper[7642]: I0319 12:03:19.692598 7642 scope.go:117] "RemoveContainer" containerID="5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802" Mar 19 12:03:19.692926 master-0 kubenswrapper[7642]: I0319 12:03:19.692882 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802"} err="failed to get container status \"5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802\": rpc error: code = NotFound desc = could not find container \"5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802\": container with ID starting with 5d5be8d4da643b803861d50f35d0497d40faddd036419d49dfd905d921131802 not found: ID does not exist" Mar 19 12:03:19.692926 master-0 kubenswrapper[7642]: I0319 12:03:19.692915 7642 scope.go:117] "RemoveContainer" containerID="4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574" Mar 19 12:03:19.693227 master-0 kubenswrapper[7642]: I0319 12:03:19.693184 7642 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574"} err="failed to get container status \"4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574\": rpc error: code = NotFound desc = could not find container \"4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574\": container with ID starting with 4cff406b827a7d721ce719a2ae2d919110dafc956c586215dcfb027c19af4574 not found: ID does not exist" Mar 19 12:03:19.693227 master-0 kubenswrapper[7642]: I0319 12:03:19.693213 7642 scope.go:117] "RemoveContainer" containerID="7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad" Mar 19 12:03:19.693493 master-0 kubenswrapper[7642]: I0319 12:03:19.693459 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad"} err="failed to get container status \"7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad\": rpc error: code = NotFound desc = could not find container \"7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad\": container with ID starting with 7e35b28541723cb65b16949644f004d0a6e20471e487df8935799c7458bdfaad not found: ID does not exist" Mar 19 12:03:19.693493 master-0 kubenswrapper[7642]: I0319 12:03:19.693484 7642 scope.go:117] "RemoveContainer" containerID="fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b" Mar 19 12:03:19.693772 master-0 kubenswrapper[7642]: I0319 12:03:19.693735 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b"} err="failed to get container status \"fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b\": rpc error: code = NotFound desc = could not find container \"fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b\": container with ID starting with 
fc008b12add6be232b125878b25d95a40b1808cec72eeda9d001e86096b1dd2b not found: ID does not exist" Mar 19 12:03:19.693772 master-0 kubenswrapper[7642]: I0319 12:03:19.693760 7642 scope.go:117] "RemoveContainer" containerID="e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1" Mar 19 12:03:19.694096 master-0 kubenswrapper[7642]: I0319 12:03:19.694051 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1"} err="failed to get container status \"e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1\": rpc error: code = NotFound desc = could not find container \"e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1\": container with ID starting with e2ee95e94096b9434531e794b1c6573bebb2e81e8f5c022b77577c1dc88b5ba1 not found: ID does not exist" Mar 19 12:03:19.694096 master-0 kubenswrapper[7642]: I0319 12:03:19.694077 7642 scope.go:117] "RemoveContainer" containerID="26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e" Mar 19 12:03:19.694389 master-0 kubenswrapper[7642]: I0319 12:03:19.694353 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e"} err="failed to get container status \"26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e\": rpc error: code = NotFound desc = could not find container \"26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e\": container with ID starting with 26a2105e55f59027d483f315e021ae4fd824772d68464e460caec7b25dcc636e not found: ID does not exist" Mar 19 12:03:19.694389 master-0 kubenswrapper[7642]: I0319 12:03:19.694377 7642 scope.go:117] "RemoveContainer" containerID="8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6" Mar 19 12:03:19.694703 master-0 kubenswrapper[7642]: I0319 12:03:19.694665 7642 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6"} err="failed to get container status \"8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6\": rpc error: code = NotFound desc = could not find container \"8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6\": container with ID starting with 8f3ee40357f1bc6b2002acabcd78d23771b789302aa668f7bce31203af71a4a6 not found: ID does not exist" Mar 19 12:03:19.694703 master-0 kubenswrapper[7642]: I0319 12:03:19.694697 7642 scope.go:117] "RemoveContainer" containerID="a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac" Mar 19 12:03:19.695041 master-0 kubenswrapper[7642]: I0319 12:03:19.694973 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac"} err="failed to get container status \"a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac\": rpc error: code = NotFound desc = could not find container \"a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac\": container with ID starting with a29784b270bbd368e8ca0c95b28ceb64235c7e76ebf0384b2dc21e6c6d6444ac not found: ID does not exist" Mar 19 12:03:19.972677 master-0 kubenswrapper[7642]: I0319 12:03:19.972546 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:19.972677 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:19.972677 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:19.972677 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:19.973474 master-0 kubenswrapper[7642]: I0319 12:03:19.972704 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:20.075444 master-0 kubenswrapper[7642]: I0319 12:03:20.075368 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b4ed170d527099878cb5fdd508a2fb" path="/var/lib/kubelet/pods/24b4ed170d527099878cb5fdd508a2fb/volumes" Mar 19 12:03:20.970958 master-0 kubenswrapper[7642]: I0319 12:03:20.970893 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:20.970958 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:20.970958 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:20.970958 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:20.971299 master-0 kubenswrapper[7642]: I0319 12:03:20.970984 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:21.972388 master-0 kubenswrapper[7642]: I0319 12:03:21.972299 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:21.972388 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:21.972388 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:21.972388 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:21.972388 master-0 kubenswrapper[7642]: I0319 12:03:21.972381 7642 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:22.685392 master-0 kubenswrapper[7642]: E0319 12:03:22.684791 7642 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:03:22.971388 master-0 kubenswrapper[7642]: I0319 12:03:22.971190 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:22.971388 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:22.971388 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:22.971388 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:22.971388 master-0 kubenswrapper[7642]: I0319 12:03:22.971271 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:23.358699 master-0 kubenswrapper[7642]: E0319 12:03:23.358425 7642 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3c73598481e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Killing,Message:Stopping container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:02:49.351315939 +0000 UTC m=+527.468756736,LastTimestamp:2026-03-19 12:02:49.351315939 +0000 UTC m=+527.468756736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 12:03:23.972455 master-0 kubenswrapper[7642]: I0319 12:03:23.972333 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:23.972455 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:23.972455 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:23.972455 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:23.972455 master-0 kubenswrapper[7642]: I0319 12:03:23.972445 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:24.971878 master-0 kubenswrapper[7642]: I0319 12:03:24.971794 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:24.971878 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:24.971878 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:24.971878 
master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:24.972999 master-0 kubenswrapper[7642]: I0319 12:03:24.971891 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:25.142937 master-0 kubenswrapper[7642]: I0319 12:03:25.142889 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:03:25.972212 master-0 kubenswrapper[7642]: I0319 12:03:25.972049 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:25.972212 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:25.972212 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:25.972212 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:25.972212 master-0 kubenswrapper[7642]: I0319 12:03:25.972125 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:26.971583 master-0 kubenswrapper[7642]: I0319 12:03:26.971487 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:26.971583 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:26.971583 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:26.971583 
master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:26.971583 master-0 kubenswrapper[7642]: I0319 12:03:26.971567 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:27.067994 master-0 kubenswrapper[7642]: I0319 12:03:27.067864 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 19 12:03:27.087971 master-0 kubenswrapper[7642]: I0319 12:03:27.087881 7642 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677" Mar 19 12:03:27.087971 master-0 kubenswrapper[7642]: I0319 12:03:27.087939 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677" Mar 19 12:03:27.971970 master-0 kubenswrapper[7642]: I0319 12:03:27.971815 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:27.971970 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:27.971970 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:27.971970 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:27.971970 master-0 kubenswrapper[7642]: I0319 12:03:27.971903 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:28.971609 master-0 kubenswrapper[7642]: I0319 12:03:28.971511 7642 patch_prober.go:28] interesting 
pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:28.971609 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:28.971609 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:28.971609 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:28.971609 master-0 kubenswrapper[7642]: I0319 12:03:28.971595 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:29.972440 master-0 kubenswrapper[7642]: I0319 12:03:29.972312 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:29.972440 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:29.972440 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:29.972440 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:29.973533 master-0 kubenswrapper[7642]: I0319 12:03:29.972444 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:30.972315 master-0 kubenswrapper[7642]: I0319 12:03:30.972218 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 
12:03:30.972315 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:30.972315 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:30.972315 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:30.973438 master-0 kubenswrapper[7642]: I0319 12:03:30.973395 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:31.971798 master-0 kubenswrapper[7642]: I0319 12:03:31.971721 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:31.971798 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:31.971798 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:31.971798 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:31.972147 master-0 kubenswrapper[7642]: I0319 12:03:31.971803 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:32.685962 master-0 kubenswrapper[7642]: E0319 12:03:32.685774 7642 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:03:32.972345 master-0 kubenswrapper[7642]: I0319 12:03:32.972205 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:32.972345 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:32.972345 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:32.972345 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:32.972780 master-0 kubenswrapper[7642]: I0319 12:03:32.972340 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:33.972457 master-0 kubenswrapper[7642]: I0319 12:03:33.972350 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:33.972457 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:33.972457 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:33.972457 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:33.972457 master-0 kubenswrapper[7642]: I0319 12:03:33.972444 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:34.975562 master-0 kubenswrapper[7642]: I0319 12:03:34.975482 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:34.975562 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 
12:03:34.975562 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:34.975562 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:34.976536 master-0 kubenswrapper[7642]: I0319 12:03:34.975575 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:35.972343 master-0 kubenswrapper[7642]: I0319 12:03:35.972234 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:35.972343 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:35.972343 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:35.972343 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:35.972712 master-0 kubenswrapper[7642]: I0319 12:03:35.972353 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:36.971721 master-0 kubenswrapper[7642]: I0319 12:03:36.971656 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:36.971721 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:36.971721 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:36.971721 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:36.972983 master-0 kubenswrapper[7642]: I0319 12:03:36.971743 
7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:37.972537 master-0 kubenswrapper[7642]: I0319 12:03:37.972466 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:37.972537 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:37.972537 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:37.972537 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:37.973249 master-0 kubenswrapper[7642]: I0319 12:03:37.972557 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:38.972016 master-0 kubenswrapper[7642]: I0319 12:03:38.971929 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:38.972016 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:38.972016 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:38.972016 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:38.972329 master-0 kubenswrapper[7642]: I0319 12:03:38.972052 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Mar 19 12:03:39.972168 master-0 kubenswrapper[7642]: I0319 12:03:39.972077 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:39.972168 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:39.972168 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:39.972168 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:39.972827 master-0 kubenswrapper[7642]: I0319 12:03:39.972190 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:40.972494 master-0 kubenswrapper[7642]: I0319 12:03:40.972415 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:40.972494 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:40.972494 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:40.972494 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:40.973166 master-0 kubenswrapper[7642]: I0319 12:03:40.972498 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:41.971885 master-0 kubenswrapper[7642]: I0319 12:03:41.971793 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:41.971885 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:41.971885 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:41.971885 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:41.971885 master-0 kubenswrapper[7642]: I0319 12:03:41.971882 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:03:42.686282 master-0 kubenswrapper[7642]: E0319 12:03:42.686170 7642 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Mar 19 12:03:42.686282 master-0 kubenswrapper[7642]: I0319 12:03:42.686263 7642 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 12:03:42.972518 master-0 kubenswrapper[7642]: I0319 12:03:42.972392 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:03:42.972518 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:03:42.972518 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:03:42.972518 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:03:42.972812 master-0 kubenswrapper[7642]: I0319 12:03:42.972498 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:43.972681 master-0 kubenswrapper[7642]: I0319 12:03:43.972578 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:43.972681 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:43.972681 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:43.972681 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:43.973756 master-0 kubenswrapper[7642]: I0319 12:03:43.972733 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:44.972594 master-0 kubenswrapper[7642]: I0319 12:03:44.972505 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:44.972594 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:44.972594 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:44.972594 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:44.973313 master-0 kubenswrapper[7642]: I0319 12:03:44.972619 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:45.972087 master-0 kubenswrapper[7642]: I0319 12:03:45.971981 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:45.972087 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:45.972087 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:45.972087 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:45.972553 master-0 kubenswrapper[7642]: I0319 12:03:45.972213 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:46.971922 master-0 kubenswrapper[7642]: I0319 12:03:46.971794 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:46.971922 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:46.971922 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:46.971922 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:46.972934 master-0 kubenswrapper[7642]: I0319 12:03:46.972012 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:47.750174 master-0 kubenswrapper[7642]: I0319 12:03:47.750095 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-47j68_51e2c1d1-a50f-4a0d-8273-440e699b3907/approver/1.log"
Mar 19 12:03:47.751075 master-0 kubenswrapper[7642]: I0319 12:03:47.751010 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-47j68_51e2c1d1-a50f-4a0d-8273-440e699b3907/approver/0.log"
Mar 19 12:03:47.751593 master-0 kubenswrapper[7642]: I0319 12:03:47.751530 7642 generic.go:334] "Generic (PLEG): container finished" podID="51e2c1d1-a50f-4a0d-8273-440e699b3907" containerID="6efb9a3f210b061f188c57eb6303722d321f0f70ab8c3b6975d1007d3830e3ef" exitCode=1
Mar 19 12:03:47.751694 master-0 kubenswrapper[7642]: I0319 12:03:47.751607 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerDied","Data":"6efb9a3f210b061f188c57eb6303722d321f0f70ab8c3b6975d1007d3830e3ef"}
Mar 19 12:03:47.751805 master-0 kubenswrapper[7642]: I0319 12:03:47.751758 7642 scope.go:117] "RemoveContainer" containerID="52b485f2005783105cdf9d2569a5312d319a7c33f7fd721f2c4707505569003a"
Mar 19 12:03:47.752594 master-0 kubenswrapper[7642]: I0319 12:03:47.752562 7642 scope.go:117] "RemoveContainer" containerID="6efb9a3f210b061f188c57eb6303722d321f0f70ab8c3b6975d1007d3830e3ef"
Mar 19 12:03:47.752851 master-0 kubenswrapper[7642]: E0319 12:03:47.752809 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"approver\" with CrashLoopBackOff: \"back-off 10s restarting failed container=approver pod=network-node-identity-47j68_openshift-network-node-identity(51e2c1d1-a50f-4a0d-8273-440e699b3907)\"" pod="openshift-network-node-identity/network-node-identity-47j68" podUID="51e2c1d1-a50f-4a0d-8273-440e699b3907"
Mar 19 12:03:47.972107 master-0 kubenswrapper[7642]: I0319 12:03:47.972001 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:47.972107 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:47.972107 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:47.972107 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:47.973158 master-0 kubenswrapper[7642]: I0319 12:03:47.972119 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:48.813438 master-0 kubenswrapper[7642]: I0319 12:03:48.813362 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-47j68_51e2c1d1-a50f-4a0d-8273-440e699b3907/approver/1.log"
Mar 19 12:03:48.900765 master-0 kubenswrapper[7642]: E0319 12:03:48.900682 7642 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38df702_a256_4f83_90bb_0c0e477a2ce6.slice/crio-266d45496ee879b60eaf981de6871924a55070f988e506147e462e03b0fc4b42.scope\": RecentStats: unable to find data in memory cache]"
Mar 19 12:03:48.973310 master-0 kubenswrapper[7642]: I0319 12:03:48.973226 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:48.973310 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:48.973310 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:48.973310 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:48.974490 master-0 kubenswrapper[7642]: I0319 12:03:48.973325 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:49.822287 master-0 kubenswrapper[7642]: I0319 12:03:49.822204 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/3.log"
Mar 19 12:03:49.823033 master-0 kubenswrapper[7642]: I0319 12:03:49.822959 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/2.log"
Mar 19 12:03:49.823522 master-0 kubenswrapper[7642]: I0319 12:03:49.823447 7642 generic.go:334] "Generic (PLEG): container finished" podID="f38df702-a256-4f83-90bb-0c0e477a2ce6" containerID="266d45496ee879b60eaf981de6871924a55070f988e506147e462e03b0fc4b42" exitCode=1
Mar 19 12:03:49.823893 master-0 kubenswrapper[7642]: I0319 12:03:49.823517 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerDied","Data":"266d45496ee879b60eaf981de6871924a55070f988e506147e462e03b0fc4b42"}
Mar 19 12:03:49.823893 master-0 kubenswrapper[7642]: I0319 12:03:49.823736 7642 scope.go:117] "RemoveContainer" containerID="39cbc32af69e30f6839456a476f8b8093ef24ef3e6922a50809c966eae3795ad"
Mar 19 12:03:49.824345 master-0 kubenswrapper[7642]: I0319 12:03:49.824297 7642 scope.go:117] "RemoveContainer" containerID="266d45496ee879b60eaf981de6871924a55070f988e506147e462e03b0fc4b42"
Mar 19 12:03:49.824611 master-0 kubenswrapper[7642]: E0319 12:03:49.824565 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-59ptg_openshift-ingress-operator(f38df702-a256-4f83-90bb-0c0e477a2ce6)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6"
Mar 19 12:03:49.972626 master-0 kubenswrapper[7642]: I0319 12:03:49.972522 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:49.972626 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:49.972626 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:49.972626 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:49.973261 master-0 kubenswrapper[7642]: I0319 12:03:49.972643 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:50.835494 master-0 kubenswrapper[7642]: I0319 12:03:50.835410 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/3.log"
Mar 19 12:03:50.972662 master-0 kubenswrapper[7642]: I0319 12:03:50.972510 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:50.972662 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:50.972662 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:50.972662 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:50.973083 master-0 kubenswrapper[7642]: I0319 12:03:50.972619 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:51.973187 master-0 kubenswrapper[7642]: I0319 12:03:51.973051 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:51.973187 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:51.973187 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:51.973187 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:51.974411 master-0 kubenswrapper[7642]: I0319 12:03:51.973189 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:52.687912 master-0 kubenswrapper[7642]: E0319 12:03:52.687785 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms"
Mar 19 12:03:52.971582 master-0 kubenswrapper[7642]: I0319 12:03:52.971382 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:52.971582 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:52.971582 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:52.971582 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:52.971582 master-0 kubenswrapper[7642]: I0319 12:03:52.971500 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:53.972412 master-0 kubenswrapper[7642]: I0319 12:03:53.972314 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:53.972412 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:53.972412 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:53.972412 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:53.973475 master-0 kubenswrapper[7642]: I0319 12:03:53.972409 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:54.990396 master-0 kubenswrapper[7642]: I0319 12:03:54.990305 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:54.990396 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:54.990396 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:54.990396 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:54.991461 master-0 kubenswrapper[7642]: I0319 12:03:54.990411 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:55.972699 master-0 kubenswrapper[7642]: I0319 12:03:55.972536 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:55.972699 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:55.972699 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:55.972699 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:55.972699 master-0 kubenswrapper[7642]: I0319 12:03:55.972669 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:56.973476 master-0 kubenswrapper[7642]: I0319 12:03:56.973394 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:56.973476 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:56.973476 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:56.973476 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:56.973476 master-0 kubenswrapper[7642]: I0319 12:03:56.973471 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:57.361766 master-0 kubenswrapper[7642]: E0319 12:03:57.361411 7642 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189e3c735987f8d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:24b4ed170d527099878cb5fdd508a2fb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:02:49.351542995 +0000 UTC m=+527.468983792,LastTimestamp:2026-03-19 12:02:49.351542995 +0000 UTC m=+527.468983792,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 12:03:57.972402 master-0 kubenswrapper[7642]: I0319 12:03:57.972328 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:57.972402 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:57.972402 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:57.972402 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:57.972801 master-0 kubenswrapper[7642]: I0319 12:03:57.972434 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:58.972839 master-0 kubenswrapper[7642]: I0319 12:03:58.972733 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:58.972839 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:58.972839 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:58.972839 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:58.972839 master-0 kubenswrapper[7642]: I0319 12:03:58.972822 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:03:59.971850 master-0 kubenswrapper[7642]: I0319 12:03:59.971740 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:03:59.971850 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:03:59.971850 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:03:59.971850 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:03:59.972361 master-0 kubenswrapper[7642]: I0319 12:03:59.971856 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:04:00.067867 master-0 kubenswrapper[7642]: I0319 12:04:00.067802 7642 scope.go:117] "RemoveContainer" containerID="6efb9a3f210b061f188c57eb6303722d321f0f70ab8c3b6975d1007d3830e3ef"
Mar 19 12:04:00.915258 master-0 kubenswrapper[7642]: I0319 12:04:00.915155 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-47j68_51e2c1d1-a50f-4a0d-8273-440e699b3907/approver/1.log"
Mar 19 12:04:00.915743 master-0 kubenswrapper[7642]: I0319 12:04:00.915696 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerStarted","Data":"7972279b24eb7f8c4dea473b490207bcfc9b6d387850efeb2be0506a615c093f"}
Mar 19 12:04:00.972661 master-0 kubenswrapper[7642]: I0319 12:04:00.972557 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:04:00.972661 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:04:00.972661 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:04:00.972661 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:04:00.972995 master-0 kubenswrapper[7642]: I0319 12:04:00.972683 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:04:01.091925 master-0 kubenswrapper[7642]: E0319 12:04:01.091846 7642 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:01.092756 master-0 kubenswrapper[7642]: I0319 12:04:01.092475 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:01.116385 master-0 kubenswrapper[7642]: W0319 12:04:01.116265 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod094204df314fe45bd5af12ca1b4622bb.slice/crio-ff5132ef0b49543e2892994f5f8254a3ce97fc30f20a77cbc390ce279601737b WatchSource:0}: Error finding container ff5132ef0b49543e2892994f5f8254a3ce97fc30f20a77cbc390ce279601737b: Status 404 returned error can't find the container with id ff5132ef0b49543e2892994f5f8254a3ce97fc30f20a77cbc390ce279601737b
Mar 19 12:04:01.927954 master-0 kubenswrapper[7642]: I0319 12:04:01.927851 7642 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="fc1a5002078bd201b824e0cb01e13c7c268e7dc6df2889e483abb0d9ab95ab7f" exitCode=0
Mar 19 12:04:01.927954 master-0 kubenswrapper[7642]: I0319 12:04:01.927916 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"fc1a5002078bd201b824e0cb01e13c7c268e7dc6df2889e483abb0d9ab95ab7f"}
Mar 19 12:04:01.927954 master-0 kubenswrapper[7642]: I0319 12:04:01.927953 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"ff5132ef0b49543e2892994f5f8254a3ce97fc30f20a77cbc390ce279601737b"}
Mar 19 12:04:01.928381 master-0 kubenswrapper[7642]: I0319 12:04:01.928291 7642 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677"
Mar 19 12:04:01.928381 master-0 kubenswrapper[7642]: I0319 12:04:01.928309 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677"
Mar 19 12:04:01.972569 master-0 kubenswrapper[7642]: I0319 12:04:01.972502 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:04:01.972569 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:04:01.972569 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:04:01.972569 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:04:01.972874 master-0 kubenswrapper[7642]: I0319 12:04:01.972670 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:04:01.972874 master-0 kubenswrapper[7642]: I0319 12:04:01.972739 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:04:01.973717 master-0 kubenswrapper[7642]: I0319 12:04:01.973666 7642 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"4d73ea74c9f238cd74823cfff5517f40d001e3ed453b6b3cd60056853897aa07"} pod="openshift-ingress/router-default-7dcf5569b5-47tqf" containerMessage="Container router failed startup probe, will be restarted"
Mar 19 12:04:01.973812 master-0 kubenswrapper[7642]: I0319 12:04:01.973740 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" containerID="cri-o://4d73ea74c9f238cd74823cfff5517f40d001e3ed453b6b3cd60056853897aa07" gracePeriod=3600
Mar 19 12:04:02.069271 master-0 kubenswrapper[7642]: I0319 12:04:02.069158 7642 scope.go:117] "RemoveContainer" containerID="266d45496ee879b60eaf981de6871924a55070f988e506147e462e03b0fc4b42"
Mar 19 12:04:02.069554 master-0 kubenswrapper[7642]: E0319 12:04:02.069374 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-59ptg_openshift-ingress-operator(f38df702-a256-4f83-90bb-0c0e477a2ce6)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6"
Mar 19 12:04:02.418324 master-0 kubenswrapper[7642]: I0319 12:04:02.418205 7642 status_manager.go:851] "Failed to get status for pod" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods kube-controller-manager-master-0)"
Mar 19 12:04:02.889184 master-0 kubenswrapper[7642]: E0319 12:04:02.889105 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms"
Mar 19 12:04:06.496886 master-0 kubenswrapper[7642]: I0319 12:04:06.496767 7642 scope.go:117] "RemoveContainer" containerID="203765ba18c88a4bb80613cce48098f24f5378802947c4073ee2b353a34707ac"
Mar 19 12:04:06.516240 master-0 kubenswrapper[7642]: I0319 12:04:06.516179 7642 scope.go:117] "RemoveContainer" containerID="c67f49e1211ff300919ba996552c5270a8c7964fe689dec11a1c7473a2f29c42"
Mar 19 12:04:06.535831 master-0 kubenswrapper[7642]: I0319 12:04:06.535780 7642 scope.go:117] "RemoveContainer" containerID="1d8090aa91142bfe2d9ec6028d840d4af95fbe15328004054e0a0a8e1a71dfd6"
Mar 19 12:04:13.290350 master-0 kubenswrapper[7642]: E0319 12:04:13.290233 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms"
Mar 19 12:04:15.068440 master-0 kubenswrapper[7642]: I0319 12:04:15.068351 7642 scope.go:117] "RemoveContainer" containerID="266d45496ee879b60eaf981de6871924a55070f988e506147e462e03b0fc4b42"
Mar 19 12:04:15.069116 master-0 kubenswrapper[7642]: E0319 12:04:15.068794 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-59ptg_openshift-ingress-operator(f38df702-a256-4f83-90bb-0c0e477a2ce6)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6"
Mar 19 12:04:24.092233 master-0 kubenswrapper[7642]: E0319 12:04:24.091874 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s"
Mar 19 12:04:30.067901 master-0 kubenswrapper[7642]: I0319 12:04:30.067779 7642 scope.go:117] "RemoveContainer" containerID="266d45496ee879b60eaf981de6871924a55070f988e506147e462e03b0fc4b42"
Mar 19 12:04:30.348144 master-0 kubenswrapper[7642]: I0319 12:04:30.347961 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/3.log"
Mar 19 12:04:30.348571 master-0 kubenswrapper[7642]: I0319 12:04:30.348509 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerStarted","Data":"2585e3c8a21fe77ef0a54c935bf49a872dc17099314a87be4e929dac1b0cd111"}
Mar 19 12:04:31.367940 master-0 kubenswrapper[7642]: E0319 12:04:31.367714 7642 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{kube-controller-manager-master-0.189e3c4f73ecf322 openshift-kube-controller-manager 12769 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-master-0,UID:91ced7d9a1e8a2019ac2b6a262f0d19e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b23c544d3894e5b31f66a18c554f03b0d29f92c2000c46b57b1c96da7ec25db9\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:00:15 +0000 UTC,LastTimestamp:2026-03-19 12:03:02.416367722 +0000 UTC m=+540.533808509,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 12:04:35.693509 master-0 kubenswrapper[7642]: E0319 12:04:35.693400 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s"
Mar 19 12:04:35.931336 master-0 kubenswrapper[7642]: E0319 12:04:35.931196 7642 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 19 12:04:36.395802 master-0 kubenswrapper[7642]: I0319 12:04:36.395686 7642 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="52e44a998be1c2142f3874b5ff4a8f40902b1f922d23a5db895af0f402eb6218" exitCode=0
Mar 19 12:04:36.395802 master-0 kubenswrapper[7642]: I0319 12:04:36.395764 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"52e44a998be1c2142f3874b5ff4a8f40902b1f922d23a5db895af0f402eb6218"}
Mar 19 12:04:36.396185 master-0 kubenswrapper[7642]: I0319 12:04:36.396129 7642 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677"
Mar 19 12:04:36.396185 master-0 kubenswrapper[7642]: I0319 12:04:36.396148 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677"
Mar 19 12:04:37.406843 master-0 kubenswrapper[7642]: I0319 12:04:37.406754 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-8pnq2_e47d3935-d96e-4fd2-abfc-99bff7eac5e0/cluster-cloud-controller-manager/0.log"
Mar 19 12:04:37.407761 master-0 kubenswrapper[7642]: I0319 12:04:37.406855 7642 generic.go:334] "Generic (PLEG): container finished" podID="e47d3935-d96e-4fd2-abfc-99bff7eac5e0" containerID="79d5dba168afeffdccc6ff9b4390e9029f185749a05da6ead57ad3001b8c3183" exitCode=1
Mar 19 12:04:37.407761 master-0 kubenswrapper[7642]: I0319 12:04:37.406904 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerDied","Data":"79d5dba168afeffdccc6ff9b4390e9029f185749a05da6ead57ad3001b8c3183"}
Mar 19 12:04:37.407980 master-0 kubenswrapper[7642]: I0319 12:04:37.407886 7642 scope.go:117] "RemoveContainer" containerID="79d5dba168afeffdccc6ff9b4390e9029f185749a05da6ead57ad3001b8c3183"
Mar 19 12:04:38.419629 master-0 kubenswrapper[7642]: I0319 12:04:38.419546 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-8pnq2_e47d3935-d96e-4fd2-abfc-99bff7eac5e0/cluster-cloud-controller-manager/0.log"
Mar 19 12:04:38.420477 master-0 kubenswrapper[7642]: I0319 12:04:38.419673 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerStarted","Data":"a8fbcf038f8b67376900fffa9726e27e9af1c3c213a38a857a8441750c40f894"}
Mar 19 12:04:40.441358 master-0 kubenswrapper[7642]: I0319 12:04:40.441225 7642 generic.go:334] "Generic (PLEG): container finished" podID="39f479b3-45ff-43f9-ae85-083554a54918" containerID="9d5a5757181aa7140885d0b2bdc37c14b8a849c385bb89cb00edfe5781ce2cdf" exitCode=0
Mar 19 12:04:40.441358 master-0 kubenswrapper[7642]: I0319 12:04:40.441307 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" event={"ID":"39f479b3-45ff-43f9-ae85-083554a54918","Type":"ContainerDied","Data":"9d5a5757181aa7140885d0b2bdc37c14b8a849c385bb89cb00edfe5781ce2cdf"}
Mar 19 12:04:40.442455 master-0 kubenswrapper[7642]: I0319 12:04:40.442069 7642 scope.go:117] "RemoveContainer" containerID="9d5a5757181aa7140885d0b2bdc37c14b8a849c385bb89cb00edfe5781ce2cdf"
Mar 19 12:04:41.454094 master-0 kubenswrapper[7642]: I0319 12:04:41.454024 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" event={"ID":"39f479b3-45ff-43f9-ae85-083554a54918","Type":"ContainerStarted","Data":"fbdc790b03bdff951c38ccc0431bdc0b9ab80a50a61e36054192578c5f30e4de"}
Mar 19 12:04:41.455315 master-0 kubenswrapper[7642]: I0319 12:04:41.454481 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 12:04:41.458206 master-0 kubenswrapper[7642]: I0319 12:04:41.458105 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 12:04:42.467383 master-0 kubenswrapper[7642]: I0319 12:04:42.467306 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-8pnq2_e47d3935-d96e-4fd2-abfc-99bff7eac5e0/config-sync-controllers/0.log"
Mar 19 12:04:42.468455 master-0 kubenswrapper[7642]: I0319 12:04:42.468246 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-8pnq2_e47d3935-d96e-4fd2-abfc-99bff7eac5e0/cluster-cloud-controller-manager/0.log"
Mar 19 12:04:42.468455 master-0 kubenswrapper[7642]: I0319 12:04:42.468347 7642 generic.go:334] "Generic (PLEG): container finished" podID="e47d3935-d96e-4fd2-abfc-99bff7eac5e0" containerID="1807478431eec6d4ebe9e175a729aeaa97be5330ccb6e1fb9986d071c23571b4" exitCode=1
Mar 19 12:04:42.468711 master-0 kubenswrapper[7642]: I0319 12:04:42.468494 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerDied","Data":"1807478431eec6d4ebe9e175a729aeaa97be5330ccb6e1fb9986d071c23571b4"}
Mar 19 12:04:42.469222 master-0 kubenswrapper[7642]: I0319 12:04:42.469169 7642 scope.go:117] "RemoveContainer" containerID="1807478431eec6d4ebe9e175a729aeaa97be5330ccb6e1fb9986d071c23571b4"
Mar 19 12:04:43.484842 master-0 kubenswrapper[7642]: I0319 12:04:43.484754 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-8pnq2_e47d3935-d96e-4fd2-abfc-99bff7eac5e0/config-sync-controllers/0.log"
Mar 19 12:04:43.485885 master-0 kubenswrapper[7642]: I0319 12:04:43.485834 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-8pnq2_e47d3935-d96e-4fd2-abfc-99bff7eac5e0/cluster-cloud-controller-manager/0.log"
Mar 19 12:04:43.486071 master-0 kubenswrapper[7642]: I0319 12:04:43.486000 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerStarted","Data":"31b268e0c623a15f5f95903154798432e7236486fabf05d30051e9c80ebbf93c"}
Mar 19 12:04:48.528941 master-0 kubenswrapper[7642]: I0319 12:04:48.528845 7642 generic.go:334] "Generic (PLEG): container finished" podID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerID="4d73ea74c9f238cd74823cfff5517f40d001e3ed453b6b3cd60056853897aa07" exitCode=0
Mar 19 12:04:48.529924 master-0 kubenswrapper[7642]: I0319 12:04:48.528962 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerDied","Data":"4d73ea74c9f238cd74823cfff5517f40d001e3ed453b6b3cd60056853897aa07"}
Mar 19 12:04:48.529924 master-0 kubenswrapper[7642]: I0319 12:04:48.529053 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerStarted","Data":"246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d"}
Mar 19 12:04:48.529924 master-0 kubenswrapper[7642]: I0319 12:04:48.529087 7642 scope.go:117] "RemoveContainer"
containerID="12c2dd42c42f12d98f0581174f1fb6e51a5a2951700534c282b1aaa4cb3c5149" Mar 19 12:04:48.894786 master-0 kubenswrapper[7642]: E0319 12:04:48.894580 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="6.4s" Mar 19 12:04:48.968924 master-0 kubenswrapper[7642]: I0319 12:04:48.968844 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:04:48.971953 master-0 kubenswrapper[7642]: I0319 12:04:48.971873 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:48.971953 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:48.971953 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:48.971953 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:48.972216 master-0 kubenswrapper[7642]: I0319 12:04:48.971996 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:49.976798 master-0 kubenswrapper[7642]: I0319 12:04:49.975355 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:49.976798 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:49.976798 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 
12:04:49.976798 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:49.976798 master-0 kubenswrapper[7642]: I0319 12:04:49.975487 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:50.972378 master-0 kubenswrapper[7642]: I0319 12:04:50.972186 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:50.972378 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:50.972378 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:50.972378 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:50.972378 master-0 kubenswrapper[7642]: I0319 12:04:50.972380 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:51.972440 master-0 kubenswrapper[7642]: I0319 12:04:51.972370 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:51.972440 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:51.972440 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:51.972440 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:51.973483 master-0 kubenswrapper[7642]: I0319 12:04:51.972478 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:52.559512 master-0 kubenswrapper[7642]: I0319 12:04:52.559440 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/0.log" Mar 19 12:04:52.559829 master-0 kubenswrapper[7642]: I0319 12:04:52.559521 7642 generic.go:334] "Generic (PLEG): container finished" podID="0c804ea0-3483-4506-b138-3bf30016d8d8" containerID="53038deea78787ba444b0934c52753748ca5659dc5194e58a710a65971f4c475" exitCode=1 Mar 19 12:04:52.559829 master-0 kubenswrapper[7642]: I0319 12:04:52.559603 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerDied","Data":"53038deea78787ba444b0934c52753748ca5659dc5194e58a710a65971f4c475"} Mar 19 12:04:52.560424 master-0 kubenswrapper[7642]: I0319 12:04:52.560373 7642 scope.go:117] "RemoveContainer" containerID="53038deea78787ba444b0934c52753748ca5659dc5194e58a710a65971f4c475" Mar 19 12:04:52.562140 master-0 kubenswrapper[7642]: I0319 12:04:52.561677 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-4np9g_58a4e3af-c5e1-4487-92a2-81dfb696695e/manager/0.log" Mar 19 12:04:52.562140 master-0 kubenswrapper[7642]: I0319 12:04:52.562041 7642 generic.go:334] "Generic (PLEG): container finished" podID="58a4e3af-c5e1-4487-92a2-81dfb696695e" containerID="3d22a1705e322b07bf63dc93baa238f057583ee120dd5b2d2d08b50e9d7a1b09" exitCode=1 Mar 19 12:04:52.562140 master-0 kubenswrapper[7642]: I0319 12:04:52.562120 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" event={"ID":"58a4e3af-c5e1-4487-92a2-81dfb696695e","Type":"ContainerDied","Data":"3d22a1705e322b07bf63dc93baa238f057583ee120dd5b2d2d08b50e9d7a1b09"} Mar 19 12:04:52.562852 master-0 kubenswrapper[7642]: I0319 12:04:52.562804 7642 scope.go:117] "RemoveContainer" containerID="3d22a1705e322b07bf63dc93baa238f057583ee120dd5b2d2d08b50e9d7a1b09" Mar 19 12:04:52.564953 master-0 kubenswrapper[7642]: I0319 12:04:52.564917 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-vxsl4_60081e3b-4fe5-42a3-bfae-0b07853c0e36/manager/0.log" Mar 19 12:04:52.565026 master-0 kubenswrapper[7642]: I0319 12:04:52.564955 7642 generic.go:334] "Generic (PLEG): container finished" podID="60081e3b-4fe5-42a3-bfae-0b07853c0e36" containerID="fbe4d38a848a8f8c8ac70c6aaafe195a243cd98f3045f416707deb69b549d12e" exitCode=1 Mar 19 12:04:52.565026 master-0 kubenswrapper[7642]: I0319 12:04:52.564986 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" event={"ID":"60081e3b-4fe5-42a3-bfae-0b07853c0e36","Type":"ContainerDied","Data":"fbe4d38a848a8f8c8ac70c6aaafe195a243cd98f3045f416707deb69b549d12e"} Mar 19 12:04:52.565603 master-0 kubenswrapper[7642]: I0319 12:04:52.565504 7642 scope.go:117] "RemoveContainer" containerID="fbe4d38a848a8f8c8ac70c6aaafe195a243cd98f3045f416707deb69b549d12e" Mar 19 12:04:52.969790 master-0 kubenswrapper[7642]: I0319 12:04:52.969565 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:04:52.973114 master-0 kubenswrapper[7642]: I0319 12:04:52.973062 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:52.973114 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:52.973114 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:52.973114 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:52.973947 master-0 kubenswrapper[7642]: I0319 12:04:52.973154 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:53.589385 master-0 kubenswrapper[7642]: I0319 12:04:53.589297 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-vxsl4_60081e3b-4fe5-42a3-bfae-0b07853c0e36/manager/0.log" Mar 19 12:04:53.589956 master-0 kubenswrapper[7642]: I0319 12:04:53.589450 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" event={"ID":"60081e3b-4fe5-42a3-bfae-0b07853c0e36","Type":"ContainerStarted","Data":"6a205ee3ed4a769ac518763e05a3f6447d834f56c960144a073306636619a961"} Mar 19 12:04:53.590069 master-0 kubenswrapper[7642]: I0319 12:04:53.589974 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 12:04:53.592925 master-0 kubenswrapper[7642]: I0319 12:04:53.592859 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/0.log" Mar 19 12:04:53.593250 master-0 kubenswrapper[7642]: I0319 12:04:53.592941 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" 
event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerStarted","Data":"950cc7d342e464c9618916dd6bab4b9166b6d0a8be1aaabd0b32d4831ce1a7b5"} Mar 19 12:04:53.597816 master-0 kubenswrapper[7642]: I0319 12:04:53.597779 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-4np9g_58a4e3af-c5e1-4487-92a2-81dfb696695e/manager/0.log" Mar 19 12:04:53.598311 master-0 kubenswrapper[7642]: I0319 12:04:53.598262 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" event={"ID":"58a4e3af-c5e1-4487-92a2-81dfb696695e","Type":"ContainerStarted","Data":"d82d805964cbdcb6ff5d3bc38ec895ff8e1b88cc59ea6f6156f6d90acf9375a4"} Mar 19 12:04:53.598925 master-0 kubenswrapper[7642]: I0319 12:04:53.598876 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:04:53.972384 master-0 kubenswrapper[7642]: I0319 12:04:53.972287 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:53.972384 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:53.972384 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:53.972384 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:53.972902 master-0 kubenswrapper[7642]: I0319 12:04:53.972406 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:54.971986 master-0 kubenswrapper[7642]: I0319 12:04:54.971888 7642 patch_prober.go:28] interesting 
pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:54.971986 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:54.971986 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:54.971986 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:54.972969 master-0 kubenswrapper[7642]: I0319 12:04:54.972011 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:55.972500 master-0 kubenswrapper[7642]: I0319 12:04:55.972302 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:55.972500 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:55.972500 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:55.972500 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:55.972500 master-0 kubenswrapper[7642]: I0319 12:04:55.972389 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:56.973536 master-0 kubenswrapper[7642]: I0319 12:04:56.972932 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 
12:04:56.973536 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:56.973536 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:56.973536 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:56.973536 master-0 kubenswrapper[7642]: I0319 12:04:56.973062 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:57.973813 master-0 kubenswrapper[7642]: I0319 12:04:57.973701 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:57.973813 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:57.973813 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:57.973813 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:57.974939 master-0 kubenswrapper[7642]: I0319 12:04:57.973838 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:58.972714 master-0 kubenswrapper[7642]: I0319 12:04:58.972585 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:58.972714 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:58.972714 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:58.972714 master-0 kubenswrapper[7642]: healthz 
check failed Mar 19 12:04:58.973266 master-0 kubenswrapper[7642]: I0319 12:04:58.972711 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:04:59.972678 master-0 kubenswrapper[7642]: I0319 12:04:59.972514 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:04:59.972678 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:04:59.972678 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:04:59.972678 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:04:59.974354 master-0 kubenswrapper[7642]: I0319 12:04:59.972730 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:00.166181 master-0 kubenswrapper[7642]: E0319 12:05:00.165787 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:04:50Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:04:50Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:04:50Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:04:50Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": context deadline exceeded" Mar 19 12:05:00.973270 master-0 kubenswrapper[7642]: I0319 12:05:00.973164 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:00.973270 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:00.973270 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:00.973270 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:00.974530 master-0 kubenswrapper[7642]: I0319 12:05:00.973274 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:01.973248 master-0 
kubenswrapper[7642]: I0319 12:05:01.973140 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:01.973248 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:01.973248 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:01.973248 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:01.974074 master-0 kubenswrapper[7642]: I0319 12:05:01.973288 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:02.421025 master-0 kubenswrapper[7642]: I0319 12:05:02.420937 7642 status_manager.go:851] "Failed to get status for pod" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods kube-controller-manager-master-0)" Mar 19 12:05:02.972273 master-0 kubenswrapper[7642]: I0319 12:05:02.972157 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:02.972273 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:02.972273 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:02.972273 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:02.972273 master-0 kubenswrapper[7642]: I0319 12:05:02.972261 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:03.972594 master-0 kubenswrapper[7642]: I0319 12:05:03.972497 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:03.972594 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:03.972594 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:03.972594 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:03.973571 master-0 kubenswrapper[7642]: I0319 12:05:03.972593 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:04.388509 master-0 kubenswrapper[7642]: I0319 12:05:04.388224 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:05:04.452377 master-0 kubenswrapper[7642]: I0319 12:05:04.452274 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 12:05:04.971213 master-0 kubenswrapper[7642]: I0319 12:05:04.971124 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:04.971213 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:04.971213 master-0 
kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:04.971213 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:04.971213 master-0 kubenswrapper[7642]: I0319 12:05:04.971204 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:05.295848 master-0 kubenswrapper[7642]: E0319 12:05:05.295685 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 12:05:05.370944 master-0 kubenswrapper[7642]: E0319 12:05:05.370763 7642 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{kube-controller-manager-master-0.189e3c4f80f28615 openshift-kube-controller-manager 12770 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-master-0,UID:91ced7d9a1e8a2019ac2b6a262f0d19e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:00:15 +0000 UTC,LastTimestamp:2026-03-19 12:03:02.677009555 +0000 UTC m=+540.794450342,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 12:05:05.972908 master-0 kubenswrapper[7642]: I0319 12:05:05.972792 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:05.972908 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:05.972908 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:05.972908 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:05.973409 master-0 kubenswrapper[7642]: I0319 12:05:05.972939 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:06.589415 master-0 kubenswrapper[7642]: I0319 12:05:06.589325 7642 scope.go:117] "RemoveContainer" containerID="0e6c2dae4c28b6bf9d3115747fe713bb797e4eb746fc209c257ace61169519c1"
Mar 19 12:05:06.972694 master-0 kubenswrapper[7642]: I0319 12:05:06.972556 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:06.972694 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:06.972694 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:06.972694 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:06.973326 master-0 kubenswrapper[7642]: I0319 12:05:06.972780 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:07.971993 master-0 kubenswrapper[7642]: I0319 12:05:07.971923 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:07.971993 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:07.971993 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:07.971993 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:07.973027 master-0 kubenswrapper[7642]: I0319 12:05:07.972003 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:08.972581 master-0 kubenswrapper[7642]: I0319 12:05:08.972478 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:08.972581 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:08.972581 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:08.972581 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:08.972581 master-0 kubenswrapper[7642]: I0319 12:05:08.972571 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:09.972349 master-0 kubenswrapper[7642]: I0319 12:05:09.972281 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:09.972349 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:09.972349 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:09.972349 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:09.973793 master-0 kubenswrapper[7642]: I0319 12:05:09.973733 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:10.167046 master-0 kubenswrapper[7642]: E0319 12:05:10.166970 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:05:10.399709 master-0 kubenswrapper[7642]: E0319 12:05:10.399596 7642 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 19 12:05:10.748227 master-0 kubenswrapper[7642]: I0319 12:05:10.748084 7642 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="fc11d745bcb55f0325836aa72854476958c768ae4522a0e6c750cb0c84bb6fcd" exitCode=0
Mar 19 12:05:10.748227 master-0 kubenswrapper[7642]: I0319 12:05:10.748170 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"fc11d745bcb55f0325836aa72854476958c768ae4522a0e6c750cb0c84bb6fcd"}
Mar 19 12:05:10.748592 master-0 kubenswrapper[7642]: I0319 12:05:10.748522 7642 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677"
Mar 19 12:05:10.748592 master-0 kubenswrapper[7642]: I0319 12:05:10.748540 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677"
Mar 19 12:05:10.972760 master-0 kubenswrapper[7642]: I0319 12:05:10.972535 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:10.972760 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:10.972760 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:10.972760 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:10.972760 master-0 kubenswrapper[7642]: I0319 12:05:10.972664 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:11.972412 master-0 kubenswrapper[7642]: I0319 12:05:11.972325 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:11.972412 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:11.972412 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:11.972412 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:11.972412 master-0 kubenswrapper[7642]: I0319 12:05:11.972406 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:12.973047 master-0 kubenswrapper[7642]: I0319 12:05:12.972927 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:12.973047 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:12.973047 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:12.973047 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:12.973047 master-0 kubenswrapper[7642]: I0319 12:05:12.973048 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:13.790979 master-0 kubenswrapper[7642]: I0319 12:05:13.790848 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log"
Mar 19 12:05:13.790979 master-0 kubenswrapper[7642]: I0319 12:05:13.790927 7642 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="413966a39cd5b87df5d6bdc132b164ee7acc47dac3064d44c6d8a65871b50be7" exitCode=0
Mar 19 12:05:13.790979 master-0 kubenswrapper[7642]: I0319 12:05:13.790975 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"413966a39cd5b87df5d6bdc132b164ee7acc47dac3064d44c6d8a65871b50be7"}
Mar 19 12:05:13.791577 master-0 kubenswrapper[7642]: I0319 12:05:13.791552 7642 scope.go:117] "RemoveContainer" containerID="413966a39cd5b87df5d6bdc132b164ee7acc47dac3064d44c6d8a65871b50be7"
Mar 19 12:05:13.972207 master-0 kubenswrapper[7642]: I0319 12:05:13.972134 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:13.972207 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:13.972207 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:13.972207 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:13.972483 master-0 kubenswrapper[7642]: I0319 12:05:13.972225 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:14.808342 master-0 kubenswrapper[7642]: I0319 12:05:14.808271 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log"
Mar 19 12:05:14.809297 master-0 kubenswrapper[7642]: I0319 12:05:14.808393 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"7029d235236370ccb17a42944f5aa603314c5ddfc1ade5a5c80098d5b99e76af"}
Mar 19 12:05:14.973490 master-0 kubenswrapper[7642]: I0319 12:05:14.973370 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:14.973490 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:14.973490 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:14.973490 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:14.973490 master-0 kubenswrapper[7642]: I0319 12:05:14.973473 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:15.135286 master-0 kubenswrapper[7642]: I0319 12:05:15.135152 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:05:15.135286 master-0 kubenswrapper[7642]: I0319 12:05:15.135233 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:05:15.136355 master-0 kubenswrapper[7642]: I0319 12:05:15.136293 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" start-of-body=
Mar 19 12:05:15.136445 master-0 kubenswrapper[7642]: I0319 12:05:15.136394 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused"
Mar 19 12:05:15.972322 master-0 kubenswrapper[7642]: I0319 12:05:15.972226 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:15.972322 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:15.972322 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:15.972322 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:15.974158 master-0 kubenswrapper[7642]: I0319 12:05:15.972334 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:16.973098 master-0 kubenswrapper[7642]: I0319 12:05:16.972973 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:16.973098 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:16.973098 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:16.973098 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:16.973098 master-0 kubenswrapper[7642]: I0319 12:05:16.973078 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:17.973359 master-0 kubenswrapper[7642]: I0319 12:05:17.973246 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:17.973359 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:17.973359 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:17.973359 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:17.974592 master-0 kubenswrapper[7642]: I0319 12:05:17.973394 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:18.971863 master-0 kubenswrapper[7642]: I0319 12:05:18.971773 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:18.971863 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:18.971863 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:18.971863 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:18.972373 master-0 kubenswrapper[7642]: I0319 12:05:18.971876 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:19.972918 master-0 kubenswrapper[7642]: I0319 12:05:19.972792 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:19.972918 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:19.972918 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:19.972918 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:19.973989 master-0 kubenswrapper[7642]: I0319 12:05:19.972925 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:20.168033 master-0 kubenswrapper[7642]: E0319 12:05:20.167822 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:05:20.861589 master-0 kubenswrapper[7642]: I0319 12:05:20.861488 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-9fvr8_d52981e0-0bc4-4dfa-81d5-aeab63fa23e0/control-plane-machine-set-operator/0.log"
Mar 19 12:05:20.861938 master-0 kubenswrapper[7642]: I0319 12:05:20.861599 7642 generic.go:334] "Generic (PLEG): container finished" podID="d52981e0-0bc4-4dfa-81d5-aeab63fa23e0" containerID="064b176871fcbc879687ac7e9904eb1df9dbafb853b4d8ce41a9bbb9767a5a2c" exitCode=1
Mar 19 12:05:20.861938 master-0 kubenswrapper[7642]: I0319 12:05:20.861732 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" event={"ID":"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0","Type":"ContainerDied","Data":"064b176871fcbc879687ac7e9904eb1df9dbafb853b4d8ce41a9bbb9767a5a2c"}
Mar 19 12:05:20.862568 master-0 kubenswrapper[7642]: I0319 12:05:20.862517 7642 scope.go:117] "RemoveContainer" containerID="064b176871fcbc879687ac7e9904eb1df9dbafb853b4d8ce41a9bbb9767a5a2c"
Mar 19 12:05:20.972456 master-0 kubenswrapper[7642]: I0319 12:05:20.972368 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:20.972456 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:20.972456 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:20.972456 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:20.972825 master-0 kubenswrapper[7642]: I0319 12:05:20.972486 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:21.869584 master-0 kubenswrapper[7642]: I0319 12:05:21.869506 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-9fvr8_d52981e0-0bc4-4dfa-81d5-aeab63fa23e0/control-plane-machine-set-operator/0.log"
Mar 19 12:05:21.870578 master-0 kubenswrapper[7642]: I0319 12:05:21.869595 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" event={"ID":"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0","Type":"ContainerStarted","Data":"87a98c9fff8805e14bb0e09d73eb4c758fd6b46ec3819661dced16f114d43480"}
Mar 19 12:05:21.972188 master-0 kubenswrapper[7642]: I0319 12:05:21.972074 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:21.972188 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:21.972188 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:21.972188 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:21.972484 master-0 kubenswrapper[7642]: I0319 12:05:21.972201 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:22.297285 master-0 kubenswrapper[7642]: E0319 12:05:22.297179 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 19 12:05:22.884799 master-0 kubenswrapper[7642]: I0319 12:05:22.884734 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-ftz5b_94fa5656-0561-45ab-9ab1-a6b96b8a47d4/package-server-manager/0.log"
Mar 19 12:05:22.885537 master-0 kubenswrapper[7642]: I0319 12:05:22.885452 7642 generic.go:334] "Generic (PLEG): container finished" podID="94fa5656-0561-45ab-9ab1-a6b96b8a47d4" containerID="344d526c0d5c832dab03c63929de32c376b56407a1d6deb8f5c0fba230eec603" exitCode=1
Mar 19 12:05:22.885602 master-0 kubenswrapper[7642]: I0319 12:05:22.885532 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" event={"ID":"94fa5656-0561-45ab-9ab1-a6b96b8a47d4","Type":"ContainerDied","Data":"344d526c0d5c832dab03c63929de32c376b56407a1d6deb8f5c0fba230eec603"}
Mar 19 12:05:22.886247 master-0 kubenswrapper[7642]: I0319 12:05:22.886211 7642 scope.go:117] "RemoveContainer" containerID="344d526c0d5c832dab03c63929de32c376b56407a1d6deb8f5c0fba230eec603"
Mar 19 12:05:22.887839 master-0 kubenswrapper[7642]: I0319 12:05:22.887804 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/1.log"
Mar 19 12:05:22.888219 master-0 kubenswrapper[7642]: I0319 12:05:22.888204 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/0.log"
Mar 19 12:05:22.888284 master-0 kubenswrapper[7642]: I0319 12:05:22.888251 7642 generic.go:334] "Generic (PLEG): container finished" podID="0c804ea0-3483-4506-b138-3bf30016d8d8" containerID="950cc7d342e464c9618916dd6bab4b9166b6d0a8be1aaabd0b32d4831ce1a7b5" exitCode=1
Mar 19 12:05:22.888321 master-0 kubenswrapper[7642]: I0319 12:05:22.888292 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerDied","Data":"950cc7d342e464c9618916dd6bab4b9166b6d0a8be1aaabd0b32d4831ce1a7b5"}
Mar 19 12:05:22.888390 master-0 kubenswrapper[7642]: I0319 12:05:22.888367 7642 scope.go:117] "RemoveContainer" containerID="53038deea78787ba444b0934c52753748ca5659dc5194e58a710a65971f4c475"
Mar 19 12:05:22.888832 master-0 kubenswrapper[7642]: I0319 12:05:22.888797 7642 scope.go:117] "RemoveContainer" containerID="950cc7d342e464c9618916dd6bab4b9166b6d0a8be1aaabd0b32d4831ce1a7b5"
Mar 19 12:05:22.889058 master-0 kubenswrapper[7642]: E0319 12:05:22.889021 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-6dlpw_openshift-cluster-storage-operator(0c804ea0-3483-4506-b138-3bf30016d8d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" podUID="0c804ea0-3483-4506-b138-3bf30016d8d8"
Mar 19 12:05:22.972591 master-0 kubenswrapper[7642]: I0319 12:05:22.972542 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:22.972591 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:22.972591 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:22.972591 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:22.972763 master-0 kubenswrapper[7642]: I0319 12:05:22.972625 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:23.898056 master-0 kubenswrapper[7642]: I0319 12:05:23.897990 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/1.log"
Mar 19 12:05:23.900532 master-0 kubenswrapper[7642]: I0319 12:05:23.900483 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-b2w5m_81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a/machine-approver-controller/0.log"
Mar 19 12:05:23.901140 master-0 kubenswrapper[7642]: I0319 12:05:23.901091 7642 generic.go:334] "Generic (PLEG): container finished" podID="81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a" containerID="7fc5e15dbb622b807625a944b8961634a70f1aecfe4e4c996f59e0fe79999af9" exitCode=255
Mar 19 12:05:23.901217 master-0 kubenswrapper[7642]: I0319 12:05:23.901135 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" event={"ID":"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a","Type":"ContainerDied","Data":"7fc5e15dbb622b807625a944b8961634a70f1aecfe4e4c996f59e0fe79999af9"}
Mar 19 12:05:23.902744 master-0 kubenswrapper[7642]: I0319 12:05:23.902708 7642 scope.go:117] "RemoveContainer" containerID="7fc5e15dbb622b807625a944b8961634a70f1aecfe4e4c996f59e0fe79999af9"
Mar 19 12:05:23.904029 master-0 kubenswrapper[7642]: I0319 12:05:23.903964 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-ftz5b_94fa5656-0561-45ab-9ab1-a6b96b8a47d4/package-server-manager/0.log"
Mar 19 12:05:23.905213 master-0 kubenswrapper[7642]: I0319 12:05:23.905170 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" event={"ID":"94fa5656-0561-45ab-9ab1-a6b96b8a47d4","Type":"ContainerStarted","Data":"2aca712aa2aa40b581cebaa07826a29d6ba9e6eed682d665dfe4f86d2005740f"}
Mar 19 12:05:23.905509 master-0 kubenswrapper[7642]: I0319 12:05:23.905444 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 12:05:23.973137 master-0 kubenswrapper[7642]: I0319 12:05:23.973029 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:23.973137 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:23.973137 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:23.973137 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:23.973734 master-0 kubenswrapper[7642]: I0319 12:05:23.973149 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:24.919169 master-0 kubenswrapper[7642]: I0319 12:05:24.919085 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8e27b7d086edf5d2cf47b703574641d8/kube-scheduler/0.log"
Mar 19 12:05:24.920383 master-0 kubenswrapper[7642]: I0319 12:05:24.919968 7642 generic.go:334] "Generic (PLEG): container finished" podID="8e27b7d086edf5d2cf47b703574641d8" containerID="f043145314a3afe735f976ccbd7c9f8a4f3dcf66cb61565878dcf3c6a12fe880" exitCode=1
Mar 19 12:05:24.920383 master-0 kubenswrapper[7642]: I0319 12:05:24.919990 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerDied","Data":"f043145314a3afe735f976ccbd7c9f8a4f3dcf66cb61565878dcf3c6a12fe880"}
Mar 19 12:05:24.921252 master-0 kubenswrapper[7642]: I0319 12:05:24.921164 7642 scope.go:117] "RemoveContainer" containerID="f043145314a3afe735f976ccbd7c9f8a4f3dcf66cb61565878dcf3c6a12fe880"
Mar 19 12:05:24.923700 master-0 kubenswrapper[7642]: I0319 12:05:24.923622 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-b2w5m_81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a/machine-approver-controller/0.log"
Mar 19 12:05:24.924858 master-0 kubenswrapper[7642]: I0319 12:05:24.924610 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" event={"ID":"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a","Type":"ContainerStarted","Data":"a23a6b5358beeaf5e645643ffb405a80fb4cf54ade8b5a4f38d294751cca2a76"}
Mar 19 12:05:24.977805 master-0 kubenswrapper[7642]: I0319 12:05:24.977731 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:24.977805 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:24.977805 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:24.977805 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:24.978236 master-0 kubenswrapper[7642]: I0319 12:05:24.977833 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:25.138350 master-0 kubenswrapper[7642]: I0319 12:05:25.138272 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:05:25.138350 master-0 kubenswrapper[7642]: I0319 12:05:25.138334 7642 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:05:25.139094 master-0 kubenswrapper[7642]: I0319 12:05:25.139047 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" start-of-body=
Mar 19 12:05:25.139159 master-0 kubenswrapper[7642]: I0319 12:05:25.139104 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused"
Mar 19 12:05:25.938051 master-0 kubenswrapper[7642]: I0319 12:05:25.937954 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8e27b7d086edf5d2cf47b703574641d8/kube-scheduler/0.log"
Mar 19 12:05:25.939035 master-0 kubenswrapper[7642]: I0319 12:05:25.938499 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"3172c0e1310fe85b6969d859dd1e6cb260c3622246f9e1dccc9e19331599cb26"}
Mar 19 12:05:25.971275 master-0 kubenswrapper[7642]: I0319 12:05:25.971169 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:25.971275 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:25.971275 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:25.971275 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:25.971275 master-0 kubenswrapper[7642]: I0319 12:05:25.971280 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:26.949623 master-0 kubenswrapper[7642]: I0319 12:05:26.949542 7642 generic.go:334] "Generic (PLEG): container finished" podID="3f3c0234-248a-4d6f-9aca-6582a481a911" containerID="a5cb97fd211a0ea441e72f2a15543fd5127539a45c24f3604c7d9c65dbb4100e" exitCode=0
Mar 19 12:05:26.950232 master-0 kubenswrapper[7642]: I0319 12:05:26.949656 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerDied","Data":"a5cb97fd211a0ea441e72f2a15543fd5127539a45c24f3604c7d9c65dbb4100e"}
Mar 19 12:05:26.950232 master-0 kubenswrapper[7642]: I0319 12:05:26.949866 7642 scope.go:117] "RemoveContainer" containerID="b9318bc6665c43cc333e2980698d6c509e0e61367154c5acfc00fbbe7da6a9de"
Mar 19 12:05:26.950389 master-0 kubenswrapper[7642]: I0319 12:05:26.950354 7642 scope.go:117] "RemoveContainer" containerID="a5cb97fd211a0ea441e72f2a15543fd5127539a45c24f3604c7d9c65dbb4100e"
Mar 19 12:05:26.952254 master-0 kubenswrapper[7642]: I0319 12:05:26.952200 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-cwncj_e90c10f6-36d9-44bd-a4c6-174f12135434/cluster-baremetal-operator/0.log"
Mar 19 12:05:26.952316 master-0 kubenswrapper[7642]: I0319 12:05:26.952271 7642 generic.go:334] "Generic (PLEG): container finished" podID="e90c10f6-36d9-44bd-a4c6-174f12135434" containerID="cf537f49e2a1b4ab25a2740d2ec9131fff4692e5f665c069887b93bcb77ec276" exitCode=1
Mar 19 12:05:26.952388 master-0 kubenswrapper[7642]: I0319 12:05:26.952325 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerDied","Data":"cf537f49e2a1b4ab25a2740d2ec9131fff4692e5f665c069887b93bcb77ec276"}
Mar 19 12:05:26.952651 master-0 kubenswrapper[7642]: I0319 12:05:26.952601 7642 scope.go:117] "RemoveContainer" containerID="cf537f49e2a1b4ab25a2740d2ec9131fff4692e5f665c069887b93bcb77ec276"
Mar 19 12:05:26.955561 master-0 kubenswrapper[7642]: I0319 12:05:26.955505 7642 generic.go:334] "Generic (PLEG): container finished" podID="518e1c6f-7f46-4dda-84a3-49fec9fa1abe" containerID="f7738588625037912affdf5435640d17413e9cd9036722fe4bd36750bce1dc90" exitCode=0
Mar 19 12:05:26.956574 master-0 kubenswrapper[7642]: I0319 12:05:26.956518 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" event={"ID":"518e1c6f-7f46-4dda-84a3-49fec9fa1abe","Type":"ContainerDied","Data":"f7738588625037912affdf5435640d17413e9cd9036722fe4bd36750bce1dc90"}
Mar 19 12:05:26.956777 master-0 kubenswrapper[7642]: I0319 12:05:26.956734 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:05:26.957201 master-0 kubenswrapper[7642]: I0319 12:05:26.957165 7642 scope.go:117] "RemoveContainer" containerID="f7738588625037912affdf5435640d17413e9cd9036722fe4bd36750bce1dc90"
Mar 19 12:05:26.972965 master-0 kubenswrapper[7642]: I0319 12:05:26.972907 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:26.972965 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:26.972965 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:26.972965 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:26.973210 master-0 kubenswrapper[7642]: I0319 12:05:26.972987 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:27.965156 master-0 kubenswrapper[7642]: I0319 12:05:27.965067 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerStarted","Data":"081b3f4684461729e6f698c3195bf7ea8483b6807bf70589a3fc1e634fdb1c16"}
Mar 19 12:05:27.970261 master-0 kubenswrapper[7642]: I0319 12:05:27.970201 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-cwncj_e90c10f6-36d9-44bd-a4c6-174f12135434/cluster-baremetal-operator/0.log"
Mar 19 12:05:27.970467 master-0 kubenswrapper[7642]: I0319 12:05:27.970366 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerStarted","Data":"3ab6e11a56aeea1e1090df7b641d026b31da61efe18c78570e3645eea987cc4a"}
Mar 19 12:05:27.971834 master-0 kubenswrapper[7642]: I0319 12:05:27.971785 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:27.971834 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:27.971834 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:27.971834 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:27.972044 master-0 kubenswrapper[7642]: I0319 12:05:27.971870 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:05:27.973465 master-0 kubenswrapper[7642]: I0319 12:05:27.973419 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" event={"ID":"518e1c6f-7f46-4dda-84a3-49fec9fa1abe","Type":"ContainerStarted","Data":"4f882838c61c3b7b61dcc60feb2b4a86a11287b46f9c3c99faba9a5cff759b8d"}
Mar 19 12:05:28.972364 master-0 kubenswrapper[7642]: I0319 12:05:28.972277 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:05:28.972364 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:05:28.972364 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:05:28.972364 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:05:28.972364 master-0 kubenswrapper[7642]: I0319 12:05:28.972341 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed
with statuscode: 500" Mar 19 12:05:29.972408 master-0 kubenswrapper[7642]: I0319 12:05:29.972146 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:29.972408 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:29.972408 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:29.972408 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:29.972408 master-0 kubenswrapper[7642]: I0319 12:05:29.972259 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:29.994394 master-0 kubenswrapper[7642]: I0319 12:05:29.994282 7642 generic.go:334] "Generic (PLEG): container finished" podID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerID="171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464" exitCode=0 Mar 19 12:05:29.994394 master-0 kubenswrapper[7642]: I0319 12:05:29.994379 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" event={"ID":"819c8a76-019b-4124-a67f-fb5838cbec5d","Type":"ContainerDied","Data":"171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464"} Mar 19 12:05:29.995300 master-0 kubenswrapper[7642]: I0319 12:05:29.995243 7642 scope.go:117] "RemoveContainer" containerID="171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464" Mar 19 12:05:30.168808 master-0 kubenswrapper[7642]: E0319 12:05:30.168734 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:05:30.973097 master-0 kubenswrapper[7642]: I0319 12:05:30.973017 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:30.973097 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:30.973097 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:30.973097 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:30.973949 master-0 kubenswrapper[7642]: I0319 12:05:30.973111 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:31.008118 master-0 kubenswrapper[7642]: I0319 12:05:31.008034 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" event={"ID":"819c8a76-019b-4124-a67f-fb5838cbec5d","Type":"ContainerStarted","Data":"f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf"} Mar 19 12:05:31.008930 master-0 kubenswrapper[7642]: I0319 12:05:31.008681 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:05:31.014079 master-0 kubenswrapper[7642]: I0319 12:05:31.014038 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:05:31.971918 master-0 kubenswrapper[7642]: I0319 12:05:31.971808 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:31.971918 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:31.971918 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:31.971918 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:31.972424 master-0 kubenswrapper[7642]: I0319 12:05:31.971940 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:32.971871 master-0 kubenswrapper[7642]: I0319 12:05:32.971819 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:32.971871 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:32.971871 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:32.971871 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:32.972709 master-0 kubenswrapper[7642]: I0319 12:05:32.972672 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:33.971268 master-0 kubenswrapper[7642]: I0319 12:05:33.971155 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:33.971268 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:33.971268 master-0 kubenswrapper[7642]: 
[+]process-running ok Mar 19 12:05:33.971268 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:33.971268 master-0 kubenswrapper[7642]: I0319 12:05:33.971247 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:34.972243 master-0 kubenswrapper[7642]: I0319 12:05:34.972149 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:34.972243 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:34.972243 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:34.972243 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:34.973393 master-0 kubenswrapper[7642]: I0319 12:05:34.972245 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:35.067718 master-0 kubenswrapper[7642]: I0319 12:05:35.067612 7642 scope.go:117] "RemoveContainer" containerID="950cc7d342e464c9618916dd6bab4b9166b6d0a8be1aaabd0b32d4831ce1a7b5" Mar 19 12:05:35.972536 master-0 kubenswrapper[7642]: I0319 12:05:35.972404 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:35.972536 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:35.972536 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 
12:05:35.972536 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:35.972536 master-0 kubenswrapper[7642]: I0319 12:05:35.972503 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:36.046480 master-0 kubenswrapper[7642]: I0319 12:05:36.046410 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/1.log" Mar 19 12:05:36.046770 master-0 kubenswrapper[7642]: I0319 12:05:36.046493 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerStarted","Data":"dbc335899765a665194d89fe620e023da77d1cd1af1bebd5c37e9898ef1b1c48"} Mar 19 12:05:36.974379 master-0 kubenswrapper[7642]: I0319 12:05:36.974290 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:36.974379 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:36.974379 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:36.974379 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:36.975253 master-0 kubenswrapper[7642]: I0319 12:05:36.974392 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:37.972205 master-0 kubenswrapper[7642]: I0319 12:05:37.972119 
7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:37.972205 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:37.972205 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:37.972205 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:37.972551 master-0 kubenswrapper[7642]: I0319 12:05:37.972211 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:38.135974 master-0 kubenswrapper[7642]: I0319 12:05:38.135837 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:05:38.136630 master-0 kubenswrapper[7642]: I0319 12:05:38.135994 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:05:38.136630 master-0 kubenswrapper[7642]: I0319 12:05:38.136102 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:05:38.137193 master-0 kubenswrapper[7642]: I0319 
12:05:38.137133 7642 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"7029d235236370ccb17a42944f5aa603314c5ddfc1ade5a5c80098d5b99e76af"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 12:05:38.137383 master-0 kubenswrapper[7642]: I0319 12:05:38.137304 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" containerID="cri-o://7029d235236370ccb17a42944f5aa603314c5ddfc1ade5a5c80098d5b99e76af" gracePeriod=30 Mar 19 12:05:38.971759 master-0 kubenswrapper[7642]: I0319 12:05:38.971665 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:38.971759 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:38.971759 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:38.971759 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:38.971759 master-0 kubenswrapper[7642]: I0319 12:05:38.971762 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:39.071442 master-0 kubenswrapper[7642]: I0319 12:05:39.071376 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/1.log" Mar 19 12:05:39.074457 
master-0 kubenswrapper[7642]: I0319 12:05:39.074432 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log" Mar 19 12:05:39.074580 master-0 kubenswrapper[7642]: I0319 12:05:39.074559 7642 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="7029d235236370ccb17a42944f5aa603314c5ddfc1ade5a5c80098d5b99e76af" exitCode=255 Mar 19 12:05:39.074772 master-0 kubenswrapper[7642]: I0319 12:05:39.074706 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"7029d235236370ccb17a42944f5aa603314c5ddfc1ade5a5c80098d5b99e76af"} Mar 19 12:05:39.074861 master-0 kubenswrapper[7642]: I0319 12:05:39.074827 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"8c823395d65117957b1dd6c27405e0e71b5374cf918b04d641ac0497b20abd8b"} Mar 19 12:05:39.074919 master-0 kubenswrapper[7642]: I0319 12:05:39.074894 7642 scope.go:117] "RemoveContainer" containerID="413966a39cd5b87df5d6bdc132b164ee7acc47dac3064d44c6d8a65871b50be7" Mar 19 12:05:39.299190 master-0 kubenswrapper[7642]: E0319 12:05:39.299038 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 12:05:39.374130 master-0 kubenswrapper[7642]: E0319 12:05:39.373937 7642 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline 
exceeded" event="&Event{ObjectMeta:{kube-controller-manager-master-0.189e3c4f819fe361 openshift-kube-controller-manager 12771 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-master-0,UID:91ced7d9a1e8a2019ac2b6a262f0d19e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:00:15 +0000 UTC,LastTimestamp:2026-03-19 12:03:02.691040722 +0000 UTC m=+540.808481509,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 19 12:05:39.972774 master-0 kubenswrapper[7642]: I0319 12:05:39.972691 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:39.972774 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:39.972774 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:39.972774 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:39.973393 master-0 kubenswrapper[7642]: I0319 12:05:39.973347 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:40.084709 master-0 kubenswrapper[7642]: I0319 12:05:40.084600 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/1.log" Mar 19 12:05:40.086503 master-0 
kubenswrapper[7642]: I0319 12:05:40.086455 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log" Mar 19 12:05:40.170174 master-0 kubenswrapper[7642]: E0319 12:05:40.169924 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:05:40.170174 master-0 kubenswrapper[7642]: E0319 12:05:40.169990 7642 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 12:05:40.972767 master-0 kubenswrapper[7642]: I0319 12:05:40.972690 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:40.972767 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:40.972767 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:40.972767 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:40.973347 master-0 kubenswrapper[7642]: I0319 12:05:40.972778 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:41.971670 master-0 kubenswrapper[7642]: I0319 12:05:41.971579 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 
12:05:41.971670 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:41.971670 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:41.971670 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:41.972047 master-0 kubenswrapper[7642]: I0319 12:05:41.971679 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:42.972506 master-0 kubenswrapper[7642]: I0319 12:05:42.972395 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:42.972506 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:42.972506 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:42.972506 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:42.972506 master-0 kubenswrapper[7642]: I0319 12:05:42.972498 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:43.972384 master-0 kubenswrapper[7642]: I0319 12:05:43.972256 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:43.972384 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:43.972384 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:43.972384 master-0 kubenswrapper[7642]: healthz 
check failed Mar 19 12:05:43.972384 master-0 kubenswrapper[7642]: I0319 12:05:43.972368 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:44.751462 master-0 kubenswrapper[7642]: E0319 12:05:44.751382 7642 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 12:05:44.972437 master-0 kubenswrapper[7642]: I0319 12:05:44.972353 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:44.972437 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:44.972437 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:44.972437 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:44.973365 master-0 kubenswrapper[7642]: I0319 12:05:44.972440 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:45.130687 master-0 kubenswrapper[7642]: I0319 12:05:45.130604 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"3cc7ba3003b9018ab0a4e14b7866663c2d4ae06af289a9d62d0178f2b00749cf"} Mar 19 12:05:45.134881 master-0 kubenswrapper[7642]: I0319 12:05:45.134832 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:05:45.135392 master-0 kubenswrapper[7642]: I0319 12:05:45.135345 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:05:45.971788 master-0 kubenswrapper[7642]: I0319 12:05:45.971677 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:45.971788 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:45.971788 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:45.971788 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:45.971788 master-0 kubenswrapper[7642]: I0319 12:05:45.971771 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:46.148441 master-0 kubenswrapper[7642]: I0319 12:05:46.148305 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"4bebd688c15bc9924a6633e91d629de13286a79430204ae499ef583b4192829b"} Mar 19 12:05:46.148441 master-0 kubenswrapper[7642]: I0319 12:05:46.148370 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"9fb4057cedb716012c53d5b5e73188baaa2d84c2bfe9a5b0b41a9d914c68c9a9"} Mar 19 12:05:46.148441 master-0 kubenswrapper[7642]: I0319 12:05:46.148386 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"006c51c13746449d97f89fd32f5c66c1fd844fcadae3ad681b363c4e8d85d73a"} Mar 19 12:05:46.148441 master-0 kubenswrapper[7642]: I0319 12:05:46.148395 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"afb68f1312817edfafcc486849026e0a22ee56001daacaa9fb327878eb9fdf35"} Mar 19 12:05:46.160394 master-0 kubenswrapper[7642]: I0319 12:05:46.150818 7642 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677" Mar 19 12:05:46.160394 master-0 kubenswrapper[7642]: I0319 12:05:46.150870 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677" Mar 19 12:05:46.972995 master-0 kubenswrapper[7642]: I0319 12:05:46.972863 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:46.972995 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:46.972995 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:46.972995 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:46.973499 master-0 kubenswrapper[7642]: I0319 12:05:46.973045 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:47.971426 master-0 kubenswrapper[7642]: I0319 12:05:47.971355 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:47.971426 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:47.971426 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:47.971426 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:47.972536 master-0 kubenswrapper[7642]: I0319 12:05:47.971511 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:48.135702 master-0 kubenswrapper[7642]: I0319 12:05:48.135537 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:05:48.136018 master-0 kubenswrapper[7642]: I0319 12:05:48.135717 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:05:48.972837 master-0 kubenswrapper[7642]: I0319 12:05:48.972693 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:48.972837 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:48.972837 master-0 
kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:48.972837 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:48.972837 master-0 kubenswrapper[7642]: I0319 12:05:48.972799 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:49.972933 master-0 kubenswrapper[7642]: I0319 12:05:49.972798 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:49.972933 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:49.972933 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:49.972933 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:49.973780 master-0 kubenswrapper[7642]: I0319 12:05:49.972949 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:50.972994 master-0 kubenswrapper[7642]: I0319 12:05:50.972876 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:50.972994 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:50.972994 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:50.972994 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:50.972994 master-0 kubenswrapper[7642]: I0319 12:05:50.972972 7642 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:51.093785 master-0 kubenswrapper[7642]: I0319 12:05:51.093632 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 19 12:05:51.093785 master-0 kubenswrapper[7642]: I0319 12:05:51.093745 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 19 12:05:51.972735 master-0 kubenswrapper[7642]: I0319 12:05:51.972670 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:51.972735 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:51.972735 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:51.972735 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:51.973762 master-0 kubenswrapper[7642]: I0319 12:05:51.972763 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:52.972173 master-0 kubenswrapper[7642]: I0319 12:05:52.972056 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:52.972173 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:52.972173 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:52.972173 
master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:52.972735 master-0 kubenswrapper[7642]: I0319 12:05:52.972276 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:53.972432 master-0 kubenswrapper[7642]: I0319 12:05:53.972341 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:53.972432 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:53.972432 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:53.972432 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:53.973010 master-0 kubenswrapper[7642]: I0319 12:05:53.972472 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:54.973100 master-0 kubenswrapper[7642]: I0319 12:05:54.972997 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:54.973100 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:54.973100 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:54.973100 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:54.974100 master-0 kubenswrapper[7642]: I0319 12:05:54.973099 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:55.971801 master-0 kubenswrapper[7642]: I0319 12:05:55.971739 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:55.971801 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:55.971801 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:55.971801 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:55.972533 master-0 kubenswrapper[7642]: I0319 12:05:55.972474 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:56.307895 master-0 kubenswrapper[7642]: E0319 12:05:56.305693 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="the server was unable to return a response in the time allotted, but may still be processing the request (get leases.coordination.k8s.io master-0)" interval="7s" Mar 19 12:05:56.972314 master-0 kubenswrapper[7642]: I0319 12:05:56.972244 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:56.972314 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:56.972314 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:56.972314 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:56.973397 master-0 
kubenswrapper[7642]: I0319 12:05:56.973355 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:57.247292 master-0 kubenswrapper[7642]: I0319 12:05:57.246890 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 12:05:57.971788 master-0 kubenswrapper[7642]: I0319 12:05:57.971596 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:57.971788 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:57.971788 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:57.971788 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:57.971788 master-0 kubenswrapper[7642]: I0319 12:05:57.971733 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:58.135022 master-0 kubenswrapper[7642]: I0319 12:05:58.134872 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:05:58.135365 master-0 kubenswrapper[7642]: I0319 12:05:58.135048 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:05:58.971904 master-0 kubenswrapper[7642]: I0319 12:05:58.971812 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:58.971904 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:58.971904 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:58.971904 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:58.972689 master-0 kubenswrapper[7642]: I0319 12:05:58.971926 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:05:59.971388 master-0 kubenswrapper[7642]: I0319 12:05:59.971321 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:05:59.971388 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:05:59.971388 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:05:59.971388 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:05:59.971733 master-0 kubenswrapper[7642]: I0319 12:05:59.971399 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:00.971396 master-0 kubenswrapper[7642]: I0319 12:06:00.971337 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:00.971396 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:00.971396 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:00.971396 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:00.972055 master-0 kubenswrapper[7642]: I0319 12:06:00.971421 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:01.119518 master-0 kubenswrapper[7642]: I0319 12:06:01.119434 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 19 12:06:01.973041 master-0 kubenswrapper[7642]: I0319 12:06:01.972954 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:01.973041 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:01.973041 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:01.973041 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:01.973606 master-0 kubenswrapper[7642]: I0319 12:06:01.973061 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:02.423005 master-0 kubenswrapper[7642]: I0319 12:06:02.422922 7642 status_manager.go:851] "Failed to get status for pod" podUID="44d258c9-f8b3-4233-847f-bc594a2aa4f9" pod="openshift-etcd/installer-2-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-2-master-0)" Mar 19 12:06:02.972785 master-0 kubenswrapper[7642]: I0319 12:06:02.972710 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:02.972785 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:02.972785 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:02.972785 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:02.973103 master-0 kubenswrapper[7642]: I0319 12:06:02.972811 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:03.972517 master-0 kubenswrapper[7642]: I0319 12:06:03.972458 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:03.972517 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:03.972517 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:03.972517 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:03.973300 master-0 
kubenswrapper[7642]: I0319 12:06:03.973246 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:04.972624 master-0 kubenswrapper[7642]: I0319 12:06:04.972538 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:04.972624 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:04.972624 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:04.972624 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:04.973145 master-0 kubenswrapper[7642]: I0319 12:06:04.972696 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:05.973448 master-0 kubenswrapper[7642]: I0319 12:06:05.973347 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:05.973448 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:05.973448 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:05.973448 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:05.974154 master-0 kubenswrapper[7642]: I0319 12:06:05.973477 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:06.120932 master-0 kubenswrapper[7642]: I0319 12:06:06.120865 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 19 12:06:06.332028 master-0 kubenswrapper[7642]: I0319 12:06:06.331899 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/2.log" Mar 19 12:06:06.332739 master-0 kubenswrapper[7642]: I0319 12:06:06.332620 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/1.log" Mar 19 12:06:06.332823 master-0 kubenswrapper[7642]: I0319 12:06:06.332761 7642 generic.go:334] "Generic (PLEG): container finished" podID="0c804ea0-3483-4506-b138-3bf30016d8d8" containerID="dbc335899765a665194d89fe620e023da77d1cd1af1bebd5c37e9898ef1b1c48" exitCode=1 Mar 19 12:06:06.332823 master-0 kubenswrapper[7642]: I0319 12:06:06.332809 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerDied","Data":"dbc335899765a665194d89fe620e023da77d1cd1af1bebd5c37e9898ef1b1c48"} Mar 19 12:06:06.332916 master-0 kubenswrapper[7642]: I0319 12:06:06.332858 7642 scope.go:117] "RemoveContainer" containerID="950cc7d342e464c9618916dd6bab4b9166b6d0a8be1aaabd0b32d4831ce1a7b5" Mar 19 12:06:06.333508 master-0 kubenswrapper[7642]: I0319 12:06:06.333453 7642 scope.go:117] "RemoveContainer" containerID="dbc335899765a665194d89fe620e023da77d1cd1af1bebd5c37e9898ef1b1c48" Mar 19 12:06:06.333829 master-0 kubenswrapper[7642]: E0319 12:06:06.333781 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-6dlpw_openshift-cluster-storage-operator(0c804ea0-3483-4506-b138-3bf30016d8d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" podUID="0c804ea0-3483-4506-b138-3bf30016d8d8" Mar 19 12:06:06.971869 master-0 kubenswrapper[7642]: I0319 12:06:06.971723 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:06.971869 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:06.971869 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:06.971869 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:06.971869 master-0 kubenswrapper[7642]: I0319 12:06:06.971819 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:07.339996 master-0 kubenswrapper[7642]: I0319 12:06:07.339878 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/2.log" Mar 19 12:06:07.973626 master-0 kubenswrapper[7642]: I0319 12:06:07.973522 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:07.973626 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:07.973626 master-0 
kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:07.973626 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:07.973626 master-0 kubenswrapper[7642]: I0319 12:06:07.973626 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:08.135352 master-0 kubenswrapper[7642]: I0319 12:06:08.135265 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:06:08.135352 master-0 kubenswrapper[7642]: I0319 12:06:08.135348 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:08.135706 master-0 kubenswrapper[7642]: I0319 12:06:08.135402 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:06:08.136264 master-0 kubenswrapper[7642]: I0319 12:06:08.136223 7642 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"8c823395d65117957b1dd6c27405e0e71b5374cf918b04d641ac0497b20abd8b"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, 
will be restarted" Mar 19 12:06:08.136373 master-0 kubenswrapper[7642]: I0319 12:06:08.136337 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" containerID="cri-o://8c823395d65117957b1dd6c27405e0e71b5374cf918b04d641ac0497b20abd8b" gracePeriod=30 Mar 19 12:06:08.349293 master-0 kubenswrapper[7642]: I0319 12:06:08.349203 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/2.log" Mar 19 12:06:08.350171 master-0 kubenswrapper[7642]: I0319 12:06:08.350008 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/1.log" Mar 19 12:06:08.351808 master-0 kubenswrapper[7642]: I0319 12:06:08.351778 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log" Mar 19 12:06:08.351887 master-0 kubenswrapper[7642]: I0319 12:06:08.351831 7642 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="8c823395d65117957b1dd6c27405e0e71b5374cf918b04d641ac0497b20abd8b" exitCode=255 Mar 19 12:06:08.351887 master-0 kubenswrapper[7642]: I0319 12:06:08.351869 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"8c823395d65117957b1dd6c27405e0e71b5374cf918b04d641ac0497b20abd8b"} Mar 19 12:06:08.352020 master-0 kubenswrapper[7642]: I0319 12:06:08.351908 7642 scope.go:117] "RemoveContainer" 
containerID="7029d235236370ccb17a42944f5aa603314c5ddfc1ade5a5c80098d5b99e76af" Mar 19 12:06:08.971323 master-0 kubenswrapper[7642]: I0319 12:06:08.971175 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:08.971323 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:08.971323 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:08.971323 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:08.971323 master-0 kubenswrapper[7642]: I0319 12:06:08.971269 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:09.364867 master-0 kubenswrapper[7642]: I0319 12:06:09.364704 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/2.log" Mar 19 12:06:09.367265 master-0 kubenswrapper[7642]: I0319 12:06:09.367224 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log" Mar 19 12:06:09.367347 master-0 kubenswrapper[7642]: I0319 12:06:09.367306 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc"} Mar 19 12:06:09.972282 master-0 kubenswrapper[7642]: I0319 12:06:09.972223 7642 patch_prober.go:28] interesting 
pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:09.972282 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:09.972282 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:09.972282 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:09.972616 master-0 kubenswrapper[7642]: I0319 12:06:09.972291 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:10.971486 master-0 kubenswrapper[7642]: I0319 12:06:10.971418 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:10.971486 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:10.971486 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:10.971486 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:10.972232 master-0 kubenswrapper[7642]: I0319 12:06:10.971501 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:11.971182 master-0 kubenswrapper[7642]: I0319 12:06:11.971083 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 
12:06:11.971182 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:11.971182 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:11.971182 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:11.971182 master-0 kubenswrapper[7642]: I0319 12:06:11.971177 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:12.971765 master-0 kubenswrapper[7642]: I0319 12:06:12.971679 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:12.971765 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:12.971765 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:12.971765 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:12.972348 master-0 kubenswrapper[7642]: I0319 12:06:12.971804 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:13.307786 master-0 kubenswrapper[7642]: E0319 12:06:13.307588 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 12:06:13.972438 master-0 kubenswrapper[7642]: I0319 12:06:13.972359 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:13.972438 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:13.972438 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:13.972438 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:13.973081 master-0 kubenswrapper[7642]: I0319 12:06:13.972456 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:14.972169 master-0 kubenswrapper[7642]: I0319 12:06:14.972057 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:14.972169 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:14.972169 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:14.972169 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:14.972169 master-0 kubenswrapper[7642]: I0319 12:06:14.972151 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:15.135178 master-0 kubenswrapper[7642]: I0319 12:06:15.134931 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:06:15.135178 master-0 kubenswrapper[7642]: I0319 12:06:15.135022 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:06:15.971448 master-0 kubenswrapper[7642]: I0319 12:06:15.971351 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:15.971448 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:15.971448 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:15.971448 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:15.971448 master-0 kubenswrapper[7642]: I0319 12:06:15.971441 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:16.139005 master-0 kubenswrapper[7642]: I0319 12:06:16.138927 7642 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:06:16.139513 master-0 kubenswrapper[7642]: I0319 12:06:16.139031 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8e27b7d086edf5d2cf47b703574641d8" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:16.139513 master-0 kubenswrapper[7642]: I0319 12:06:16.139044 7642 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 
container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:06:16.139513 master-0 kubenswrapper[7642]: I0319 12:06:16.139230 7642 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="8e27b7d086edf5d2cf47b703574641d8" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:16.972390 master-0 kubenswrapper[7642]: I0319 12:06:16.972303 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:16.972390 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:16.972390 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:16.972390 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:16.972767 master-0 kubenswrapper[7642]: I0319 12:06:16.972414 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:17.971965 master-0 kubenswrapper[7642]: I0319 12:06:17.971893 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:17.971965 master-0 kubenswrapper[7642]: [-]has-synced 
failed: reason withheld Mar 19 12:06:17.971965 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:17.971965 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:17.973055 master-0 kubenswrapper[7642]: I0319 12:06:17.971977 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:18.068802 master-0 kubenswrapper[7642]: I0319 12:06:18.068687 7642 scope.go:117] "RemoveContainer" containerID="dbc335899765a665194d89fe620e023da77d1cd1af1bebd5c37e9898ef1b1c48" Mar 19 12:06:18.069215 master-0 kubenswrapper[7642]: E0319 12:06:18.069145 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-6dlpw_openshift-cluster-storage-operator(0c804ea0-3483-4506-b138-3bf30016d8d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" podUID="0c804ea0-3483-4506-b138-3bf30016d8d8" Mar 19 12:06:18.136087 master-0 kubenswrapper[7642]: I0319 12:06:18.135978 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:06:18.136405 master-0 kubenswrapper[7642]: I0319 12:06:18.136088 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:18.971331 master-0 kubenswrapper[7642]: I0319 12:06:18.971096 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:18.971331 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:18.971331 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:18.971331 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:18.971757 master-0 kubenswrapper[7642]: I0319 12:06:18.971394 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:19.972205 master-0 kubenswrapper[7642]: I0319 12:06:19.972117 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:19.972205 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:19.972205 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:19.972205 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:19.973167 master-0 kubenswrapper[7642]: I0319 12:06:19.972230 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:20.153753 master-0 kubenswrapper[7642]: E0319 12:06:20.153659 
7642 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 19 12:06:20.450805 master-0 kubenswrapper[7642]: I0319 12:06:20.450727 7642 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677" Mar 19 12:06:20.450805 master-0 kubenswrapper[7642]: I0319 12:06:20.450781 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677" Mar 19 12:06:20.972510 master-0 kubenswrapper[7642]: I0319 12:06:20.972440 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:20.972510 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:20.972510 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:20.972510 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:20.973744 master-0 kubenswrapper[7642]: I0319 12:06:20.972508 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:21.972366 master-0 kubenswrapper[7642]: I0319 12:06:21.972272 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:21.972366 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:21.972366 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:21.972366 
master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:21.973333 master-0 kubenswrapper[7642]: I0319 12:06:21.972386 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:22.971224 master-0 kubenswrapper[7642]: I0319 12:06:22.971112 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:22.971224 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:22.971224 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:22.971224 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:22.971593 master-0 kubenswrapper[7642]: I0319 12:06:22.971246 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:23.972461 master-0 kubenswrapper[7642]: I0319 12:06:23.972367 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:23.972461 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:23.972461 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:23.972461 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:23.973412 master-0 kubenswrapper[7642]: I0319 12:06:23.972488 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:24.972797 master-0 kubenswrapper[7642]: I0319 12:06:24.972717 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:24.972797 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:24.972797 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:24.972797 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:24.974109 master-0 kubenswrapper[7642]: I0319 12:06:24.972808 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:25.497595 master-0 kubenswrapper[7642]: I0319 12:06:25.497428 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:06:25.972483 master-0 kubenswrapper[7642]: I0319 12:06:25.972384 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:25.972483 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:25.972483 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:25.972483 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:25.973853 master-0 kubenswrapper[7642]: I0319 12:06:25.972504 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:26.971847 master-0 kubenswrapper[7642]: I0319 12:06:26.971746 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:26.971847 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:26.971847 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:26.971847 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:26.972511 master-0 kubenswrapper[7642]: I0319 12:06:26.971879 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:27.971472 master-0 kubenswrapper[7642]: I0319 12:06:27.971413 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:27.971472 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:27.971472 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:27.971472 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:27.972594 master-0 kubenswrapper[7642]: I0319 12:06:27.971478 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:28.135064 
master-0 kubenswrapper[7642]: I0319 12:06:28.134919 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:06:28.135338 master-0 kubenswrapper[7642]: I0319 12:06:28.135099 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:28.973090 master-0 kubenswrapper[7642]: I0319 12:06:28.973012 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:28.973090 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:28.973090 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:28.973090 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:28.974231 master-0 kubenswrapper[7642]: I0319 12:06:28.973116 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:29.971325 master-0 kubenswrapper[7642]: I0319 12:06:29.971258 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:29.971325 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:29.971325 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:29.971325 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:29.972112 master-0 kubenswrapper[7642]: I0319 12:06:29.972052 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:30.309429 master-0 kubenswrapper[7642]: E0319 12:06:30.309126 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 19 12:06:30.972585 master-0 kubenswrapper[7642]: I0319 12:06:30.972479 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:30.972585 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:30.972585 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:30.972585 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:30.973354 master-0 kubenswrapper[7642]: I0319 12:06:30.973302 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:31.971594 master-0 kubenswrapper[7642]: I0319 12:06:31.971509 7642 
patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:31.971594 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:31.971594 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:31.971594 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:31.972393 master-0 kubenswrapper[7642]: I0319 12:06:31.971612 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:32.972202 master-0 kubenswrapper[7642]: I0319 12:06:32.972092 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:32.972202 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:32.972202 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:32.972202 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:32.972202 master-0 kubenswrapper[7642]: I0319 12:06:32.972187 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:33.068886 master-0 kubenswrapper[7642]: I0319 12:06:33.068833 7642 scope.go:117] "RemoveContainer" containerID="dbc335899765a665194d89fe620e023da77d1cd1af1bebd5c37e9898ef1b1c48" Mar 19 12:06:33.568336 master-0 kubenswrapper[7642]: I0319 12:06:33.568137 7642 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/2.log" Mar 19 12:06:33.568336 master-0 kubenswrapper[7642]: I0319 12:06:33.568242 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerStarted","Data":"a8932e5c8c4dcad1c7f461d22290ccccb070a0be05cbda27e90810a35028dcec"} Mar 19 12:06:33.972206 master-0 kubenswrapper[7642]: I0319 12:06:33.972060 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:33.972206 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:33.972206 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:33.972206 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:33.972206 master-0 kubenswrapper[7642]: I0319 12:06:33.972191 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:34.971863 master-0 kubenswrapper[7642]: I0319 12:06:34.971798 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:34.971863 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:34.971863 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:34.971863 master-0 kubenswrapper[7642]: healthz check failed Mar 19 
12:06:34.972404 master-0 kubenswrapper[7642]: I0319 12:06:34.972360 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:35.972810 master-0 kubenswrapper[7642]: I0319 12:06:35.972694 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:35.972810 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:35.972810 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:35.972810 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:35.972810 master-0 kubenswrapper[7642]: I0319 12:06:35.972793 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:36.979414 master-0 kubenswrapper[7642]: I0319 12:06:36.979333 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:36.979414 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:36.979414 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:36.979414 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:36.980690 master-0 kubenswrapper[7642]: I0319 12:06:36.979441 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:37.972508 master-0 kubenswrapper[7642]: I0319 12:06:37.972419 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:06:37.972508 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:06:37.972508 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:06:37.972508 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:06:37.973039 master-0 kubenswrapper[7642]: I0319 12:06:37.972529 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:06:38.135206 master-0 kubenswrapper[7642]: I0319 12:06:38.135055 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:06:38.135206 master-0 kubenswrapper[7642]: I0319 12:06:38.135167 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:06:38.135206 master-0 kubenswrapper[7642]: I0319 12:06:38.135235 
7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:06:38.136471 master-0 kubenswrapper[7642]: I0319 12:06:38.136276 7642 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 12:06:38.136471 master-0 kubenswrapper[7642]: I0319 12:06:38.136382 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" containerID="cri-o://0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc" gracePeriod=30 Mar 19 12:06:38.272596 master-0 kubenswrapper[7642]: E0319 12:06:38.272535 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" Mar 19 12:06:38.614261 master-0 kubenswrapper[7642]: I0319 12:06:38.614113 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/3.log" Mar 19 12:06:38.614652 master-0 kubenswrapper[7642]: I0319 12:06:38.614600 7642 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/2.log"
Mar 19 12:06:38.616867 master-0 kubenswrapper[7642]: I0319 12:06:38.616829 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log"
Mar 19 12:06:38.616940 master-0 kubenswrapper[7642]: I0319 12:06:38.616886 7642 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc" exitCode=255
Mar 19 12:06:38.616991 master-0 kubenswrapper[7642]: I0319 12:06:38.616951 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc"}
Mar 19 12:06:38.617089 master-0 kubenswrapper[7642]: I0319 12:06:38.617048 7642 scope.go:117] "RemoveContainer" containerID="8c823395d65117957b1dd6c27405e0e71b5374cf918b04d641ac0497b20abd8b"
Mar 19 12:06:38.617779 master-0 kubenswrapper[7642]: I0319 12:06:38.617738 7642 scope.go:117] "RemoveContainer" containerID="0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc"
Mar 19 12:06:38.618029 master-0 kubenswrapper[7642]: E0319 12:06:38.617998 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:06:38.972650 master-0 kubenswrapper[7642]: I0319 12:06:38.972536 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:06:38.972650 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:06:38.972650 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:06:38.972650 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:06:38.973003 master-0 kubenswrapper[7642]: I0319 12:06:38.972679 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:06:39.627070 master-0 kubenswrapper[7642]: I0319 12:06:39.626991 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/3.log"
Mar 19 12:06:39.629169 master-0 kubenswrapper[7642]: I0319 12:06:39.629123 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log"
Mar 19 12:06:39.972241 master-0 kubenswrapper[7642]: I0319 12:06:39.972191 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:06:39.972241 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:06:39.972241 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:06:39.972241 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:06:39.972687 master-0 kubenswrapper[7642]: I0319 12:06:39.972655 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:06:40.971960 master-0 kubenswrapper[7642]: I0319 12:06:40.971889 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:06:40.971960 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:06:40.971960 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:06:40.971960 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:06:40.973028 master-0 kubenswrapper[7642]: I0319 12:06:40.971972 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:06:41.972426 master-0 kubenswrapper[7642]: I0319 12:06:41.972336 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:06:41.972426 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:06:41.972426 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:06:41.972426 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:06:41.973700 master-0 kubenswrapper[7642]: I0319 12:06:41.972432 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:06:42.971948 master-0 kubenswrapper[7642]: I0319 12:06:42.971844 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:06:42.971948 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:06:42.971948 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:06:42.971948 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:06:42.972276 master-0 kubenswrapper[7642]: I0319 12:06:42.971977 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:06:43.971737 master-0 kubenswrapper[7642]: I0319 12:06:43.971628 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:06:43.971737 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:06:43.971737 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:06:43.971737 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:06:43.972497 master-0 kubenswrapper[7642]: I0319 12:06:43.971760 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:06:44.972034 master-0 kubenswrapper[7642]: I0319 12:06:44.971944 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:06:44.972034 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:06:44.972034 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:06:44.972034 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:06:44.972800 master-0 kubenswrapper[7642]: I0319 12:06:44.972084 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:06:45.135366 master-0 kubenswrapper[7642]: I0319 12:06:45.135275 7642 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:06:45.136534 master-0 kubenswrapper[7642]: I0319 12:06:45.136479 7642 scope.go:117] "RemoveContainer" containerID="0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc"
Mar 19 12:06:45.137133 master-0 kubenswrapper[7642]: E0319 12:06:45.137075 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:06:45.972183 master-0 kubenswrapper[7642]: I0319 12:06:45.972122 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:06:45.972183 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:06:45.972183 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:06:45.972183 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:06:45.973001 master-0 kubenswrapper[7642]: I0319 12:06:45.972199 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:06:46.972814 master-0 kubenswrapper[7642]: I0319 12:06:46.972695 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:06:46.972814 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:06:46.972814 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:06:46.972814 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:06:46.973943 master-0 kubenswrapper[7642]: I0319 12:06:46.972827 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:06:47.310928 master-0 kubenswrapper[7642]: E0319 12:06:47.310572 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 19 12:06:47.972494 master-0 kubenswrapper[7642]: I0319 12:06:47.972427 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:06:47.972494 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:06:47.972494 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:06:47.972494 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:06:47.972869 master-0 kubenswrapper[7642]: I0319 12:06:47.972491 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:06:47.972869 master-0 kubenswrapper[7642]: I0319 12:06:47.972544 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:06:47.973312 master-0 kubenswrapper[7642]: I0319 12:06:47.973275 7642 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d"} pod="openshift-ingress/router-default-7dcf5569b5-47tqf" containerMessage="Container router failed startup probe, will be restarted"
Mar 19 12:06:47.973370 master-0 kubenswrapper[7642]: I0319 12:06:47.973328 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" containerID="cri-o://246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d" gracePeriod=3600
Mar 19 12:06:54.453679 master-0 kubenswrapper[7642]: E0319 12:06:54.453600 7642 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 19 12:06:57.760121 master-0 kubenswrapper[7642]: I0319 12:06:57.760031 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-b865698dc-czzvw_3f3c0234-248a-4d6f-9aca-6582a481a911/service-ca-operator/2.log"
Mar 19 12:06:57.761056 master-0 kubenswrapper[7642]: I0319 12:06:57.760543 7642 generic.go:334] "Generic (PLEG): container finished" podID="3f3c0234-248a-4d6f-9aca-6582a481a911" containerID="081b3f4684461729e6f698c3195bf7ea8483b6807bf70589a3fc1e634fdb1c16" exitCode=255
Mar 19 12:06:57.761056 master-0 kubenswrapper[7642]: I0319 12:06:57.760591 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerDied","Data":"081b3f4684461729e6f698c3195bf7ea8483b6807bf70589a3fc1e634fdb1c16"}
Mar 19 12:06:57.761056 master-0 kubenswrapper[7642]: I0319 12:06:57.760654 7642 scope.go:117] "RemoveContainer" containerID="a5cb97fd211a0ea441e72f2a15543fd5127539a45c24f3604c7d9c65dbb4100e"
Mar 19 12:06:57.761267 master-0 kubenswrapper[7642]: I0319 12:06:57.761239 7642 scope.go:117] "RemoveContainer" containerID="081b3f4684461729e6f698c3195bf7ea8483b6807bf70589a3fc1e634fdb1c16"
Mar 19 12:06:57.761676 master-0 kubenswrapper[7642]: E0319 12:06:57.761592 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=service-ca-operator pod=service-ca-operator-b865698dc-czzvw_openshift-service-ca-operator(3f3c0234-248a-4d6f-9aca-6582a481a911)\"" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" podUID="3f3c0234-248a-4d6f-9aca-6582a481a911"
Mar 19 12:06:58.769691 master-0 kubenswrapper[7642]: I0319 12:06:58.769596 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-cwncj_e90c10f6-36d9-44bd-a4c6-174f12135434/cluster-baremetal-operator/1.log"
Mar 19 12:06:58.770795 master-0 kubenswrapper[7642]: I0319 12:06:58.770757 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-cwncj_e90c10f6-36d9-44bd-a4c6-174f12135434/cluster-baremetal-operator/0.log"
Mar 19 12:06:58.770881 master-0 kubenswrapper[7642]: I0319 12:06:58.770812 7642 generic.go:334] "Generic (PLEG): container finished" podID="e90c10f6-36d9-44bd-a4c6-174f12135434" containerID="3ab6e11a56aeea1e1090df7b641d026b31da61efe18c78570e3645eea987cc4a" exitCode=1
Mar 19 12:06:58.770881 master-0 kubenswrapper[7642]: I0319 12:06:58.770852 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerDied","Data":"3ab6e11a56aeea1e1090df7b641d026b31da61efe18c78570e3645eea987cc4a"}
Mar 19 12:06:58.771006 master-0 kubenswrapper[7642]: I0319 12:06:58.770912 7642 scope.go:117] "RemoveContainer" containerID="cf537f49e2a1b4ab25a2740d2ec9131fff4692e5f665c069887b93bcb77ec276"
Mar 19 12:06:58.771508 master-0 kubenswrapper[7642]: I0319 12:06:58.771463 7642 scope.go:117] "RemoveContainer" containerID="3ab6e11a56aeea1e1090df7b641d026b31da61efe18c78570e3645eea987cc4a"
Mar 19 12:06:58.771814 master-0 kubenswrapper[7642]: E0319 12:06:58.771771 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-6f69995874-cwncj_openshift-machine-api(e90c10f6-36d9-44bd-a4c6-174f12135434)\"" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" podUID="e90c10f6-36d9-44bd-a4c6-174f12135434"
Mar 19 12:06:58.774541 master-0 kubenswrapper[7642]: I0319 12:06:58.774459 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-b865698dc-czzvw_3f3c0234-248a-4d6f-9aca-6582a481a911/service-ca-operator/2.log"
Mar 19 12:06:59.068323 master-0 kubenswrapper[7642]: I0319 12:06:59.068187 7642 scope.go:117] "RemoveContainer" containerID="0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc"
Mar 19 12:06:59.068503 master-0 kubenswrapper[7642]: E0319 12:06:59.068400 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:06:59.781538 master-0 kubenswrapper[7642]: I0319 12:06:59.781466 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-cwncj_e90c10f6-36d9-44bd-a4c6-174f12135434/cluster-baremetal-operator/1.log"
Mar 19 12:07:01.482908 master-0 kubenswrapper[7642]: E0319 12:07:01.482810 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:06:51Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:06:51Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:06:51Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:06:51Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:07:02.424295 master-0 kubenswrapper[7642]: I0319 12:07:02.424234 7642 status_manager.go:851] "Failed to get status for pod" podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods ingress-operator-66b84d69b-59ptg)"
Mar 19 12:07:03.808480 master-0 kubenswrapper[7642]: I0319 12:07:03.808361 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/3.log"
Mar 19 12:07:03.809575 master-0 kubenswrapper[7642]: I0319 12:07:03.809523 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/2.log"
Mar 19 12:07:03.809666 master-0 kubenswrapper[7642]: I0319 12:07:03.809589 7642 generic.go:334] "Generic (PLEG): container finished" podID="0c804ea0-3483-4506-b138-3bf30016d8d8" containerID="a8932e5c8c4dcad1c7f461d22290ccccb070a0be05cbda27e90810a35028dcec" exitCode=1
Mar 19 12:07:03.809666 master-0 kubenswrapper[7642]: I0319 12:07:03.809627 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerDied","Data":"a8932e5c8c4dcad1c7f461d22290ccccb070a0be05cbda27e90810a35028dcec"}
Mar 19 12:07:03.809776 master-0 kubenswrapper[7642]: I0319 12:07:03.809688 7642 scope.go:117] "RemoveContainer" containerID="dbc335899765a665194d89fe620e023da77d1cd1af1bebd5c37e9898ef1b1c48"
Mar 19 12:07:03.810982 master-0 kubenswrapper[7642]: I0319 12:07:03.810918 7642 scope.go:117] "RemoveContainer" containerID="a8932e5c8c4dcad1c7f461d22290ccccb070a0be05cbda27e90810a35028dcec"
Mar 19 12:07:03.811835 master-0 kubenswrapper[7642]: E0319 12:07:03.811763 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-6dlpw_openshift-cluster-storage-operator(0c804ea0-3483-4506-b138-3bf30016d8d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" podUID="0c804ea0-3483-4506-b138-3bf30016d8d8"
Mar 19 12:07:04.312026 master-0 kubenswrapper[7642]: E0319 12:07:04.311960 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 19 12:07:04.818994 master-0 kubenswrapper[7642]: I0319 12:07:04.818924 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/3.log"
Mar 19 12:07:11.484844 master-0 kubenswrapper[7642]: E0319 12:07:11.483261 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:07:12.071268 master-0 kubenswrapper[7642]: I0319 12:07:12.071210 7642 scope.go:117] "RemoveContainer" containerID="081b3f4684461729e6f698c3195bf7ea8483b6807bf70589a3fc1e634fdb1c16"
Mar 19 12:07:12.875496 master-0 kubenswrapper[7642]: I0319 12:07:12.875335 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-b865698dc-czzvw_3f3c0234-248a-4d6f-9aca-6582a481a911/service-ca-operator/2.log"
Mar 19 12:07:12.875496 master-0 kubenswrapper[7642]: I0319 12:07:12.875441 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerStarted","Data":"9ec4a3ca5402b255a5776345ac4dec708163999648880ceab54090ff9aba763c"}
Mar 19 12:07:13.068178 master-0 kubenswrapper[7642]: I0319 12:07:13.068116 7642 scope.go:117] "RemoveContainer" containerID="0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc"
Mar 19 12:07:13.068487 master-0 kubenswrapper[7642]: I0319 12:07:13.068248 7642 scope.go:117] "RemoveContainer" containerID="3ab6e11a56aeea1e1090df7b641d026b31da61efe18c78570e3645eea987cc4a"
Mar 19 12:07:13.068487 master-0 kubenswrapper[7642]: E0319 12:07:13.068388 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:07:13.884766 master-0 kubenswrapper[7642]: I0319 12:07:13.884725 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-cwncj_e90c10f6-36d9-44bd-a4c6-174f12135434/cluster-baremetal-operator/1.log"
Mar 19 12:07:13.885706 master-0 kubenswrapper[7642]: I0319 12:07:13.885679 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerStarted","Data":"a7d2563fae71ee703cf5765c15eba0a6c9d82c859e7cfd778f661a109011742f"}
Mar 19 12:07:16.068046 master-0 kubenswrapper[7642]: I0319 12:07:16.067973 7642 scope.go:117] "RemoveContainer" containerID="a8932e5c8c4dcad1c7f461d22290ccccb070a0be05cbda27e90810a35028dcec"
Mar 19 12:07:16.068718 master-0 kubenswrapper[7642]: E0319 12:07:16.068267 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-6dlpw_openshift-cluster-storage-operator(0c804ea0-3483-4506-b138-3bf30016d8d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" podUID="0c804ea0-3483-4506-b138-3bf30016d8d8"
Mar 19 12:07:21.314054 master-0 kubenswrapper[7642]: E0319 12:07:21.313954 7642 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 19 12:07:21.484135 master-0 kubenswrapper[7642]: E0319 12:07:21.484069 7642 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 12:07:22.942141 master-0 kubenswrapper[7642]: I0319 12:07:22.942046 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-vt498_162b513f-2f67-4946-816b-48f6232dfca0/authentication-operator/1.log"
Mar 19 12:07:22.943157 master-0 kubenswrapper[7642]: I0319 12:07:22.942891 7642 generic.go:334] "Generic (PLEG): container finished" podID="162b513f-2f67-4946-816b-48f6232dfca0" containerID="2ae789390c2dc2661b322d4ce95cf71fe928a1384e00118bd9c25a6d60f35352" exitCode=1
Mar 19 12:07:22.943157 master-0 kubenswrapper[7642]: I0319 12:07:22.942976 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" event={"ID":"162b513f-2f67-4946-816b-48f6232dfca0","Type":"ContainerDied","Data":"2ae789390c2dc2661b322d4ce95cf71fe928a1384e00118bd9c25a6d60f35352"}
Mar 19 12:07:22.943157 master-0 kubenswrapper[7642]: I0319 12:07:22.943051 7642 scope.go:117] "RemoveContainer" containerID="a82bb27ad062e4cf63ae115331a67a108d2b42bc563ab5c88f2c0b48ab45fdcd"
Mar 19 12:07:22.944037 master-0 kubenswrapper[7642]: I0319 12:07:22.943974 7642 scope.go:117] "RemoveContainer" containerID="2ae789390c2dc2661b322d4ce95cf71fe928a1384e00118bd9c25a6d60f35352"
Mar 19 12:07:23.952102 master-0 kubenswrapper[7642]: I0319 12:07:23.952005 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-vt498_162b513f-2f67-4946-816b-48f6232dfca0/authentication-operator/1.log"
Mar 19 12:07:23.952102 master-0 kubenswrapper[7642]: I0319 12:07:23.952095 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" event={"ID":"162b513f-2f67-4946-816b-48f6232dfca0","Type":"ContainerStarted","Data":"416d98dbc9819bb0bc11ccd6915d47895190d1c6a05b913f67a9dc85ee34a4a3"}
Mar 19 12:07:24.068480 master-0 kubenswrapper[7642]: I0319 12:07:24.068416 7642 scope.go:117] "RemoveContainer" containerID="0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc"
Mar 19 12:07:24.962262 master-0 kubenswrapper[7642]: I0319 12:07:24.962183 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/3.log"
Mar 19 12:07:24.963783 master-0 kubenswrapper[7642]: I0319 12:07:24.963721 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log"
Mar 19 12:07:24.963947 master-0 kubenswrapper[7642]: I0319 12:07:24.963789 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9"}
Mar 19 12:07:25.134684 master-0 kubenswrapper[7642]: I0319 12:07:25.134608 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:07:25.134684 master-0 kubenswrapper[7642]: I0319 12:07:25.134690 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:07:25.977164 master-0 kubenswrapper[7642]: I0319 12:07:25.977076 7642 generic.go:334] "Generic (PLEG): container finished" podID="1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6" containerID="285371d8dc1391b2aefa34f9aa23ab25dae7de5c3d27157be1b42472a35ec621" exitCode=0
Mar 19 12:07:25.978285 master-0 kubenswrapper[7642]: I0319 12:07:25.977157 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" event={"ID":"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6","Type":"ContainerDied","Data":"285371d8dc1391b2aefa34f9aa23ab25dae7de5c3d27157be1b42472a35ec621"}
Mar 19 12:07:25.978285 master-0 kubenswrapper[7642]: I0319 12:07:25.977788 7642 scope.go:117] "RemoveContainer" containerID="285371d8dc1391b2aefa34f9aa23ab25dae7de5c3d27157be1b42472a35ec621"
Mar 19 12:07:25.981243 master-0 kubenswrapper[7642]: I0319 12:07:25.981146 7642 generic.go:334] "Generic (PLEG): container finished" podID="aab47a39-a18b-4350-b141-4d5de2d8e96f" containerID="3385d26311ce81e00023dfaf9b5057451bdc1320414436c96c2f92229792ca5f" exitCode=0
Mar 19 12:07:25.981438 master-0 kubenswrapper[7642]: I0319 12:07:25.981251 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" event={"ID":"aab47a39-a18b-4350-b141-4d5de2d8e96f","Type":"ContainerDied","Data":"3385d26311ce81e00023dfaf9b5057451bdc1320414436c96c2f92229792ca5f"}
Mar 19 12:07:25.981438 master-0 kubenswrapper[7642]: I0319 12:07:25.981285 7642 scope.go:117] "RemoveContainer" containerID="7ed370adba82a4b171cad234b3d380f3d393808daf0a0a8fd7c7e46ac33f3ee6"
Mar 19 12:07:25.981864 master-0 kubenswrapper[7642]: I0319 12:07:25.981844 7642 scope.go:117] "RemoveContainer" containerID="3385d26311ce81e00023dfaf9b5057451bdc1320414436c96c2f92229792ca5f"
Mar 19 12:07:25.986032 master-0 kubenswrapper[7642]: I0319 12:07:25.985962 7642 generic.go:334] "Generic (PLEG): container finished" podID="6d017409-a362-4e58-87af-d02b3834fdd4" containerID="016ff43860287ef9ea9c7834d6813f132931ea72467cbcc206c5b8bf5a2a1e8b" exitCode=0
Mar 19 12:07:25.986180 master-0 kubenswrapper[7642]: I0319 12:07:25.986063 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" event={"ID":"6d017409-a362-4e58-87af-d02b3834fdd4","Type":"ContainerDied","Data":"016ff43860287ef9ea9c7834d6813f132931ea72467cbcc206c5b8bf5a2a1e8b"}
Mar 19 12:07:25.986789 master-0 kubenswrapper[7642]: I0319 12:07:25.986711 7642 scope.go:117] "RemoveContainer" containerID="016ff43860287ef9ea9c7834d6813f132931ea72467cbcc206c5b8bf5a2a1e8b"
Mar 19 12:07:25.989405 master-0 kubenswrapper[7642]: I0319 12:07:25.989355 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-d65958b8-ptjr9_84e5577e-df8b-46a2-aff7-0bf90b5010d9/openshift-apiserver-operator/1.log"
Mar 19 12:07:25.989557 master-0 kubenswrapper[7642]: I0319 12:07:25.989408 7642 generic.go:334] "Generic (PLEG): container finished" podID="84e5577e-df8b-46a2-aff7-0bf90b5010d9" containerID="d743fbff3bc780b360a4927f7a272f0fb5047ed94867c7848a01cf51863441ae" exitCode=0
Mar 19 12:07:25.989557 master-0 kubenswrapper[7642]: I0319 12:07:25.989463 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerDied","Data":"d743fbff3bc780b360a4927f7a272f0fb5047ed94867c7848a01cf51863441ae"}
Mar 19 12:07:25.989828 master-0 kubenswrapper[7642]: I0319 12:07:25.989788 7642 scope.go:117] "RemoveContainer" containerID="d743fbff3bc780b360a4927f7a272f0fb5047ed94867c7848a01cf51863441ae"
Mar 19 12:07:25.997221 master-0 kubenswrapper[7642]: I0319 12:07:25.996656 7642 generic.go:334] "Generic (PLEG): container finished" podID="eb274dd5-e98e-4485-a050-7bec560a0daa" containerID="ecee444cb43be9a1ddd0e2f81e61eca30a1f71fa25000dc8d0f2c58cd47a25c6" exitCode=0
Mar 19 12:07:25.997221 master-0 kubenswrapper[7642]: I0319 12:07:25.996767 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" event={"ID":"eb274dd5-e98e-4485-a050-7bec560a0daa","Type":"ContainerDied","Data":"ecee444cb43be9a1ddd0e2f81e61eca30a1f71fa25000dc8d0f2c58cd47a25c6"}
Mar 19 12:07:25.997892 master-0 kubenswrapper[7642]: I0319 12:07:25.997839 7642 scope.go:117] "RemoveContainer" containerID="ecee444cb43be9a1ddd0e2f81e61eca30a1f71fa25000dc8d0f2c58cd47a25c6"
Mar 19 12:07:26.006835 master-0 kubenswrapper[7642]: I0319 12:07:26.006623 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-ff989d6cc-8xkv5_ac946049-4b93-4d8c-bfa6-eacf83160ed1/kube-controller-manager-operator/1.log"
Mar 19 12:07:26.006835 master-0 kubenswrapper[7642]: I0319 12:07:26.006741 7642 generic.go:334] "Generic (PLEG): container finished" podID="ac946049-4b93-4d8c-bfa6-eacf83160ed1" containerID="c8521feb576750f4b4cf92dbf1c496a19af50a44e812267935ec3c28f71bb348" exitCode=0
Mar 19 12:07:26.006966 master-0 kubenswrapper[7642]: I0319 12:07:26.006846 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerDied","Data":"c8521feb576750f4b4cf92dbf1c496a19af50a44e812267935ec3c28f71bb348"}
Mar 19 12:07:26.007460 master-0 kubenswrapper[7642]: I0319 12:07:26.007412 7642 scope.go:117] "RemoveContainer" containerID="c8521feb576750f4b4cf92dbf1c496a19af50a44e812267935ec3c28f71bb348"
Mar 19 12:07:26.023115 master-0 kubenswrapper[7642]: I0319 12:07:26.022989 7642 scope.go:117] "RemoveContainer" containerID="57848ee43bd88e9ed1ccc7c15c3a6279821a813bcb2340426181dccb4cafcc2a"
Mar 19 12:07:26.026203 master-0 kubenswrapper[7642]: I0319 12:07:26.026151 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8c94f4649-q46ht_d263ddac-1ede-474f-b3bb-1f1f4f7846a3/openshift-controller-manager-operator/0.log"
Mar 19 12:07:26.026385 master-0 kubenswrapper[7642]: I0319 12:07:26.026253 7642 generic.go:334] "Generic (PLEG): container finished" podID="d263ddac-1ede-474f-b3bb-1f1f4f7846a3" containerID="7fc3d4238bc0fa4082c65174aac64034c89fc65c891756149ed46eacde6bb87a" exitCode=0
Mar 19 12:07:26.026505 master-0 kubenswrapper[7642]: I0319 12:07:26.026387 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" event={"ID":"d263ddac-1ede-474f-b3bb-1f1f4f7846a3","Type":"ContainerDied","Data":"7fc3d4238bc0fa4082c65174aac64034c89fc65c891756149ed46eacde6bb87a"}
Mar 19 12:07:26.028291 master-0 kubenswrapper[7642]: I0319 12:07:26.027610 7642 scope.go:117] "RemoveContainer" containerID="7fc3d4238bc0fa4082c65174aac64034c89fc65c891756149ed46eacde6bb87a"
Mar 19 12:07:26.032137 master-0 kubenswrapper[7642]: I0319 12:07:26.030615 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7bd846bfc4-5qtx7_b0133cb5-e425-40af-a051-71b2362a89c3/network-operator/1.log"
Mar 19 12:07:26.032137 master-0 kubenswrapper[7642]: I0319 12:07:26.030710 7642 generic.go:334] "Generic (PLEG): container finished" podID="b0133cb5-e425-40af-a051-71b2362a89c3" containerID="a0078d1886890ad6f9b53500e856885a52b81c9832530c9463ebafbbef1dc099" exitCode=0
Mar 19 12:07:26.032137 master-0 kubenswrapper[7642]: I0319 12:07:26.030793 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerDied","Data":"a0078d1886890ad6f9b53500e856885a52b81c9832530c9463ebafbbef1dc099"}
Mar 19 12:07:26.032137 master-0 kubenswrapper[7642]: I0319 12:07:26.031556 7642 scope.go:117] "RemoveContainer" containerID="a0078d1886890ad6f9b53500e856885a52b81c9832530c9463ebafbbef1dc099"
Mar 19 12:07:26.040462 master-0 kubenswrapper[7642]: I0319 12:07:26.040390 7642 generic.go:334] "Generic (PLEG): container finished" podID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerID="da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b" exitCode=0
Mar 19 12:07:26.040630 master-0 kubenswrapper[7642]: I0319 12:07:26.040573 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" event={"ID":"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc","Type":"ContainerDied","Data":"da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b"}
Mar 19 12:07:26.041531 master-0 kubenswrapper[7642]: I0319 12:07:26.041492 7642 scope.go:117] "RemoveContainer" containerID="da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b"
Mar 19 12:07:26.096938 master-0 kubenswrapper[7642]: I0319 12:07:26.096267 7642 scope.go:117] "RemoveContainer" containerID="6ec837485683b9ef9f8903f02c33b71f9f384c302a364201ed3b85fde2544f12"
Mar 19 12:07:26.175720 master-0 kubenswrapper[7642]: I0319 12:07:26.173888 7642 scope.go:117] "RemoveContainer" containerID="8599681efde4807c1ee51fb15ce18a51bf6eae443d3b45fdd6b7cd611d18283f"
Mar 19 12:07:26.290662 master-0 kubenswrapper[7642]: I0319 12:07:26.290170 7642 scope.go:117] "RemoveContainer" containerID="1bd7cb8f485e09b49dad795958f587384c11e7992bcea8c49f1cbb606f9399d9"
Mar 19 12:07:26.333580 master-0 kubenswrapper[7642]: I0319 12:07:26.333547 7642 scope.go:117] "RemoveContainer" containerID="0cc3ae6879b287fcc4bae5ddccb1f2e7e024042e8efb6dfd62d69b2efa286229"
Mar 19 12:07:27.049996 master-0 kubenswrapper[7642]: I0319 12:07:27.049939
7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" event={"ID":"eb274dd5-e98e-4485-a050-7bec560a0daa","Type":"ContainerStarted","Data":"97ebb249079488276478fc372daed13b36528171102a7e1e1aab77ef4823b2a5"} Mar 19 12:07:27.052400 master-0 kubenswrapper[7642]: I0319 12:07:27.052362 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" event={"ID":"d263ddac-1ede-474f-b3bb-1f1f4f7846a3","Type":"ContainerStarted","Data":"6af67af236dd0969ccd453e6a767e729b40cda7e647e2cedf72b5516dcb3f553"} Mar 19 12:07:27.055493 master-0 kubenswrapper[7642]: I0319 12:07:27.055448 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerStarted","Data":"91d2d9fbd93675cb6b8588342c80fd98749d4864eec3bdb5cf0aed0f65fff704"} Mar 19 12:07:27.058063 master-0 kubenswrapper[7642]: I0319 12:07:27.058013 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" event={"ID":"aab47a39-a18b-4350-b141-4d5de2d8e96f","Type":"ContainerStarted","Data":"2c48d09a5f376ea01a7df4083c720e54288e0edcd8e095a1b16617d5d890a246"} Mar 19 12:07:27.064475 master-0 kubenswrapper[7642]: I0319 12:07:27.064418 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerStarted","Data":"890c19caa46394b533dbc648540873c8fe9d3b463f97ad09cc30eec2d6354bff"} Mar 19 12:07:27.067669 master-0 kubenswrapper[7642]: I0319 12:07:27.067592 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" 
event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerStarted","Data":"ab7fdc4501825667c1d30afdaf806675afdeb81ef5c5029c7fc21f6a98742e71"} Mar 19 12:07:27.070079 master-0 kubenswrapper[7642]: I0319 12:07:27.070042 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" event={"ID":"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc","Type":"ContainerStarted","Data":"6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72"} Mar 19 12:07:27.070831 master-0 kubenswrapper[7642]: I0319 12:07:27.070811 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:07:27.072986 master-0 kubenswrapper[7642]: I0319 12:07:27.072964 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" event={"ID":"6d017409-a362-4e58-87af-d02b3834fdd4","Type":"ContainerStarted","Data":"54294092abe32456d3b18b9bdddc46fd2b3ebcf51048219020a08440353bf3ae"} Mar 19 12:07:27.078038 master-0 kubenswrapper[7642]: I0319 12:07:27.077579 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" event={"ID":"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6","Type":"ContainerStarted","Data":"9b8bb7056828913959d244048b2d339d012a91e455be453f7da59e9b6a47621a"} Mar 19 12:07:28.071873 master-0 kubenswrapper[7642]: I0319 12:07:28.071778 7642 patch_prober.go:28] interesting pod/route-controller-manager-6bcb8645f8-5z6xt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:07:28.072461 master-0 kubenswrapper[7642]: 
I0319 12:07:28.071881 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:07:28.135756 master-0 kubenswrapper[7642]: I0319 12:07:28.135580 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:07:28.135756 master-0 kubenswrapper[7642]: I0319 12:07:28.135705 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:07:29.068246 master-0 kubenswrapper[7642]: I0319 12:07:29.068168 7642 scope.go:117] "RemoveContainer" containerID="a8932e5c8c4dcad1c7f461d22290ccccb070a0be05cbda27e90810a35028dcec" Mar 19 12:07:29.068612 master-0 kubenswrapper[7642]: E0319 12:07:29.068417 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-64854d9cff-6dlpw_openshift-cluster-storage-operator(0c804ea0-3483-4506-b138-3bf30016d8d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" 
podUID="0c804ea0-3483-4506-b138-3bf30016d8d8" Mar 19 12:07:29.086614 master-0 kubenswrapper[7642]: I0319 12:07:29.086499 7642 patch_prober.go:28] interesting pod/route-controller-manager-6bcb8645f8-5z6xt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:07:29.086614 master-0 kubenswrapper[7642]: I0319 12:07:29.086600 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:07:34.027541 master-0 kubenswrapper[7642]: I0319 12:07:34.027480 7642 patch_prober.go:28] interesting pod/route-controller-manager-6bcb8645f8-5z6xt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:07:34.028150 master-0 kubenswrapper[7642]: I0319 12:07:34.027564 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:07:35.169124 master-0 kubenswrapper[7642]: I0319 12:07:35.169062 7642 generic.go:334] "Generic 
(PLEG): container finished" podID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerID="246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d" exitCode=0 Mar 19 12:07:35.169124 master-0 kubenswrapper[7642]: I0319 12:07:35.169110 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerDied","Data":"246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d"} Mar 19 12:07:35.173049 master-0 kubenswrapper[7642]: I0319 12:07:35.169149 7642 scope.go:117] "RemoveContainer" containerID="4d73ea74c9f238cd74823cfff5517f40d001e3ed453b6b3cd60056853897aa07" Mar 19 12:07:36.178914 master-0 kubenswrapper[7642]: I0319 12:07:36.178818 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerStarted","Data":"302da25d88feed044717793a912a25df7753b2c33b9c9627185444f708131b30"} Mar 19 12:07:36.968988 master-0 kubenswrapper[7642]: I0319 12:07:36.968911 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:07:36.976035 master-0 kubenswrapper[7642]: I0319 12:07:36.975990 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:36.976035 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:36.976035 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:36.976035 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:36.976235 master-0 kubenswrapper[7642]: I0319 12:07:36.976073 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:37.972342 master-0 kubenswrapper[7642]: I0319 12:07:37.972264 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:37.972342 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:37.972342 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:37.972342 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:37.972342 master-0 kubenswrapper[7642]: I0319 12:07:37.972340 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:38.135230 master-0 kubenswrapper[7642]: I0319 12:07:38.135096 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:07:38.135230 master-0 kubenswrapper[7642]: I0319 12:07:38.135219 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:07:38.971231 master-0 kubenswrapper[7642]: I0319 12:07:38.971189 
7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:38.971231 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:38.971231 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:38.971231 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:38.971526 master-0 kubenswrapper[7642]: I0319 12:07:38.971246 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:39.971560 master-0 kubenswrapper[7642]: I0319 12:07:39.971480 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:39.971560 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:39.971560 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:39.971560 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:39.972185 master-0 kubenswrapper[7642]: I0319 12:07:39.971590 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:40.972210 master-0 kubenswrapper[7642]: I0319 12:07:40.972100 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:40.972210 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:40.972210 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:40.972210 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:40.973388 master-0 kubenswrapper[7642]: I0319 12:07:40.972291 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:41.971347 master-0 kubenswrapper[7642]: I0319 12:07:41.971268 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:41.971347 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:41.971347 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:41.971347 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:41.971931 master-0 kubenswrapper[7642]: I0319 12:07:41.971363 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:42.072086 master-0 kubenswrapper[7642]: I0319 12:07:42.072035 7642 scope.go:117] "RemoveContainer" containerID="a8932e5c8c4dcad1c7f461d22290ccccb070a0be05cbda27e90810a35028dcec" Mar 19 12:07:42.072564 master-0 kubenswrapper[7642]: E0319 12:07:42.072297 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller 
pod=csi-snapshot-controller-64854d9cff-6dlpw_openshift-cluster-storage-operator(0c804ea0-3483-4506-b138-3bf30016d8d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" podUID="0c804ea0-3483-4506-b138-3bf30016d8d8" Mar 19 12:07:42.969393 master-0 kubenswrapper[7642]: I0319 12:07:42.969304 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:07:42.971666 master-0 kubenswrapper[7642]: I0319 12:07:42.971581 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:42.971666 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:42.971666 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:42.971666 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:42.971900 master-0 kubenswrapper[7642]: I0319 12:07:42.971660 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:43.971518 master-0 kubenswrapper[7642]: I0319 12:07:43.971450 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:43.971518 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:43.971518 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:43.971518 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:43.971518 master-0 kubenswrapper[7642]: I0319 12:07:43.971513 7642 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:44.028161 master-0 kubenswrapper[7642]: I0319 12:07:44.028085 7642 patch_prober.go:28] interesting pod/route-controller-manager-6bcb8645f8-5z6xt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 12:07:44.028411 master-0 kubenswrapper[7642]: I0319 12:07:44.028167 7642 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.61:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 12:07:44.971805 master-0 kubenswrapper[7642]: I0319 12:07:44.971716 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:44.971805 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:44.971805 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:44.971805 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:44.972366 master-0 kubenswrapper[7642]: I0319 12:07:44.971851 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 19 12:07:45.972063 master-0 kubenswrapper[7642]: I0319 12:07:45.971982 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:45.972063 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:45.972063 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:45.972063 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:45.972063 master-0 kubenswrapper[7642]: I0319 12:07:45.972060 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:46.971992 master-0 kubenswrapper[7642]: I0319 12:07:46.971931 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:46.971992 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:46.971992 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:46.971992 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:46.973495 master-0 kubenswrapper[7642]: I0319 12:07:46.972006 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:47.972763 master-0 kubenswrapper[7642]: I0319 12:07:47.972605 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:47.972763 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:47.972763 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:47.972763 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:47.974053 master-0 kubenswrapper[7642]: I0319 12:07:47.972768 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:48.136862 master-0 kubenswrapper[7642]: I0319 12:07:48.136713 7642 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": context deadline exceeded" start-of-body= Mar 19 12:07:48.136862 master-0 kubenswrapper[7642]: I0319 12:07:48.136842 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": context deadline exceeded" Mar 19 12:07:48.137439 master-0 kubenswrapper[7642]: I0319 12:07:48.136956 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:07:48.138405 master-0 kubenswrapper[7642]: I0319 12:07:48.138342 7642 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9"} 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 12:07:48.138537 master-0 kubenswrapper[7642]: I0319 12:07:48.138506 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" containerID="cri-o://f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" gracePeriod=30 Mar 19 12:07:48.971165 master-0 kubenswrapper[7642]: I0319 12:07:48.971078 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:48.971165 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:48.971165 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:48.971165 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:07:48.971801 master-0 kubenswrapper[7642]: I0319 12:07:48.971180 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:07:49.973205 master-0 kubenswrapper[7642]: I0319 12:07:49.973087 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:07:49.973205 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:07:49.973205 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:07:49.973205 master-0 
kubenswrapper[7642]: healthz check failed
Mar 19 12:07:49.974396 master-0 kubenswrapper[7642]: I0319 12:07:49.973239 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:07:50.972308 master-0 kubenswrapper[7642]: I0319 12:07:50.972243 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:07:50.972308 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:07:50.972308 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:07:50.972308 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:07:50.972819 master-0 kubenswrapper[7642]: I0319 12:07:50.972332 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:07:51.971183 master-0 kubenswrapper[7642]: I0319 12:07:51.971121 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:07:51.971183 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:07:51.971183 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:07:51.971183 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:07:51.971724 master-0 kubenswrapper[7642]: I0319 12:07:51.971213 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:07:52.305603 master-0 kubenswrapper[7642]: I0319 12:07:52.305497 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log"
Mar 19 12:07:52.306109 master-0 kubenswrapper[7642]: I0319 12:07:52.306086 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/3.log"
Mar 19 12:07:52.307459 master-0 kubenswrapper[7642]: I0319 12:07:52.307446 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log"
Mar 19 12:07:52.307565 master-0 kubenswrapper[7642]: I0319 12:07:52.307545 7642 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" exitCode=255
Mar 19 12:07:52.307694 master-0 kubenswrapper[7642]: I0319 12:07:52.307676 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9"}
Mar 19 12:07:52.307798 master-0 kubenswrapper[7642]: I0319 12:07:52.307786 7642 scope.go:117] "RemoveContainer" containerID="0f62614b26a16c18e72cd40ca867f5b2b06dc30cf1a1598301a129de62de79bc"
Mar 19 12:07:52.309915 master-0 kubenswrapper[7642]: I0319 12:07:52.309901 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-b865698dc-czzvw_3f3c0234-248a-4d6f-9aca-6582a481a911/service-ca-operator/3.log"
Mar 19 12:07:52.310340 master-0 kubenswrapper[7642]: I0319 12:07:52.310328 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-b865698dc-czzvw_3f3c0234-248a-4d6f-9aca-6582a481a911/service-ca-operator/2.log"
Mar 19 12:07:52.310426 master-0 kubenswrapper[7642]: I0319 12:07:52.310412 7642 generic.go:334] "Generic (PLEG): container finished" podID="3f3c0234-248a-4d6f-9aca-6582a481a911" containerID="9ec4a3ca5402b255a5776345ac4dec708163999648880ceab54090ff9aba763c" exitCode=255
Mar 19 12:07:52.310496 master-0 kubenswrapper[7642]: I0319 12:07:52.310484 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerDied","Data":"9ec4a3ca5402b255a5776345ac4dec708163999648880ceab54090ff9aba763c"}
Mar 19 12:07:52.310984 master-0 kubenswrapper[7642]: I0319 12:07:52.310970 7642 scope.go:117] "RemoveContainer" containerID="9ec4a3ca5402b255a5776345ac4dec708163999648880ceab54090ff9aba763c"
Mar 19 12:07:52.311231 master-0 kubenswrapper[7642]: E0319 12:07:52.311215 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=service-ca-operator pod=service-ca-operator-b865698dc-czzvw_openshift-service-ca-operator(3f3c0234-248a-4d6f-9aca-6582a481a911)\"" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" podUID="3f3c0234-248a-4d6f-9aca-6582a481a911"
Mar 19 12:07:52.331518 master-0 kubenswrapper[7642]: I0319 12:07:52.331398 7642 scope.go:117] "RemoveContainer" containerID="081b3f4684461729e6f698c3195bf7ea8483b6807bf70589a3fc1e634fdb1c16"
Mar 19 12:07:52.465974 master-0 kubenswrapper[7642]: E0319 12:07:52.465912 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:07:52.971724 master-0 kubenswrapper[7642]: I0319 12:07:52.971603 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:07:52.971724 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:07:52.971724 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:07:52.971724 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:07:52.972783 master-0 kubenswrapper[7642]: I0319 12:07:52.971748 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:07:53.034274 master-0 kubenswrapper[7642]: I0319 12:07:53.034202 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 12:07:53.323131 master-0 kubenswrapper[7642]: I0319 12:07:53.322986 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-b865698dc-czzvw_3f3c0234-248a-4d6f-9aca-6582a481a911/service-ca-operator/3.log"
Mar 19 12:07:53.326804 master-0 kubenswrapper[7642]: I0319 12:07:53.326749 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log"
Mar 19 12:07:53.329815 master-0 kubenswrapper[7642]: I0319 12:07:53.329782 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log"
Mar 19 12:07:53.330516 master-0 kubenswrapper[7642]: I0319 12:07:53.330477 7642 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9"
Mar 19 12:07:53.330890 master-0 kubenswrapper[7642]: E0319 12:07:53.330835 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:07:53.972751 master-0 kubenswrapper[7642]: I0319 12:07:53.972601 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:07:53.972751 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:07:53.972751 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:07:53.972751 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:07:53.973756 master-0 kubenswrapper[7642]: I0319 12:07:53.972788 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:07:54.971546 master-0 kubenswrapper[7642]: I0319 12:07:54.971485 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:07:54.971546 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:07:54.971546 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:07:54.971546 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:07:54.972004 master-0 kubenswrapper[7642]: I0319 12:07:54.971559 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:07:55.135023 master-0 kubenswrapper[7642]: I0319 12:07:55.134949 7642 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:07:55.136181 master-0 kubenswrapper[7642]: I0319 12:07:55.136156 7642 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9"
Mar 19 12:07:55.136519 master-0 kubenswrapper[7642]: E0319 12:07:55.136490 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:07:55.972213 master-0 kubenswrapper[7642]: I0319 12:07:55.972067 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:07:55.972213 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:07:55.972213 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:07:55.972213 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:07:55.972604 master-0 kubenswrapper[7642]: I0319 12:07:55.972198 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:07:56.068297 master-0 kubenswrapper[7642]: I0319 12:07:56.068233 7642 scope.go:117] "RemoveContainer" containerID="a8932e5c8c4dcad1c7f461d22290ccccb070a0be05cbda27e90810a35028dcec"
Mar 19 12:07:56.355490 master-0 kubenswrapper[7642]: I0319 12:07:56.355332 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/3.log"
Mar 19 12:07:56.355490 master-0 kubenswrapper[7642]: I0319 12:07:56.355411 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerStarted","Data":"e07058ba721733098fb4279e6e48522183a341fb7e3449f5a7693aab50f42f9f"}
Mar 19 12:07:56.971948 master-0 kubenswrapper[7642]: I0319 12:07:56.971864 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:07:56.971948 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:07:56.971948 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:07:56.971948 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:07:56.972311 master-0 kubenswrapper[7642]: I0319 12:07:56.971975 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:07:57.973024 master-0 kubenswrapper[7642]: I0319 12:07:57.972911 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:07:57.973024 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:07:57.973024 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:07:57.973024 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:07:57.974085 master-0 kubenswrapper[7642]: I0319 12:07:57.973014 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:07:58.972018 master-0 kubenswrapper[7642]: I0319 12:07:58.971920 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:07:58.972018 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:07:58.972018 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:07:58.972018 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:07:58.972594 master-0 kubenswrapper[7642]: I0319 12:07:58.972060 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:07:59.971442 master-0 kubenswrapper[7642]: I0319 12:07:59.971361 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:07:59.971442 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:07:59.971442 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:07:59.971442 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:07:59.972075 master-0 kubenswrapper[7642]: I0319 12:07:59.971469 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:00.972147 master-0 kubenswrapper[7642]: I0319 12:08:00.972070 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:00.972147 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:00.972147 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:00.972147 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:00.972787 master-0 kubenswrapper[7642]: I0319 12:08:00.972161 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:01.970879 master-0 kubenswrapper[7642]: I0319 12:08:01.970819 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:01.970879 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:01.970879 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:01.970879 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:01.971275 master-0 kubenswrapper[7642]: I0319 12:08:01.970880 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:02.971913 master-0 kubenswrapper[7642]: I0319 12:08:02.971849 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:02.971913 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:02.971913 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:02.971913 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:02.972471 master-0 kubenswrapper[7642]: I0319 12:08:02.971917 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:03.067595 master-0 kubenswrapper[7642]: I0319 12:08:03.067553 7642 scope.go:117] "RemoveContainer" containerID="9ec4a3ca5402b255a5776345ac4dec708163999648880ceab54090ff9aba763c"
Mar 19 12:08:03.067810 master-0 kubenswrapper[7642]: E0319 12:08:03.067781 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=service-ca-operator pod=service-ca-operator-b865698dc-czzvw_openshift-service-ca-operator(3f3c0234-248a-4d6f-9aca-6582a481a911)\"" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" podUID="3f3c0234-248a-4d6f-9aca-6582a481a911"
Mar 19 12:08:03.826194 master-0 kubenswrapper[7642]: I0319 12:08:03.826109 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"]
Mar 19 12:08:03.826507 master-0 kubenswrapper[7642]: I0319 12:08:03.826470 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="telemeter-client" containerID="cri-o://55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8" gracePeriod=30
Mar 19 12:08:03.826565 master-0 kubenswrapper[7642]: I0319 12:08:03.826530 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="kube-rbac-proxy" containerID="cri-o://672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc" gracePeriod=30
Mar 19 12:08:03.826783 master-0 kubenswrapper[7642]: I0319 12:08:03.826625 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="reload" containerID="cri-o://e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788" gracePeriod=30
Mar 19 12:08:03.970932 master-0 kubenswrapper[7642]: I0319 12:08:03.970574 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:03.970932 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:03.970932 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:03.970932 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:03.970932 master-0 kubenswrapper[7642]: I0319 12:08:03.970628 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:04.276488 master-0 kubenswrapper[7642]: I0319 12:08:04.276429 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b685db9f8-rjdjl_596a0930-a082-4374-acc9-b722c4970443/telemeter-client/0.log"
Mar 19 12:08:04.277521 master-0 kubenswrapper[7642]: I0319 12:08:04.276523 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 12:08:04.387875 master-0 kubenswrapper[7642]: I0319 12:08:04.387828 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-telemeter-trusted-ca-bundle\") pod \"596a0930-a082-4374-acc9-b722c4970443\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") "
Mar 19 12:08:04.388098 master-0 kubenswrapper[7642]: I0319 12:08:04.387932 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client-kube-rbac-proxy-config\") pod \"596a0930-a082-4374-acc9-b722c4970443\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") "
Mar 19 12:08:04.388098 master-0 kubenswrapper[7642]: I0319 12:08:04.387982 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-federate-client-tls\") pod \"596a0930-a082-4374-acc9-b722c4970443\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") "
Mar 19 12:08:04.388098 master-0 kubenswrapper[7642]: I0319 12:08:04.388003 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-telemeter-client-tls\") pod \"596a0930-a082-4374-acc9-b722c4970443\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") "
Mar 19 12:08:04.388098 master-0 kubenswrapper[7642]: I0319 12:08:04.388019 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-metrics-client-ca\") pod \"596a0930-a082-4374-acc9-b722c4970443\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") "
Mar 19 12:08:04.388305 master-0 kubenswrapper[7642]: I0319 12:08:04.388140 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-serving-certs-ca-bundle\") pod \"596a0930-a082-4374-acc9-b722c4970443\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") "
Mar 19 12:08:04.388305 master-0 kubenswrapper[7642]: I0319 12:08:04.388163 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client\") pod \"596a0930-a082-4374-acc9-b722c4970443\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") "
Mar 19 12:08:04.388775 master-0 kubenswrapper[7642]: I0319 12:08:04.388725 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-serving-certs-ca-bundle" (OuterVolumeSpecName: "serving-certs-ca-bundle") pod "596a0930-a082-4374-acc9-b722c4970443" (UID: "596a0930-a082-4374-acc9-b722c4970443"). InnerVolumeSpecName "serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:08:04.389112 master-0 kubenswrapper[7642]: I0319 12:08:04.389068 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "596a0930-a082-4374-acc9-b722c4970443" (UID: "596a0930-a082-4374-acc9-b722c4970443"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:08:04.389335 master-0 kubenswrapper[7642]: I0319 12:08:04.389301 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwx2w\" (UniqueName: \"kubernetes.io/projected/596a0930-a082-4374-acc9-b722c4970443-kube-api-access-xwx2w\") pod \"596a0930-a082-4374-acc9-b722c4970443\" (UID: \"596a0930-a082-4374-acc9-b722c4970443\") "
Mar 19 12:08:04.389578 master-0 kubenswrapper[7642]: I0319 12:08:04.389499 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-telemeter-trusted-ca-bundle" (OuterVolumeSpecName: "telemeter-trusted-ca-bundle") pod "596a0930-a082-4374-acc9-b722c4970443" (UID: "596a0930-a082-4374-acc9-b722c4970443"). InnerVolumeSpecName "telemeter-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:08:04.389900 master-0 kubenswrapper[7642]: I0319 12:08:04.389864 7642 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-metrics-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 12:08:04.389964 master-0 kubenswrapper[7642]: I0319 12:08:04.389919 7642 reconciler_common.go:293] "Volume detached for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:08:04.389964 master-0 kubenswrapper[7642]: I0319 12:08:04.389934 7642 reconciler_common.go:293] "Volume detached for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/596a0930-a082-4374-acc9-b722c4970443-telemeter-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:08:04.391474 master-0 kubenswrapper[7642]: I0319 12:08:04.391408 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-federate-client-tls" (OuterVolumeSpecName: "federate-client-tls") pod "596a0930-a082-4374-acc9-b722c4970443" (UID: "596a0930-a082-4374-acc9-b722c4970443"). InnerVolumeSpecName "federate-client-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:08:04.391628 master-0 kubenswrapper[7642]: I0319 12:08:04.391585 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client-kube-rbac-proxy-config" (OuterVolumeSpecName: "secret-telemeter-client-kube-rbac-proxy-config") pod "596a0930-a082-4374-acc9-b722c4970443" (UID: "596a0930-a082-4374-acc9-b722c4970443"). InnerVolumeSpecName "secret-telemeter-client-kube-rbac-proxy-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:08:04.392107 master-0 kubenswrapper[7642]: I0319 12:08:04.392026 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client" (OuterVolumeSpecName: "secret-telemeter-client") pod "596a0930-a082-4374-acc9-b722c4970443" (UID: "596a0930-a082-4374-acc9-b722c4970443"). InnerVolumeSpecName "secret-telemeter-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:08:04.392761 master-0 kubenswrapper[7642]: I0319 12:08:04.392717 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596a0930-a082-4374-acc9-b722c4970443-kube-api-access-xwx2w" (OuterVolumeSpecName: "kube-api-access-xwx2w") pod "596a0930-a082-4374-acc9-b722c4970443" (UID: "596a0930-a082-4374-acc9-b722c4970443"). InnerVolumeSpecName "kube-api-access-xwx2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:08:04.393229 master-0 kubenswrapper[7642]: I0319 12:08:04.393083 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-telemeter-client-tls" (OuterVolumeSpecName: "telemeter-client-tls") pod "596a0930-a082-4374-acc9-b722c4970443" (UID: "596a0930-a082-4374-acc9-b722c4970443"). InnerVolumeSpecName "telemeter-client-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:08:04.420084 master-0 kubenswrapper[7642]: I0319 12:08:04.420036 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b685db9f8-rjdjl_596a0930-a082-4374-acc9-b722c4970443/telemeter-client/0.log"
Mar 19 12:08:04.420084 master-0 kubenswrapper[7642]: I0319 12:08:04.420099 7642 generic.go:334] "Generic (PLEG): container finished" podID="596a0930-a082-4374-acc9-b722c4970443" containerID="672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc" exitCode=0
Mar 19 12:08:04.420084 master-0 kubenswrapper[7642]: I0319 12:08:04.420117 7642 generic.go:334] "Generic (PLEG): container finished" podID="596a0930-a082-4374-acc9-b722c4970443" containerID="e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788" exitCode=0
Mar 19 12:08:04.420084 master-0 kubenswrapper[7642]: I0319 12:08:04.420125 7642 generic.go:334] "Generic (PLEG): container finished" podID="596a0930-a082-4374-acc9-b722c4970443" containerID="55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8" exitCode=2
Mar 19 12:08:04.420497 master-0 kubenswrapper[7642]: I0319 12:08:04.420146 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" event={"ID":"596a0930-a082-4374-acc9-b722c4970443","Type":"ContainerDied","Data":"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc"}
Mar 19 12:08:04.420497 master-0 kubenswrapper[7642]: I0319 12:08:04.420173 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" event={"ID":"596a0930-a082-4374-acc9-b722c4970443","Type":"ContainerDied","Data":"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788"}
Mar 19 12:08:04.420497 master-0 kubenswrapper[7642]: I0319 12:08:04.420183 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" event={"ID":"596a0930-a082-4374-acc9-b722c4970443","Type":"ContainerDied","Data":"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8"}
Mar 19 12:08:04.420497 master-0 kubenswrapper[7642]: I0319 12:08:04.420192 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl" event={"ID":"596a0930-a082-4374-acc9-b722c4970443","Type":"ContainerDied","Data":"504567c8e122d9fd1893a9f58d1661f6e869dcbb3088f4af75c6080ae6904a38"}
Mar 19 12:08:04.420497 master-0 kubenswrapper[7642]: I0319 12:08:04.420209 7642 scope.go:117] "RemoveContainer" containerID="672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc"
Mar 19 12:08:04.420497 master-0 kubenswrapper[7642]: I0319 12:08:04.420215 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"
Mar 19 12:08:04.435590 master-0 kubenswrapper[7642]: I0319 12:08:04.435081 7642 scope.go:117] "RemoveContainer" containerID="e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788"
Mar 19 12:08:04.449943 master-0 kubenswrapper[7642]: I0319 12:08:04.449746 7642 scope.go:117] "RemoveContainer" containerID="55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8"
Mar 19 12:08:04.469805 master-0 kubenswrapper[7642]: I0319 12:08:04.464873 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"]
Mar 19 12:08:04.469805 master-0 kubenswrapper[7642]: I0319 12:08:04.468235 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/telemeter-client-5b685db9f8-rjdjl"]
Mar 19 12:08:04.479979 master-0 kubenswrapper[7642]: I0319 12:08:04.479852 7642 scope.go:117] "RemoveContainer" containerID="672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc"
Mar 19 12:08:04.480511 master-0 kubenswrapper[7642]: E0319 12:08:04.480421 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc\": container with ID starting with 672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc not found: ID does not exist" containerID="672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc"
Mar 19 12:08:04.480592 master-0 kubenswrapper[7642]: I0319 12:08:04.480508 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc"} err="failed to get container status \"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc\": rpc error: code = NotFound desc = could not find container \"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc\": container with ID starting with 672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc not found: ID does not exist"
Mar 19 12:08:04.480592 master-0 kubenswrapper[7642]: I0319 12:08:04.480538 7642 scope.go:117] "RemoveContainer" containerID="e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788"
Mar 19 12:08:04.480895 master-0 kubenswrapper[7642]: E0319 12:08:04.480853 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788\": container with ID starting with e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788 not found: ID does not exist" containerID="e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788"
Mar 19 12:08:04.480895 master-0 kubenswrapper[7642]: I0319 12:08:04.480881 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788"} err="failed to get container status \"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788\": rpc error: code = NotFound desc = could not find container \"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788\": container with ID starting with e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788 not found: ID does not exist"
Mar 19 12:08:04.480895 master-0 kubenswrapper[7642]: I0319 12:08:04.480895 7642 scope.go:117] "RemoveContainer" containerID="55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8"
Mar 19 12:08:04.481254 master-0 kubenswrapper[7642]: E0319 12:08:04.481198 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8\": container with ID starting with 55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8 not found: ID does not exist" containerID="55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8"
Mar 19 12:08:04.481254 master-0 kubenswrapper[7642]: I0319 12:08:04.481212 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8"} err="failed to get container status \"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8\": rpc error: code = NotFound desc = could not find container \"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8\": container with ID starting with 55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8 not found: ID does not exist"
Mar 19 12:08:04.481254 master-0 kubenswrapper[7642]: I0319 12:08:04.481224 7642 scope.go:117] "RemoveContainer" containerID="672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc"
Mar 19 12:08:04.481881 master-0 kubenswrapper[7642]: I0319 12:08:04.481709 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc"} err="failed to get container status \"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc\": rpc error: code = NotFound desc = could not find container \"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc\": container with ID starting with 672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc not found: ID does not exist"
Mar 19 12:08:04.481881 master-0 kubenswrapper[7642]: I0319 12:08:04.481872 7642 scope.go:117] "RemoveContainer" containerID="e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788"
Mar 19 12:08:04.482453 master-0 kubenswrapper[7642]: I0319 12:08:04.482384 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788"} err="failed to get container status \"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788\": rpc error: code = NotFound desc = could not find container \"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788\": container with ID starting with e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788 not found: ID does not exist"
Mar 19 12:08:04.482453 master-0 kubenswrapper[7642]: I0319 12:08:04.482446 7642 scope.go:117] "RemoveContainer" containerID="55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8"
Mar 19 12:08:04.482855 master-0 kubenswrapper[7642]: I0319 12:08:04.482811 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8"} err="failed to get container status \"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8\": rpc error: code = NotFound desc = could not find container \"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8\": container with ID starting with 55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8 not found: ID does not exist"
Mar 19 12:08:04.482855 master-0 kubenswrapper[7642]: I0319 12:08:04.482841 7642 scope.go:117] "RemoveContainer" containerID="672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc"
Mar 19 12:08:04.483183 master-0 kubenswrapper[7642]: I0319 12:08:04.483139 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc"} err="failed to get container status \"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc\": rpc error: code = NotFound desc = could not find container \"672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc\": container with ID starting with 672638eb54d533e740bade8061315452eedf219fb7a765bd6d04c5542cc09ccc not found: ID does not exist"
Mar 19 12:08:04.483183 master-0 kubenswrapper[7642]: I0319 12:08:04.483169 7642
scope.go:117] "RemoveContainer" containerID="e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788" Mar 19 12:08:04.483609 master-0 kubenswrapper[7642]: I0319 12:08:04.483511 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788"} err="failed to get container status \"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788\": rpc error: code = NotFound desc = could not find container \"e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788\": container with ID starting with e8a13331db7b9e4080e26b500a06ae41e9ac4dd9e727e7be9e0c03020821f788 not found: ID does not exist" Mar 19 12:08:04.483609 master-0 kubenswrapper[7642]: I0319 12:08:04.483542 7642 scope.go:117] "RemoveContainer" containerID="55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8" Mar 19 12:08:04.483985 master-0 kubenswrapper[7642]: I0319 12:08:04.483947 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8"} err="failed to get container status \"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8\": rpc error: code = NotFound desc = could not find container \"55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8\": container with ID starting with 55eb3e757635d0dcc7da03d58b984eb8d37cf03aa809929c2e05deb0eb7d5ed8 not found: ID does not exist" Mar 19 12:08:04.494670 master-0 kubenswrapper[7642]: I0319 12:08:04.494599 7642 reconciler_common.go:293] "Volume detached for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client-kube-rbac-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:04.494990 master-0 kubenswrapper[7642]: I0319 12:08:04.494973 7642 reconciler_common.go:293] "Volume detached for volume 
\"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-federate-client-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:04.495065 master-0 kubenswrapper[7642]: I0319 12:08:04.495055 7642 reconciler_common.go:293] "Volume detached for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-telemeter-client-tls\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:04.495123 master-0 kubenswrapper[7642]: I0319 12:08:04.495113 7642 reconciler_common.go:293] "Volume detached for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/596a0930-a082-4374-acc9-b722c4970443-secret-telemeter-client\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:04.495188 master-0 kubenswrapper[7642]: I0319 12:08:04.495178 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwx2w\" (UniqueName: \"kubernetes.io/projected/596a0930-a082-4374-acc9-b722c4970443-kube-api-access-xwx2w\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:04.971117 master-0 kubenswrapper[7642]: I0319 12:08:04.970974 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:04.971117 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:04.971117 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:04.971117 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:04.971466 master-0 kubenswrapper[7642]: I0319 12:08:04.971145 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:05.972833 master-0 kubenswrapper[7642]: I0319 
12:08:05.972788 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:05.972833 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:05.972833 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:05.972833 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:05.973665 master-0 kubenswrapper[7642]: I0319 12:08:05.972854 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:06.075223 master-0 kubenswrapper[7642]: I0319 12:08:06.075162 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596a0930-a082-4374-acc9-b722c4970443" path="/var/lib/kubelet/pods/596a0930-a082-4374-acc9-b722c4970443/volumes" Mar 19 12:08:06.973700 master-0 kubenswrapper[7642]: I0319 12:08:06.973511 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:06.973700 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:06.973700 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:06.973700 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:06.973700 master-0 kubenswrapper[7642]: I0319 12:08:06.973625 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 
12:08:07.067957 master-0 kubenswrapper[7642]: I0319 12:08:07.067898 7642 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" Mar 19 12:08:07.068176 master-0 kubenswrapper[7642]: E0319 12:08:07.068152 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" Mar 19 12:08:07.971925 master-0 kubenswrapper[7642]: I0319 12:08:07.971835 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:07.971925 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:07.971925 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:07.971925 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:07.972356 master-0 kubenswrapper[7642]: I0319 12:08:07.971962 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:08.972114 master-0 kubenswrapper[7642]: I0319 12:08:08.972038 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:08.972114 master-0 kubenswrapper[7642]: [-]has-synced failed: reason 
withheld Mar 19 12:08:08.972114 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:08.972114 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:08.972884 master-0 kubenswrapper[7642]: I0319 12:08:08.972143 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:09.972577 master-0 kubenswrapper[7642]: I0319 12:08:09.972511 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:09.972577 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:09.972577 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:09.972577 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:09.973210 master-0 kubenswrapper[7642]: I0319 12:08:09.972600 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:10.971269 master-0 kubenswrapper[7642]: I0319 12:08:10.971196 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:10.971269 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:10.971269 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:10.971269 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:10.971269 master-0 kubenswrapper[7642]: I0319 
12:08:10.971259 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:11.970560 master-0 kubenswrapper[7642]: I0319 12:08:11.970510 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:11.970560 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:11.970560 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:11.970560 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:11.971135 master-0 kubenswrapper[7642]: I0319 12:08:11.970568 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:12.971463 master-0 kubenswrapper[7642]: I0319 12:08:12.971400 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:12.971463 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:12.971463 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:12.971463 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:12.972310 master-0 kubenswrapper[7642]: I0319 12:08:12.971478 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 19 12:08:13.971247 master-0 kubenswrapper[7642]: I0319 12:08:13.971161 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:13.971247 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:13.971247 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:13.971247 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:13.971936 master-0 kubenswrapper[7642]: I0319 12:08:13.971290 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:14.972842 master-0 kubenswrapper[7642]: I0319 12:08:14.972761 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:14.972842 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:14.972842 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:14.972842 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:14.973835 master-0 kubenswrapper[7642]: I0319 12:08:14.972874 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:15.068167 master-0 kubenswrapper[7642]: I0319 12:08:15.068095 7642 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" 
podUID="f633004a-5646-432b-8a60-562c132bc677" Mar 19 12:08:15.068167 master-0 kubenswrapper[7642]: I0319 12:08:15.068144 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677" Mar 19 12:08:15.093423 master-0 kubenswrapper[7642]: I0319 12:08:15.093296 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 12:08:15.094238 master-0 kubenswrapper[7642]: I0319 12:08:15.094183 7642 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Mar 19 12:08:15.110752 master-0 kubenswrapper[7642]: I0319 12:08:15.110679 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 12:08:15.137744 master-0 kubenswrapper[7642]: I0319 12:08:15.129533 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 19 12:08:15.506765 master-0 kubenswrapper[7642]: I0319 12:08:15.506619 7642 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677" Mar 19 12:08:15.506765 master-0 kubenswrapper[7642]: I0319 12:08:15.506717 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="f633004a-5646-432b-8a60-562c132bc677" Mar 19 12:08:15.971841 master-0 kubenswrapper[7642]: I0319 12:08:15.971764 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:15.971841 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:15.971841 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:15.971841 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:15.972246 master-0 kubenswrapper[7642]: I0319 12:08:15.971858 7642 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:16.972426 master-0 kubenswrapper[7642]: I0319 12:08:16.972362 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:16.972426 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:16.972426 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:16.972426 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:16.973434 master-0 kubenswrapper[7642]: I0319 12:08:16.972434 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:17.068946 master-0 kubenswrapper[7642]: I0319 12:08:17.068833 7642 scope.go:117] "RemoveContainer" containerID="9ec4a3ca5402b255a5776345ac4dec708163999648880ceab54090ff9aba763c" Mar 19 12:08:17.523090 master-0 kubenswrapper[7642]: I0319 12:08:17.522980 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-b865698dc-czzvw_3f3c0234-248a-4d6f-9aca-6582a481a911/service-ca-operator/3.log" Mar 19 12:08:17.523090 master-0 kubenswrapper[7642]: I0319 12:08:17.523043 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerStarted","Data":"02b3249f9637d6e6358afd5f27698b229ac5de189524a1cbe5bec5cb0b7e2803"} Mar 19 12:08:17.604659 master-0 kubenswrapper[7642]: I0319 
12:08:17.598953 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=2.5989255030000002 podStartE2EDuration="2.598925503s" podCreationTimestamp="2026-03-19 12:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:08:17.588260998 +0000 UTC m=+855.705701845" watchObservedRunningTime="2026-03-19 12:08:17.598925503 +0000 UTC m=+855.716366320" Mar 19 12:08:17.974835 master-0 kubenswrapper[7642]: I0319 12:08:17.974748 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:17.974835 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:17.974835 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:17.974835 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:17.975891 master-0 kubenswrapper[7642]: I0319 12:08:17.974844 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:18.067932 master-0 kubenswrapper[7642]: I0319 12:08:18.067848 7642 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" Mar 19 12:08:18.068217 master-0 kubenswrapper[7642]: E0319 12:08:18.068187 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" Mar 19 12:08:18.970634 master-0 kubenswrapper[7642]: I0319 12:08:18.970551 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:18.970634 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:18.970634 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:18.970634 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:18.971076 master-0 kubenswrapper[7642]: I0319 12:08:18.970654 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:19.972205 master-0 kubenswrapper[7642]: I0319 12:08:19.972123 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:19.972205 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:19.972205 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:19.972205 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:19.972890 master-0 kubenswrapper[7642]: I0319 12:08:19.972234 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:20.972020 master-0 kubenswrapper[7642]: I0319 12:08:20.971940 7642 patch_prober.go:28] 
interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:20.972020 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:20.972020 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:20.972020 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:20.973097 master-0 kubenswrapper[7642]: I0319 12:08:20.972046 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:21.972067 master-0 kubenswrapper[7642]: I0319 12:08:21.972009 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:21.972067 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:21.972067 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:21.972067 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:21.972811 master-0 kubenswrapper[7642]: I0319 12:08:21.972082 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:22.971137 master-0 kubenswrapper[7642]: I0319 12:08:22.971071 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Mar 19 12:08:22.971137 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:22.971137 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:22.971137 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:22.971435 master-0 kubenswrapper[7642]: I0319 12:08:22.971150 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:23.971645 master-0 kubenswrapper[7642]: I0319 12:08:23.971573 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:23.971645 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:23.971645 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:23.971645 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:23.972275 master-0 kubenswrapper[7642]: I0319 12:08:23.971649 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:24.971664 master-0 kubenswrapper[7642]: I0319 12:08:24.971547 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:24.971664 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:24.971664 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:24.971664 master-0 
kubenswrapper[7642]: healthz check failed
Mar 19 12:08:24.972438 master-0 kubenswrapper[7642]: I0319 12:08:24.971729 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:25.971779 master-0 kubenswrapper[7642]: I0319 12:08:25.971717 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:25.971779 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:25.971779 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:25.971779 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:25.972374 master-0 kubenswrapper[7642]: I0319 12:08:25.971809 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:26.971471 master-0 kubenswrapper[7642]: I0319 12:08:26.971424 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:26.971471 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:26.971471 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:26.971471 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:26.971827 master-0 kubenswrapper[7642]: I0319 12:08:26.971481 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:27.238849 master-0 kubenswrapper[7642]: I0319 12:08:27.238712 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Mar 19 12:08:27.239118 master-0 kubenswrapper[7642]: E0319 12:08:27.239073 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="kube-rbac-proxy"
Mar 19 12:08:27.239118 master-0 kubenswrapper[7642]: I0319 12:08:27.239100 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="kube-rbac-proxy"
Mar 19 12:08:27.239224 master-0 kubenswrapper[7642]: E0319 12:08:27.239130 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d258c9-f8b3-4233-847f-bc594a2aa4f9" containerName="installer"
Mar 19 12:08:27.239224 master-0 kubenswrapper[7642]: I0319 12:08:27.239145 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d258c9-f8b3-4233-847f-bc594a2aa4f9" containerName="installer"
Mar 19 12:08:27.239224 master-0 kubenswrapper[7642]: E0319 12:08:27.239178 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="reload"
Mar 19 12:08:27.239224 master-0 kubenswrapper[7642]: I0319 12:08:27.239191 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="reload"
Mar 19 12:08:27.239400 master-0 kubenswrapper[7642]: E0319 12:08:27.239237 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="telemeter-client"
Mar 19 12:08:27.239400 master-0 kubenswrapper[7642]: I0319 12:08:27.239255 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="telemeter-client"
Mar 19 12:08:27.239491 master-0 kubenswrapper[7642]: I0319 12:08:27.239467 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="telemeter-client"
Mar 19 12:08:27.239542 master-0 kubenswrapper[7642]: I0319 12:08:27.239503 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d258c9-f8b3-4233-847f-bc594a2aa4f9" containerName="installer"
Mar 19 12:08:27.239542 master-0 kubenswrapper[7642]: I0319 12:08:27.239523 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="reload"
Mar 19 12:08:27.239635 master-0 kubenswrapper[7642]: I0319 12:08:27.239543 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="596a0930-a082-4374-acc9-b722c4970443" containerName="kube-rbac-proxy"
Mar 19 12:08:27.240462 master-0 kubenswrapper[7642]: I0319 12:08:27.240414 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.242721 master-0 kubenswrapper[7642]: I0319 12:08:27.242679 7642 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-hzxzv"
Mar 19 12:08:27.243487 master-0 kubenswrapper[7642]: I0319 12:08:27.243447 7642 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 19 12:08:27.248298 master-0 kubenswrapper[7642]: I0319 12:08:27.248199 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Mar 19 12:08:27.311590 master-0 kubenswrapper[7642]: I0319 12:08:27.311462 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db92adf4-f1af-4835-a75c-5e8f63f88883-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.311934 master-0 kubenswrapper[7642]: I0319 12:08:27.311731 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.311934 master-0 kubenswrapper[7642]: I0319 12:08:27.311802 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.413276 master-0 kubenswrapper[7642]: I0319 12:08:27.413192 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db92adf4-f1af-4835-a75c-5e8f63f88883-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.413523 master-0 kubenswrapper[7642]: I0319 12:08:27.413298 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.413523 master-0 kubenswrapper[7642]: I0319 12:08:27.413330 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.413592 master-0 kubenswrapper[7642]: I0319 12:08:27.413523 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.413703 master-0 kubenswrapper[7642]: I0319 12:08:27.413625 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.432252 master-0 kubenswrapper[7642]: I0319 12:08:27.432193 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db92adf4-f1af-4835-a75c-5e8f63f88883-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.579776 master-0 kubenswrapper[7642]: I0319 12:08:27.579585 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 19 12:08:27.971550 master-0 kubenswrapper[7642]: I0319 12:08:27.971421 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:27.971550 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:27.971550 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:27.971550 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:27.971550 master-0 kubenswrapper[7642]: I0319 12:08:27.971530 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:27.981962 master-0 kubenswrapper[7642]: I0319 12:08:27.981827 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Mar 19 12:08:27.993274 master-0 kubenswrapper[7642]: W0319 12:08:27.993200 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddb92adf4_f1af_4835_a75c_5e8f63f88883.slice/crio-a7846a09fe07bc01645f478e3c80661b723e32e9f6ed0e704679f39070079ae5 WatchSource:0}: Error finding container a7846a09fe07bc01645f478e3c80661b723e32e9f6ed0e704679f39070079ae5: Status 404 returned error can't find the container with id a7846a09fe07bc01645f478e3c80661b723e32e9f6ed0e704679f39070079ae5
Mar 19 12:08:28.601215 master-0 kubenswrapper[7642]: I0319 12:08:28.601081 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"db92adf4-f1af-4835-a75c-5e8f63f88883","Type":"ContainerStarted","Data":"87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb"}
Mar 19 12:08:28.601215 master-0 kubenswrapper[7642]: I0319 12:08:28.601133 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"db92adf4-f1af-4835-a75c-5e8f63f88883","Type":"ContainerStarted","Data":"a7846a09fe07bc01645f478e3c80661b723e32e9f6ed0e704679f39070079ae5"}
Mar 19 12:08:28.623743 master-0 kubenswrapper[7642]: I0319 12:08:28.623679 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podStartSLOduration=1.6236616860000002 podStartE2EDuration="1.623661686s" podCreationTimestamp="2026-03-19 12:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:08:28.620737713 +0000 UTC m=+866.738178550" watchObservedRunningTime="2026-03-19 12:08:28.623661686 +0000 UTC m=+866.741102473"
Mar 19 12:08:28.971055 master-0 kubenswrapper[7642]: I0319 12:08:28.970923 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:28.971055 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:28.971055 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:28.971055 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:28.971055 master-0 kubenswrapper[7642]: I0319 12:08:28.971005 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:29.971831 master-0 kubenswrapper[7642]: I0319 12:08:29.971734 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:29.971831 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:29.971831 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:29.971831 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:29.972563 master-0 kubenswrapper[7642]: I0319 12:08:29.971823 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:30.067614 master-0 kubenswrapper[7642]: I0319 12:08:30.067561 7642 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9"
Mar 19 12:08:30.068041 master-0 kubenswrapper[7642]: E0319 12:08:30.067942 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:08:30.972629 master-0 kubenswrapper[7642]: I0319 12:08:30.972537 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:30.972629 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:30.972629 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:30.972629 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:30.973366 master-0 kubenswrapper[7642]: I0319 12:08:30.972693 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:31.972075 master-0 kubenswrapper[7642]: I0319 12:08:31.971810 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:31.972075 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:31.972075 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:31.972075 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:31.972075 master-0 kubenswrapper[7642]: I0319 12:08:31.971919 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:32.971874 master-0 kubenswrapper[7642]: I0319 12:08:32.971739 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:32.971874 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:32.971874 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:32.971874 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:32.972238 master-0 kubenswrapper[7642]: I0319 12:08:32.971888 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:33.232888 master-0 kubenswrapper[7642]: I0319 12:08:33.232276 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Mar 19 12:08:33.232888 master-0 kubenswrapper[7642]: I0319 12:08:33.232547 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podUID="db92adf4-f1af-4835-a75c-5e8f63f88883" containerName="installer" containerID="cri-o://87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb" gracePeriod=30
Mar 19 12:08:33.971400 master-0 kubenswrapper[7642]: I0319 12:08:33.971311 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:33.971400 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:33.971400 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:33.971400 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:33.971400 master-0 kubenswrapper[7642]: I0319 12:08:33.971411 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:34.972566 master-0 kubenswrapper[7642]: I0319 12:08:34.972477 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:34.972566 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:34.972566 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:34.972566 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:34.973435 master-0 kubenswrapper[7642]: I0319 12:08:34.972590 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:35.971336 master-0 kubenswrapper[7642]: I0319 12:08:35.971230 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:35.971336 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:35.971336 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:35.971336 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:35.971336 master-0 kubenswrapper[7642]: I0319 12:08:35.971355 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:36.971416 master-0 kubenswrapper[7642]: I0319 12:08:36.971347 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:36.971416 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:36.971416 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:36.971416 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:36.971416 master-0 kubenswrapper[7642]: I0319 12:08:36.971416 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:37.834330 master-0 kubenswrapper[7642]: I0319 12:08:37.834220 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 19 12:08:37.835273 master-0 kubenswrapper[7642]: I0319 12:08:37.835230 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:37.848273 master-0 kubenswrapper[7642]: I0319 12:08:37.848220 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 19 12:08:37.972077 master-0 kubenswrapper[7642]: I0319 12:08:37.972019 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-var-lock\") pod \"installer-2-master-0\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:37.972077 master-0 kubenswrapper[7642]: I0319 12:08:37.972075 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:37.972789 master-0 kubenswrapper[7642]: I0319 12:08:37.972146 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kube-api-access\") pod \"installer-2-master-0\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:37.972789 master-0 kubenswrapper[7642]: I0319 12:08:37.972503 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:37.972789 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:37.972789 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:37.972789 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:37.972789 master-0 kubenswrapper[7642]: I0319 12:08:37.972561 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:38.073837 master-0 kubenswrapper[7642]: I0319 12:08:38.073730 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kube-api-access\") pod \"installer-2-master-0\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:38.074313 master-0 kubenswrapper[7642]: I0319 12:08:38.074292 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-var-lock\") pod \"installer-2-master-0\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:38.074438 master-0 kubenswrapper[7642]: I0319 12:08:38.074374 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-var-lock\") pod \"installer-2-master-0\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:38.074615 master-0 kubenswrapper[7642]: I0319 12:08:38.074572 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:38.074743 master-0 kubenswrapper[7642]: I0319 12:08:38.074722 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:38.094747 master-0 kubenswrapper[7642]: I0319 12:08:38.094566 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kube-api-access\") pod \"installer-2-master-0\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:38.180901 master-0 kubenswrapper[7642]: I0319 12:08:38.180856 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 19 12:08:38.597682 master-0 kubenswrapper[7642]: I0319 12:08:38.597601 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 19 12:08:38.696293 master-0 kubenswrapper[7642]: I0319 12:08:38.696244 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"1a4e269e-d21f-40b3-b142-54bbec1aeea6","Type":"ContainerStarted","Data":"6b2ed5cde5a96bd04e176a193d80c2f31f98249e076eb0ad51ab27f895c2af42"}
Mar 19 12:08:38.975603 master-0 kubenswrapper[7642]: I0319 12:08:38.975534 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:38.975603 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:38.975603 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:38.975603 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:38.975603 master-0 kubenswrapper[7642]: I0319 12:08:38.975597 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:39.705838 master-0 kubenswrapper[7642]: I0319 12:08:39.705736 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"1a4e269e-d21f-40b3-b142-54bbec1aeea6","Type":"ContainerStarted","Data":"1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a"}
Mar 19 12:08:39.728000 master-0 kubenswrapper[7642]: I0319 12:08:39.727889 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.727871513 podStartE2EDuration="2.727871513s" podCreationTimestamp="2026-03-19 12:08:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:08:39.72568463 +0000 UTC m=+877.843125427" watchObservedRunningTime="2026-03-19 12:08:39.727871513 +0000 UTC m=+877.845312300"
Mar 19 12:08:39.972754 master-0 kubenswrapper[7642]: I0319 12:08:39.972567 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:39.972754 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:39.972754 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:39.972754 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:39.972754 master-0 kubenswrapper[7642]: I0319 12:08:39.972686 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:40.971974 master-0 kubenswrapper[7642]: I0319 12:08:40.971908 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:40.971974 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:40.971974 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:40.971974 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:40.972622 master-0 kubenswrapper[7642]: I0319 12:08:40.971987 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:41.972899 master-0 kubenswrapper[7642]: I0319 12:08:41.972805 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:41.972899 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:41.972899 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:41.972899 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:41.972899 master-0 kubenswrapper[7642]: I0319 12:08:41.972886 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:42.971029 master-0 kubenswrapper[7642]: I0319 12:08:42.970958 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:42.971029 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:42.971029 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:42.971029 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:42.971551 master-0 kubenswrapper[7642]: I0319 12:08:42.971033 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:43.067888 master-0 kubenswrapper[7642]: I0319 12:08:43.067819 7642 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9"
Mar 19 12:08:43.068428 master-0 kubenswrapper[7642]: E0319 12:08:43.068277 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:08:43.971835 master-0 kubenswrapper[7642]: I0319 12:08:43.971778 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:43.971835 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:43.971835 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:43.971835 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:43.972166 master-0 kubenswrapper[7642]: I0319 12:08:43.971862 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:44.971476 master-0 kubenswrapper[7642]: I0319 12:08:44.971404 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:44.971476 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:44.971476 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:44.971476 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:44.971476 master-0 kubenswrapper[7642]: I0319 12:08:44.971464 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:45.971041 master-0 kubenswrapper[7642]: I0319 12:08:45.970999 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:45.971041 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:45.971041 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:45.971041 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:45.971413 master-0 kubenswrapper[7642]: I0319 12:08:45.971389 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:46.971781 master-0 kubenswrapper[7642]: I0319 12:08:46.971717 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:46.971781 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:46.971781 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:46.971781 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:46.972427 master-0 kubenswrapper[7642]: I0319 12:08:46.971788 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:47.971969 master-0 kubenswrapper[7642]: I0319 12:08:47.971896 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:47.971969 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:47.971969 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:47.971969 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:47.972693 master-0 kubenswrapper[7642]: I0319 12:08:47.972003 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:48.973462 master-0 kubenswrapper[7642]: I0319 12:08:48.973405 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:48.973462 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:48.973462 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:48.973462 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:48.974085 master-0 kubenswrapper[7642]: I0319 12:08:48.973479 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:49.971256 master-0 kubenswrapper[7642]: I0319 12:08:49.971170 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:08:49.971256 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:08:49.971256 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:08:49.971256 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:08:49.971583 master-0 kubenswrapper[7642]: I0319 12:08:49.971283 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:08:50.228883 master-0 kubenswrapper[7642]: I0319 12:08:50.228743 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 19 12:08:50.229402 master-0 kubenswrapper[7642]: I0319 12:08:50.228931 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="1a4e269e-d21f-40b3-b142-54bbec1aeea6" containerName="installer" containerID="cri-o://1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a" gracePeriod=30 Mar 19 12:08:50.659109 master-0 kubenswrapper[7642]: I0319 12:08:50.658574 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_1a4e269e-d21f-40b3-b142-54bbec1aeea6/installer/0.log" Mar 19 12:08:50.659109 master-0 kubenswrapper[7642]: I0319 12:08:50.658684 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 12:08:50.757718 master-0 kubenswrapper[7642]: I0319 12:08:50.757536 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kube-api-access\") pod \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " Mar 19 12:08:50.757718 master-0 kubenswrapper[7642]: I0319 12:08:50.757628 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-var-lock\") pod \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " Mar 19 12:08:50.758040 master-0 kubenswrapper[7642]: I0319 12:08:50.757827 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kubelet-dir\") pod \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\" (UID: \"1a4e269e-d21f-40b3-b142-54bbec1aeea6\") " Mar 19 12:08:50.758040 master-0 kubenswrapper[7642]: I0319 12:08:50.757943 7642 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-var-lock" (OuterVolumeSpecName: "var-lock") pod "1a4e269e-d21f-40b3-b142-54bbec1aeea6" (UID: "1a4e269e-d21f-40b3-b142-54bbec1aeea6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:08:50.758557 master-0 kubenswrapper[7642]: I0319 12:08:50.757974 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1a4e269e-d21f-40b3-b142-54bbec1aeea6" (UID: "1a4e269e-d21f-40b3-b142-54bbec1aeea6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:08:50.758557 master-0 kubenswrapper[7642]: I0319 12:08:50.758213 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:50.762088 master-0 kubenswrapper[7642]: I0319 12:08:50.761994 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1a4e269e-d21f-40b3-b142-54bbec1aeea6" (UID: "1a4e269e-d21f-40b3-b142-54bbec1aeea6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:08:50.786272 master-0 kubenswrapper[7642]: I0319 12:08:50.786204 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_1a4e269e-d21f-40b3-b142-54bbec1aeea6/installer/0.log" Mar 19 12:08:50.786425 master-0 kubenswrapper[7642]: I0319 12:08:50.786323 7642 generic.go:334] "Generic (PLEG): container finished" podID="1a4e269e-d21f-40b3-b142-54bbec1aeea6" containerID="1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a" exitCode=1 Mar 19 12:08:50.786425 master-0 kubenswrapper[7642]: I0319 12:08:50.786368 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"1a4e269e-d21f-40b3-b142-54bbec1aeea6","Type":"ContainerDied","Data":"1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a"} Mar 19 12:08:50.786425 master-0 kubenswrapper[7642]: I0319 12:08:50.786405 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"1a4e269e-d21f-40b3-b142-54bbec1aeea6","Type":"ContainerDied","Data":"6b2ed5cde5a96bd04e176a193d80c2f31f98249e076eb0ad51ab27f895c2af42"} Mar 19 12:08:50.786564 master-0 kubenswrapper[7642]: I0319 12:08:50.786432 7642 scope.go:117] "RemoveContainer" containerID="1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a" Mar 19 12:08:50.786564 master-0 kubenswrapper[7642]: I0319 12:08:50.786439 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 19 12:08:50.804756 master-0 kubenswrapper[7642]: I0319 12:08:50.804706 7642 scope.go:117] "RemoveContainer" containerID="1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a" Mar 19 12:08:50.805232 master-0 kubenswrapper[7642]: E0319 12:08:50.805187 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a\": container with ID starting with 1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a not found: ID does not exist" containerID="1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a" Mar 19 12:08:50.805334 master-0 kubenswrapper[7642]: I0319 12:08:50.805236 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a"} err="failed to get container status \"1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a\": rpc error: code = NotFound desc = could not find container \"1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a\": container with ID starting with 1f574102d8e0d1dd204a29b5df4c632223027531a716fee3a4d5442279e2816a not found: ID does not exist" Mar 19 12:08:50.843130 master-0 kubenswrapper[7642]: I0319 12:08:50.841190 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 12:08:50.846001 master-0 kubenswrapper[7642]: I0319 12:08:50.845923 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 19 12:08:50.859906 master-0 kubenswrapper[7642]: I0319 12:08:50.859705 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 
19 12:08:50.859906 master-0 kubenswrapper[7642]: I0319 12:08:50.859758 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a4e269e-d21f-40b3-b142-54bbec1aeea6-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:50.972094 master-0 kubenswrapper[7642]: I0319 12:08:50.972006 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:50.972094 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:50.972094 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:50.972094 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:50.972544 master-0 kubenswrapper[7642]: I0319 12:08:50.972102 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:51.971400 master-0 kubenswrapper[7642]: I0319 12:08:51.971323 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:51.971400 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:51.971400 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:51.971400 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:51.972426 master-0 kubenswrapper[7642]: I0319 12:08:51.972395 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:52.075996 master-0 kubenswrapper[7642]: I0319 12:08:52.075926 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a4e269e-d21f-40b3-b142-54bbec1aeea6" path="/var/lib/kubelet/pods/1a4e269e-d21f-40b3-b142-54bbec1aeea6/volumes" Mar 19 12:08:52.971792 master-0 kubenswrapper[7642]: I0319 12:08:52.971595 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:52.971792 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:52.971792 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:52.971792 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:52.972604 master-0 kubenswrapper[7642]: I0319 12:08:52.971872 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:53.972280 master-0 kubenswrapper[7642]: I0319 12:08:53.972176 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:53.972280 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:53.972280 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:53.972280 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:53.972280 master-0 kubenswrapper[7642]: I0319 12:08:53.972255 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:54.034361 master-0 kubenswrapper[7642]: I0319 12:08:54.034291 7642 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 19 12:08:54.034716 master-0 kubenswrapper[7642]: E0319 12:08:54.034631 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4e269e-d21f-40b3-b142-54bbec1aeea6" containerName="installer" Mar 19 12:08:54.034716 master-0 kubenswrapper[7642]: I0319 12:08:54.034666 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4e269e-d21f-40b3-b142-54bbec1aeea6" containerName="installer" Mar 19 12:08:54.034983 master-0 kubenswrapper[7642]: I0319 12:08:54.034815 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a4e269e-d21f-40b3-b142-54bbec1aeea6" containerName="installer" Mar 19 12:08:54.035348 master-0 kubenswrapper[7642]: I0319 12:08:54.035314 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.052816 master-0 kubenswrapper[7642]: I0319 12:08:54.051343 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 19 12:08:54.105474 master-0 kubenswrapper[7642]: I0319 12:08:54.105396 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.105823 master-0 kubenswrapper[7642]: I0319 12:08:54.105801 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-var-lock\") pod \"installer-3-master-0\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.106011 master-0 kubenswrapper[7642]: I0319 12:08:54.105989 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kube-api-access\") pod \"installer-3-master-0\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.206792 master-0 kubenswrapper[7642]: I0319 12:08:54.206724 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-var-lock\") pod \"installer-3-master-0\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.206993 master-0 kubenswrapper[7642]: I0319 12:08:54.206817 7642 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kube-api-access\") pod \"installer-3-master-0\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.206993 master-0 kubenswrapper[7642]: I0319 12:08:54.206883 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.207060 master-0 kubenswrapper[7642]: I0319 12:08:54.206999 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-var-lock\") pod \"installer-3-master-0\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.207205 master-0 kubenswrapper[7642]: I0319 12:08:54.207153 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.237522 master-0 kubenswrapper[7642]: I0319 12:08:54.237366 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kube-api-access\") pod \"installer-3-master-0\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.368550 master-0 kubenswrapper[7642]: I0319 12:08:54.368454 7642 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:08:54.883715 master-0 kubenswrapper[7642]: I0319 12:08:54.870123 7642 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 19 12:08:54.971461 master-0 kubenswrapper[7642]: I0319 12:08:54.970989 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:54.971461 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:54.971461 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:54.971461 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:54.971461 master-0 kubenswrapper[7642]: I0319 12:08:54.971087 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:55.068917 master-0 kubenswrapper[7642]: I0319 12:08:55.068867 7642 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" Mar 19 12:08:55.069492 master-0 kubenswrapper[7642]: E0319 12:08:55.069075 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" Mar 19 12:08:55.835382 master-0 kubenswrapper[7642]: I0319 12:08:55.835309 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"838a469d-2c27-46db-9d6f-17bfa1c8dd09","Type":"ContainerStarted","Data":"bd4cbdd654e294cf12e1078774a4064deac39d97874d1f397bebdf9625d6fe36"} Mar 19 12:08:55.835382 master-0 kubenswrapper[7642]: I0319 12:08:55.835370 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"838a469d-2c27-46db-9d6f-17bfa1c8dd09","Type":"ContainerStarted","Data":"7d03f7fa3a14d9d32f0d91cb4cc6bc9bd274055b488260ab0433e18c2052b2ac"} Mar 19 12:08:55.863799 master-0 kubenswrapper[7642]: I0319 12:08:55.863670 7642 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=1.863623211 podStartE2EDuration="1.863623211s" podCreationTimestamp="2026-03-19 12:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:08:55.860514842 +0000 UTC m=+893.977955669" watchObservedRunningTime="2026-03-19 12:08:55.863623211 +0000 UTC m=+893.981064018" Mar 19 12:08:55.971405 master-0 kubenswrapper[7642]: I0319 12:08:55.971350 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:55.971405 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:55.971405 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:55.971405 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:55.971700 master-0 kubenswrapper[7642]: I0319 12:08:55.971434 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Mar 19 12:08:56.846535 master-0 kubenswrapper[7642]: I0319 12:08:56.846466 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/4.log" Mar 19 12:08:56.848574 master-0 kubenswrapper[7642]: I0319 12:08:56.847494 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/3.log" Mar 19 12:08:56.848574 master-0 kubenswrapper[7642]: I0319 12:08:56.848134 7642 generic.go:334] "Generic (PLEG): container finished" podID="f38df702-a256-4f83-90bb-0c0e477a2ce6" containerID="2585e3c8a21fe77ef0a54c935bf49a872dc17099314a87be4e929dac1b0cd111" exitCode=1 Mar 19 12:08:56.848574 master-0 kubenswrapper[7642]: I0319 12:08:56.848174 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerDied","Data":"2585e3c8a21fe77ef0a54c935bf49a872dc17099314a87be4e929dac1b0cd111"} Mar 19 12:08:56.848574 master-0 kubenswrapper[7642]: I0319 12:08:56.848391 7642 scope.go:117] "RemoveContainer" containerID="266d45496ee879b60eaf981de6871924a55070f988e506147e462e03b0fc4b42" Mar 19 12:08:56.849420 master-0 kubenswrapper[7642]: I0319 12:08:56.849339 7642 scope.go:117] "RemoveContainer" containerID="2585e3c8a21fe77ef0a54c935bf49a872dc17099314a87be4e929dac1b0cd111" Mar 19 12:08:56.850196 master-0 kubenswrapper[7642]: E0319 12:08:56.850019 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-59ptg_openshift-ingress-operator(f38df702-a256-4f83-90bb-0c0e477a2ce6)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" 
podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6" Mar 19 12:08:56.972873 master-0 kubenswrapper[7642]: I0319 12:08:56.972822 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:56.972873 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:56.972873 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:56.972873 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:56.973226 master-0 kubenswrapper[7642]: I0319 12:08:56.972886 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:57.860113 master-0 kubenswrapper[7642]: I0319 12:08:57.860034 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/4.log" Mar 19 12:08:57.972105 master-0 kubenswrapper[7642]: I0319 12:08:57.972002 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:57.972105 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:57.972105 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:57.972105 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:57.972619 master-0 kubenswrapper[7642]: I0319 12:08:57.972108 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:58.972593 master-0 kubenswrapper[7642]: I0319 12:08:58.972508 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:58.972593 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:58.972593 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:58.972593 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:58.972593 master-0 kubenswrapper[7642]: I0319 12:08:58.972578 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:08:59.324330 master-0 kubenswrapper[7642]: I0319 12:08:59.324283 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_db92adf4-f1af-4835-a75c-5e8f63f88883/installer/0.log" Mar 19 12:08:59.324610 master-0 kubenswrapper[7642]: I0319 12:08:59.324363 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 12:08:59.392655 master-0 kubenswrapper[7642]: I0319 12:08:59.392557 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db92adf4-f1af-4835-a75c-5e8f63f88883-kube-api-access\") pod \"db92adf4-f1af-4835-a75c-5e8f63f88883\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " Mar 19 12:08:59.392878 master-0 kubenswrapper[7642]: I0319 12:08:59.392675 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-kubelet-dir\") pod \"db92adf4-f1af-4835-a75c-5e8f63f88883\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " Mar 19 12:08:59.392878 master-0 kubenswrapper[7642]: I0319 12:08:59.392780 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-var-lock\") pod \"db92adf4-f1af-4835-a75c-5e8f63f88883\" (UID: \"db92adf4-f1af-4835-a75c-5e8f63f88883\") " Mar 19 12:08:59.393046 master-0 kubenswrapper[7642]: I0319 12:08:59.392978 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "db92adf4-f1af-4835-a75c-5e8f63f88883" (UID: "db92adf4-f1af-4835-a75c-5e8f63f88883"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:08:59.393092 master-0 kubenswrapper[7642]: I0319 12:08:59.393013 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-var-lock" (OuterVolumeSpecName: "var-lock") pod "db92adf4-f1af-4835-a75c-5e8f63f88883" (UID: "db92adf4-f1af-4835-a75c-5e8f63f88883"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:08:59.393203 master-0 kubenswrapper[7642]: I0319 12:08:59.393176 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:59.393203 master-0 kubenswrapper[7642]: I0319 12:08:59.393197 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/db92adf4-f1af-4835-a75c-5e8f63f88883-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:59.395464 master-0 kubenswrapper[7642]: I0319 12:08:59.395408 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db92adf4-f1af-4835-a75c-5e8f63f88883-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "db92adf4-f1af-4835-a75c-5e8f63f88883" (UID: "db92adf4-f1af-4835-a75c-5e8f63f88883"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:08:59.495048 master-0 kubenswrapper[7642]: I0319 12:08:59.494976 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db92adf4-f1af-4835-a75c-5e8f63f88883-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:08:59.882991 master-0 kubenswrapper[7642]: I0319 12:08:59.882934 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_db92adf4-f1af-4835-a75c-5e8f63f88883/installer/0.log" Mar 19 12:08:59.882991 master-0 kubenswrapper[7642]: I0319 12:08:59.882994 7642 generic.go:334] "Generic (PLEG): container finished" podID="db92adf4-f1af-4835-a75c-5e8f63f88883" containerID="87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb" exitCode=1 Mar 19 12:08:59.883383 master-0 kubenswrapper[7642]: I0319 12:08:59.883027 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"db92adf4-f1af-4835-a75c-5e8f63f88883","Type":"ContainerDied","Data":"87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb"} Mar 19 12:08:59.883383 master-0 kubenswrapper[7642]: I0319 12:08:59.883056 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"db92adf4-f1af-4835-a75c-5e8f63f88883","Type":"ContainerDied","Data":"a7846a09fe07bc01645f478e3c80661b723e32e9f6ed0e704679f39070079ae5"} Mar 19 12:08:59.883383 master-0 kubenswrapper[7642]: I0319 12:08:59.883076 7642 scope.go:117] "RemoveContainer" containerID="87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb" Mar 19 12:08:59.883383 master-0 kubenswrapper[7642]: I0319 12:08:59.883105 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 19 12:08:59.905016 master-0 kubenswrapper[7642]: I0319 12:08:59.904853 7642 scope.go:117] "RemoveContainer" containerID="87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb" Mar 19 12:08:59.905291 master-0 kubenswrapper[7642]: E0319 12:08:59.905230 7642 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb\": container with ID starting with 87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb not found: ID does not exist" containerID="87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb" Mar 19 12:08:59.905379 master-0 kubenswrapper[7642]: I0319 12:08:59.905284 7642 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb"} err="failed to get container status \"87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb\": rpc error: code = NotFound desc = could not find container \"87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb\": container with ID starting with 87fb641feb1b0d9b4dc1fceefe365d22825578bd3ce3490b3f755bcd5405d4eb not found: ID does not exist" Mar 19 12:08:59.942514 master-0 kubenswrapper[7642]: I0319 12:08:59.942418 7642 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 19 12:08:59.949801 master-0 kubenswrapper[7642]: I0319 12:08:59.949719 7642 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Mar 19 12:08:59.971512 master-0 kubenswrapper[7642]: I0319 12:08:59.971450 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:08:59.971512 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:08:59.971512 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:08:59.971512 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:08:59.971512 master-0 kubenswrapper[7642]: I0319 12:08:59.971507 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:00.077308 master-0 kubenswrapper[7642]: I0319 12:09:00.077209 7642 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db92adf4-f1af-4835-a75c-5e8f63f88883" path="/var/lib/kubelet/pods/db92adf4-f1af-4835-a75c-5e8f63f88883/volumes" Mar 19 12:09:00.971692 master-0 kubenswrapper[7642]: I0319 12:09:00.971557 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:00.971692 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:00.971692 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:00.971692 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:00.971692 master-0 kubenswrapper[7642]: I0319 12:09:00.971664 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:01.972383 master-0 kubenswrapper[7642]: I0319 12:09:01.972275 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:01.972383 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:01.972383 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:01.972383 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:01.972383 master-0 kubenswrapper[7642]: I0319 12:09:01.972377 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:02.971972 master-0 kubenswrapper[7642]: I0319 12:09:02.971931 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:02.971972 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:02.971972 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:02.971972 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:02.972470 master-0 kubenswrapper[7642]: I0319 12:09:02.972427 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:03.972463 master-0 kubenswrapper[7642]: I0319 12:09:03.972394 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:03.972463 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 
12:09:03.972463 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:03.972463 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:03.973300 master-0 kubenswrapper[7642]: I0319 12:09:03.973220 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:04.971185 master-0 kubenswrapper[7642]: I0319 12:09:04.971114 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:04.971185 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:04.971185 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:04.971185 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:04.971185 master-0 kubenswrapper[7642]: I0319 12:09:04.971175 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:05.972811 master-0 kubenswrapper[7642]: I0319 12:09:05.972736 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:05.972811 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:05.972811 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:05.972811 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:05.973495 master-0 kubenswrapper[7642]: I0319 12:09:05.972827 
7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:06.972509 master-0 kubenswrapper[7642]: I0319 12:09:06.972364 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:06.972509 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:06.972509 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:06.972509 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:06.972830 master-0 kubenswrapper[7642]: I0319 12:09:06.972490 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:07.970978 master-0 kubenswrapper[7642]: I0319 12:09:07.970904 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:07.970978 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:07.970978 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:07.970978 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:07.970978 master-0 kubenswrapper[7642]: I0319 12:09:07.970975 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Mar 19 12:09:08.068189 master-0 kubenswrapper[7642]: I0319 12:09:08.068097 7642 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" Mar 19 12:09:08.068588 master-0 kubenswrapper[7642]: E0319 12:09:08.068536 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" Mar 19 12:09:08.971794 master-0 kubenswrapper[7642]: I0319 12:09:08.971715 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:08.971794 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:08.971794 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:08.971794 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:08.971794 master-0 kubenswrapper[7642]: I0319 12:09:08.971793 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:09.971962 master-0 kubenswrapper[7642]: I0319 12:09:09.971869 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:09.971962 master-0 
kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:09.971962 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:09.971962 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:09.973752 master-0 kubenswrapper[7642]: I0319 12:09:09.971971 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:10.068582 master-0 kubenswrapper[7642]: I0319 12:09:10.068508 7642 scope.go:117] "RemoveContainer" containerID="2585e3c8a21fe77ef0a54c935bf49a872dc17099314a87be4e929dac1b0cd111" Mar 19 12:09:10.068848 master-0 kubenswrapper[7642]: E0319 12:09:10.068822 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-59ptg_openshift-ingress-operator(f38df702-a256-4f83-90bb-0c0e477a2ce6)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6" Mar 19 12:09:10.970988 master-0 kubenswrapper[7642]: I0319 12:09:10.970904 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:10.970988 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:10.970988 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:10.970988 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:10.971468 master-0 kubenswrapper[7642]: I0319 12:09:10.970992 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:11.975827 master-0 kubenswrapper[7642]: I0319 12:09:11.975676 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:11.975827 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:11.975827 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:11.975827 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:11.975827 master-0 kubenswrapper[7642]: I0319 12:09:11.975761 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:12.971955 master-0 kubenswrapper[7642]: I0319 12:09:12.971890 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:12.971955 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:12.971955 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:12.971955 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:12.972369 master-0 kubenswrapper[7642]: I0319 12:09:12.971968 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:13.971247 master-0 kubenswrapper[7642]: I0319 12:09:13.971174 7642 
patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:13.971247 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:13.971247 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:13.971247 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:13.971963 master-0 kubenswrapper[7642]: I0319 12:09:13.971316 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:14.971411 master-0 kubenswrapper[7642]: I0319 12:09:14.971348 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:14.971411 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:14.971411 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:14.971411 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:14.972188 master-0 kubenswrapper[7642]: I0319 12:09:14.971416 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:15.972706 master-0 kubenswrapper[7642]: I0319 12:09:15.972592 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:15.972706 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:15.972706 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:15.972706 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:15.972706 master-0 kubenswrapper[7642]: I0319 12:09:15.972708 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:16.972439 master-0 kubenswrapper[7642]: I0319 12:09:16.972343 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:16.972439 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:16.972439 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:16.972439 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:16.973848 master-0 kubenswrapper[7642]: I0319 12:09:16.972444 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:17.971443 master-0 kubenswrapper[7642]: I0319 12:09:17.971358 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:17.971443 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:17.971443 master-0 kubenswrapper[7642]: [+]process-running ok 
Mar 19 12:09:17.971443 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:17.971881 master-0 kubenswrapper[7642]: I0319 12:09:17.971456 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:18.971383 master-0 kubenswrapper[7642]: I0319 12:09:18.971308 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:18.971383 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:18.971383 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:18.971383 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:18.971383 master-0 kubenswrapper[7642]: I0319 12:09:18.971377 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:19.067616 master-0 kubenswrapper[7642]: I0319 12:09:19.067540 7642 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" Mar 19 12:09:19.973910 master-0 kubenswrapper[7642]: I0319 12:09:19.973832 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:19.973910 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:19.973910 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:19.973910 
master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:19.974469 master-0 kubenswrapper[7642]: I0319 12:09:19.973920 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:20.031566 master-0 kubenswrapper[7642]: I0319 12:09:20.031486 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log" Mar 19 12:09:20.033221 master-0 kubenswrapper[7642]: I0319 12:09:20.033180 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log" Mar 19 12:09:20.033290 master-0 kubenswrapper[7642]: I0319 12:09:20.033255 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"85ed437129c34727828994007c76166519f0d49369f91dbf9d5e4dc1693434d5"} Mar 19 12:09:20.971595 master-0 kubenswrapper[7642]: I0319 12:09:20.971537 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:20.971595 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:20.971595 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:20.971595 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:20.972107 master-0 kubenswrapper[7642]: I0319 12:09:20.972076 7642 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:21.970812 master-0 kubenswrapper[7642]: I0319 12:09:21.970726 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:21.970812 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:21.970812 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:21.970812 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:21.970812 master-0 kubenswrapper[7642]: I0319 12:09:21.970804 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:22.972683 master-0 kubenswrapper[7642]: I0319 12:09:22.972583 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:22.972683 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:22.972683 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:22.972683 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:22.973758 master-0 kubenswrapper[7642]: I0319 12:09:22.972713 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:23.972009 
master-0 kubenswrapper[7642]: I0319 12:09:23.971933 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:23.972009 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:23.972009 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:23.972009 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:23.972292 master-0 kubenswrapper[7642]: I0319 12:09:23.972025 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:24.068067 master-0 kubenswrapper[7642]: I0319 12:09:24.067959 7642 scope.go:117] "RemoveContainer" containerID="2585e3c8a21fe77ef0a54c935bf49a872dc17099314a87be4e929dac1b0cd111" Mar 19 12:09:24.069104 master-0 kubenswrapper[7642]: E0319 12:09:24.068336 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-59ptg_openshift-ingress-operator(f38df702-a256-4f83-90bb-0c0e477a2ce6)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6" Mar 19 12:09:24.971958 master-0 kubenswrapper[7642]: I0319 12:09:24.971873 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:24.971958 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:24.971958 master-0 
kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:24.971958 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:24.972549 master-0 kubenswrapper[7642]: I0319 12:09:24.971972 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:09:25.135314 master-0 kubenswrapper[7642]: I0319 12:09:25.135235 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:09:25.136213 master-0 kubenswrapper[7642]: I0319 12:09:25.136149 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:09:25.140171 master-0 kubenswrapper[7642]: I0319 12:09:25.140136 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:09:25.971159 master-0 kubenswrapper[7642]: I0319 12:09:25.971029 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 12:09:25.971159 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld Mar 19 12:09:25.971159 master-0 kubenswrapper[7642]: [+]process-running ok Mar 19 12:09:25.971159 master-0 kubenswrapper[7642]: healthz check failed Mar 19 12:09:25.971159 master-0 kubenswrapper[7642]: I0319 12:09:25.971090 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 
Mar 19 12:09:26.971084 master-0 kubenswrapper[7642]: I0319 12:09:26.971018 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:09:26.971084 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:09:26.971084 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:09:26.971084 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:09:26.971679 master-0 kubenswrapper[7642]: I0319 12:09:26.971101 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:09:27.971418 master-0 kubenswrapper[7642]: I0319 12:09:27.971359 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:09:27.971418 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:09:27.971418 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:09:27.971418 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:09:27.972264 master-0 kubenswrapper[7642]: I0319 12:09:27.971440 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:09:28.971460 master-0 kubenswrapper[7642]: I0319 12:09:28.971350 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:09:28.971460 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:09:28.971460 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:09:28.971460 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:09:28.971460 master-0 kubenswrapper[7642]: I0319 12:09:28.971435 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:09:29.972033 master-0 kubenswrapper[7642]: I0319 12:09:29.971910 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:09:29.972033 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:09:29.972033 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:09:29.972033 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:09:29.972778 master-0 kubenswrapper[7642]: I0319 12:09:29.972078 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:09:30.971946 master-0 kubenswrapper[7642]: I0319 12:09:30.971849 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:09:30.971946 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:09:30.971946 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:09:30.971946 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:09:30.971946 master-0 kubenswrapper[7642]: I0319 12:09:30.971945 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:09:31.971107 master-0 kubenswrapper[7642]: I0319 12:09:31.971006 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:09:31.971107 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:09:31.971107 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:09:31.971107 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:09:31.971606 master-0 kubenswrapper[7642]: I0319 12:09:31.971121 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:09:32.971381 master-0 kubenswrapper[7642]: I0319 12:09:32.971284 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:09:32.971381 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:09:32.971381 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:09:32.971381 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:09:32.972325 master-0 kubenswrapper[7642]: I0319 12:09:32.972264 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:09:33.972676 master-0 kubenswrapper[7642]: I0319 12:09:33.972579 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:09:33.972676 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:09:33.972676 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:09:33.972676 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:09:33.972676 master-0 kubenswrapper[7642]: I0319 12:09:33.972680 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:09:34.970492 master-0 kubenswrapper[7642]: I0319 12:09:34.970441 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:09:34.970492 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:09:34.970492 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:09:34.970492 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:09:34.970841 master-0 kubenswrapper[7642]: I0319 12:09:34.970504 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:09:35.138533 master-0 kubenswrapper[7642]: I0319 12:09:35.138367 7642 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:09:35.972005 master-0 kubenswrapper[7642]: I0319 12:09:35.971924 7642 patch_prober.go:28] interesting pod/router-default-7dcf5569b5-47tqf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 12:09:35.972005 master-0 kubenswrapper[7642]: [-]has-synced failed: reason withheld
Mar 19 12:09:35.972005 master-0 kubenswrapper[7642]: [+]process-running ok
Mar 19 12:09:35.972005 master-0 kubenswrapper[7642]: healthz check failed
Mar 19 12:09:35.972469 master-0 kubenswrapper[7642]: I0319 12:09:35.972018 7642 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 12:09:35.972469 master-0 kubenswrapper[7642]: I0319 12:09:35.972090 7642 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:09:35.973036 master-0 kubenswrapper[7642]: I0319 12:09:35.972975 7642 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"302da25d88feed044717793a912a25df7753b2c33b9c9627185444f708131b30"} pod="openshift-ingress/router-default-7dcf5569b5-47tqf" containerMessage="Container router failed startup probe, will be restarted"
Mar 19 12:09:35.973136 master-0 kubenswrapper[7642]: I0319 12:09:35.973050 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" podUID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerName="router" containerID="cri-o://302da25d88feed044717793a912a25df7753b2c33b9c9627185444f708131b30" gracePeriod=3600
Mar 19 12:09:39.068106 master-0 kubenswrapper[7642]: I0319 12:09:39.068019 7642 scope.go:117] "RemoveContainer" containerID="2585e3c8a21fe77ef0a54c935bf49a872dc17099314a87be4e929dac1b0cd111"
Mar 19 12:09:39.068782 master-0 kubenswrapper[7642]: E0319 12:09:39.068573 7642 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-66b84d69b-59ptg_openshift-ingress-operator(f38df702-a256-4f83-90bb-0c0e477a2ce6)\"" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" podUID="f38df702-a256-4f83-90bb-0c0e477a2ce6"
Mar 19 12:09:43.064490 master-0 kubenswrapper[7642]: I0319 12:09:43.064431 7642 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:09:43.065138 master-0 kubenswrapper[7642]: E0319 12:09:43.064733 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db92adf4-f1af-4835-a75c-5e8f63f88883" containerName="installer"
Mar 19 12:09:43.065138 master-0 kubenswrapper[7642]: I0319 12:09:43.064745 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="db92adf4-f1af-4835-a75c-5e8f63f88883" containerName="installer"
Mar 19 12:09:43.065138 master-0 kubenswrapper[7642]: I0319 12:09:43.064868 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="db92adf4-f1af-4835-a75c-5e8f63f88883" containerName="installer"
Mar 19 12:09:43.065384 master-0 kubenswrapper[7642]: I0319 12:09:43.065280 7642 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 19 12:09:43.065476 master-0 kubenswrapper[7642]: I0319 12:09:43.065414 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.065664 master-0 kubenswrapper[7642]: I0319 12:09:43.065599 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver" containerID="cri-o://400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072" gracePeriod=15
Mar 19 12:09:43.065746 master-0 kubenswrapper[7642]: I0319 12:09:43.065698 7642 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885" gracePeriod=15
Mar 19 12:09:43.066852 master-0 kubenswrapper[7642]: I0319 12:09:43.066581 7642 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 12:09:43.066998 master-0 kubenswrapper[7642]: E0319 12:09:43.066967 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 19 12:09:43.066998 master-0 kubenswrapper[7642]: I0319 12:09:43.066984 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 19 12:09:43.066998 master-0 kubenswrapper[7642]: E0319 12:09:43.066995 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 19 12:09:43.067154 master-0 kubenswrapper[7642]: I0319 12:09:43.067001 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 19 12:09:43.067154 master-0 kubenswrapper[7642]: E0319 12:09:43.067034 7642 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:09:43.067154 master-0 kubenswrapper[7642]: I0319 12:09:43.067040 7642 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:09:43.067154 master-0 kubenswrapper[7642]: I0319 12:09:43.067157 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 19 12:09:43.067308 master-0 kubenswrapper[7642]: I0319 12:09:43.067172 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:09:43.067308 master-0 kubenswrapper[7642]: I0319 12:09:43.067184 7642 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 19 12:09:43.069333 master-0 kubenswrapper[7642]: I0319 12:09:43.069288 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.115185 master-0 kubenswrapper[7642]: E0319 12:09:43.115100 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.133446 master-0 kubenswrapper[7642]: E0319 12:09:43.133394 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.191957 master-0 kubenswrapper[7642]: I0319 12:09:43.191897 7642 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885" exitCode=0
Mar 19 12:09:43.262527 master-0 kubenswrapper[7642]: I0319 12:09:43.262439 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.262799 master-0 kubenswrapper[7642]: I0319 12:09:43.262675 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.262881 master-0 kubenswrapper[7642]: I0319 12:09:43.262804 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.263052 master-0 kubenswrapper[7642]: I0319 12:09:43.263003 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.263170 master-0 kubenswrapper[7642]: I0319 12:09:43.263064 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.263248 master-0 kubenswrapper[7642]: I0319 12:09:43.263169 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.263317 master-0 kubenswrapper[7642]: I0319 12:09:43.263266 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.263389 master-0 kubenswrapper[7642]: I0319 12:09:43.263350 7642 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.365234 master-0 kubenswrapper[7642]: I0319 12:09:43.364955 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.365234 master-0 kubenswrapper[7642]: I0319 12:09:43.365192 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.365234 master-0 kubenswrapper[7642]: I0319 12:09:43.365091 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.365524 master-0 kubenswrapper[7642]: I0319 12:09:43.365266 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.365524 master-0 kubenswrapper[7642]: I0319 12:09:43.365303 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.365524 master-0 kubenswrapper[7642]: I0319 12:09:43.365324 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.365524 master-0 kubenswrapper[7642]: I0319 12:09:43.365398 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.365524 master-0 kubenswrapper[7642]: I0319 12:09:43.365450 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.365524 master-0 kubenswrapper[7642]: I0319 12:09:43.365473 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.365524 master-0 kubenswrapper[7642]: I0319 12:09:43.365505 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.365897 master-0 kubenswrapper[7642]: I0319 12:09:43.365544 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.365897 master-0 kubenswrapper[7642]: I0319 12:09:43.365547 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.365897 master-0 kubenswrapper[7642]: I0319 12:09:43.365601 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.365897 master-0 kubenswrapper[7642]: I0319 12:09:43.365662 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.365897 master-0 kubenswrapper[7642]: I0319 12:09:43.365704 7642 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.365897 master-0 kubenswrapper[7642]: I0319 12:09:43.365742 7642 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.416256 master-0 kubenswrapper[7642]: I0319 12:09:43.416070 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:43.434208 master-0 kubenswrapper[7642]: I0319 12:09:43.434141 7642 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:43.471798 master-0 kubenswrapper[7642]: W0319 12:09:43.471711 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7a82869988463543d3d8dd1f0b5fe3.slice/crio-8dd5c45a98fabae2b43541f63682ae158d8af993ab1c5516b535075299ccb403 WatchSource:0}: Error finding container 8dd5c45a98fabae2b43541f63682ae158d8af993ab1c5516b535075299ccb403: Status 404 returned error can't find the container with id 8dd5c45a98fabae2b43541f63682ae158d8af993ab1c5516b535075299ccb403
Mar 19 12:09:43.479296 master-0 kubenswrapper[7642]: E0319 12:09:43.479075 7642 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189e3cd3c561d956 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:8e7a82869988463543d3d8dd1f0b5fe3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c5ce3d1134d6500e2b8528516c1889d7bbc6259aba4981c6983395b0e9eeff65\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:09:43.47784431 +0000 UTC m=+941.595285097,LastTimestamp:2026-03-19 12:09:43.47784431 +0000 UTC m=+941.595285097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 19 12:09:43.483392 master-0 kubenswrapper[7642]: W0319 12:09:43.483342 7642 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45ea2ef1cf2bc9d1d994d6538ae0a64.slice/crio-903b634267c4d03a793b4b69c05dfc5d8a92f5023abbef9a61cb574e83dc2bee WatchSource:0}: Error finding container 903b634267c4d03a793b4b69c05dfc5d8a92f5023abbef9a61cb574e83dc2bee: Status 404 returned error can't find the container with id 903b634267c4d03a793b4b69c05dfc5d8a92f5023abbef9a61cb574e83dc2bee
Mar 19 12:09:44.199209 master-0 kubenswrapper[7642]: I0319 12:09:44.199134 7642 generic.go:334] "Generic (PLEG): container finished" podID="838a469d-2c27-46db-9d6f-17bfa1c8dd09" containerID="bd4cbdd654e294cf12e1078774a4064deac39d97874d1f397bebdf9625d6fe36" exitCode=0
Mar 19 12:09:44.200162 master-0 kubenswrapper[7642]: I0319 12:09:44.199203 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"838a469d-2c27-46db-9d6f-17bfa1c8dd09","Type":"ContainerDied","Data":"bd4cbdd654e294cf12e1078774a4064deac39d97874d1f397bebdf9625d6fe36"}
Mar 19 12:09:44.200354 master-0 kubenswrapper[7642]: I0319 12:09:44.200267 7642 status_manager.go:851] "Failed to get status for pod" podUID="838a469d-2c27-46db-9d6f-17bfa1c8dd09" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:09:44.201187 master-0 kubenswrapper[7642]: I0319 12:09:44.201112 7642 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363" exitCode=0
Mar 19 12:09:44.201350 master-0 kubenswrapper[7642]: I0319 12:09:44.201208 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerDied","Data":"d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363"}
Mar 19 12:09:44.201350 master-0 kubenswrapper[7642]: I0319 12:09:44.201243 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"903b634267c4d03a793b4b69c05dfc5d8a92f5023abbef9a61cb574e83dc2bee"}
Mar 19 12:09:44.202433 master-0 kubenswrapper[7642]: I0319 12:09:44.202332 7642 status_manager.go:851] "Failed to get status for pod" podUID="838a469d-2c27-46db-9d6f-17bfa1c8dd09" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:09:44.202433 master-0 kubenswrapper[7642]: E0319 12:09:44.202349 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:09:44.202745 master-0 kubenswrapper[7642]: I0319 12:09:44.202711 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595"}
Mar 19 12:09:44.202856 master-0 kubenswrapper[7642]: I0319 12:09:44.202737 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"8dd5c45a98fabae2b43541f63682ae158d8af993ab1c5516b535075299ccb403"}
Mar 19 12:09:44.204315 master-0 kubenswrapper[7642]: E0319 12:09:44.203597 7642 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:09:44.204315 master-0 kubenswrapper[7642]: I0319 12:09:44.204233 7642 status_manager.go:851] "Failed to get status for pod" podUID="838a469d-2c27-46db-9d6f-17bfa1c8dd09" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:09:45.270794 master-0 kubenswrapper[7642]: I0319 12:09:45.268904 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a"}
Mar 19 12:09:45.271383 master-0 kubenswrapper[7642]: I0319 12:09:45.270818 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071"}
Mar 19 12:09:45.271383 master-0 kubenswrapper[7642]: I0319 12:09:45.270847 7642 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd"}
Mar 19 12:09:45.611512 master-0 kubenswrapper[7642]: I0319 12:09:45.609038 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 12:09:45.813661 master-0 kubenswrapper[7642]: I0319 12:09:45.811151 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kube-api-access\") pod \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") "
Mar 19 12:09:45.813661 master-0 kubenswrapper[7642]: I0319 12:09:45.811215 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-var-lock\") pod \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") "
Mar 19 12:09:45.813661 master-0 kubenswrapper[7642]: I0319 12:09:45.811246 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kubelet-dir\") pod \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\" (UID: \"838a469d-2c27-46db-9d6f-17bfa1c8dd09\") "
Mar 19 12:09:45.813661 master-0 kubenswrapper[7642]: I0319 12:09:45.811566 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "838a469d-2c27-46db-9d6f-17bfa1c8dd09" (UID: "838a469d-2c27-46db-9d6f-17bfa1c8dd09"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:09:45.813661 master-0 kubenswrapper[7642]: I0319 12:09:45.811600 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-var-lock" (OuterVolumeSpecName: "var-lock") pod "838a469d-2c27-46db-9d6f-17bfa1c8dd09" (UID: "838a469d-2c27-46db-9d6f-17bfa1c8dd09"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:09:45.828735 master-0 kubenswrapper[7642]: I0319 12:09:45.825823 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "838a469d-2c27-46db-9d6f-17bfa1c8dd09" (UID: "838a469d-2c27-46db-9d6f-17bfa1c8dd09"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:09:45.917666 master-0 kubenswrapper[7642]: I0319 12:09:45.912468 7642 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:45.917666 master-0 kubenswrapper[7642]: I0319 12:09:45.912510 7642 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:45.917666 master-0 kubenswrapper[7642]: I0319 12:09:45.912524 7642 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/838a469d-2c27-46db-9d6f-17bfa1c8dd09-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:09:46.060450 master-0 kubenswrapper[7642]: I0319 12:09:46.060383 7642 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 12:09:46.092564 master-0 kubenswrapper[7642]: I0319 12:09:46.092425 7642 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.217674 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.217733 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.217773 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.217807 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.217831 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: 
\"49fac1b46a11e49501805e891baae4a9\") " Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.217854 7642 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") pod \"49fac1b46a11e49501805e891baae4a9\" (UID: \"49fac1b46a11e49501805e891baae4a9\") " Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.218267 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.218323 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config" (OuterVolumeSpecName: "config") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.218338 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets" (OuterVolumeSpecName: "secrets") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.218354 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). 
InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.218353 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs" (OuterVolumeSpecName: "logs") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:46.218419 master-0 kubenswrapper[7642]: I0319 12:09:46.218372 7642 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "49fac1b46a11e49501805e891baae4a9" (UID: "49fac1b46a11e49501805e891baae4a9"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:09:46.284873 master-0 kubenswrapper[7642]: I0319 12:09:46.284787 7642 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 19 12:09:46.309732 master-0 kubenswrapper[7642]: I0319 12:09:46.309532 7642 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072" exitCode=0 Mar 19 12:09:46.309732 master-0 kubenswrapper[7642]: I0319 12:09:46.309661 7642 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 19 12:09:46.319627 master-0 kubenswrapper[7642]: I0319 12:09:46.319537 7642 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:46.319627 master-0 kubenswrapper[7642]: I0319 12:09:46.319578 7642 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-secrets\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:46.319627 master-0 kubenswrapper[7642]: I0319 12:09:46.319588 7642 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:46.319627 master-0 kubenswrapper[7642]: I0319 12:09:46.319597 7642 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:46.319627 master-0 kubenswrapper[7642]: I0319 12:09:46.319609 7642 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:46.319627 master-0 kubenswrapper[7642]: I0319 12:09:46.319619 7642 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/49fac1b46a11e49501805e891baae4a9-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:09:51.980653 master-0 kubenswrapper[7642]: I0319 12:09:51.980573 7642 request.go:700] Waited for 1.019875645s, retries: 1, retry-after: 5s - retry-reason: 503 - request: 
GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ovn-kubernetes/secrets?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dovn-control-plane-metrics-cert&resourceVersion=14058&timeout=58m49s&timeoutSeconds=3529&watch=true Mar 19 12:09:52.366661 master-0 kubenswrapper[7642]: I0319 12:09:52.363867 7642 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-check-endpoints/0.log" Mar 19 12:09:52.373660 master-0 kubenswrapper[7642]: I0319 12:09:52.372730 7642 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="76d14e466053de21cd4b44b01526664874f27eff3ad2faef8933296d68c35040" exitCode=255 Mar 19 12:09:52.569178 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 19 12:09:52.584463 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 19 12:09:52.584794 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 19 12:09:52.585839 master-0 systemd[1]: kubelet.service: Consumed 2min 19.230s CPU time. Mar 19 12:09:52.618434 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 19 12:09:52.736850 master-0 kubenswrapper[29612]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 12:09:52.736850 master-0 kubenswrapper[29612]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 12:09:52.736850 master-0 kubenswrapper[29612]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 12:09:52.737366 master-0 kubenswrapper[29612]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 12:09:52.737366 master-0 kubenswrapper[29612]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 12:09:52.737366 master-0 kubenswrapper[29612]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 12:09:52.737366 master-0 kubenswrapper[29612]: I0319 12:09:52.736988 29612 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 12:09:52.740182 master-0 kubenswrapper[29612]: W0319 12:09:52.740146 29612 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 12:09:52.740182 master-0 kubenswrapper[29612]: W0319 12:09:52.740168 29612 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 12:09:52.740182 master-0 kubenswrapper[29612]: W0319 12:09:52.740174 29612 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 12:09:52.740182 master-0 kubenswrapper[29612]: W0319 12:09:52.740178 29612 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 12:09:52.740182 master-0 kubenswrapper[29612]: W0319 12:09:52.740182 29612 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 12:09:52.740182 master-0 kubenswrapper[29612]: W0319 12:09:52.740187 29612 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 12:09:52.740182 master-0 kubenswrapper[29612]: W0319 12:09:52.740190 29612 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 12:09:52.740182 master-0 kubenswrapper[29612]: W0319 12:09:52.740196 29612 feature_gate.go:330] unrecognized feature gate: Example Mar 19 12:09:52.740182 master-0 kubenswrapper[29612]: W0319 12:09:52.740200 29612 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740205 29612 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740209 29612 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740213 29612 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740217 29612 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740220 29612 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740224 29612 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 
12:09:52.740228 29612 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740231 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740235 29612 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740239 29612 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740242 29612 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740250 29612 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740254 29612 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740259 29612 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740264 29612 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740269 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740274 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740278 29612 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 12:09:52.740499 master-0 kubenswrapper[29612]: W0319 12:09:52.740283 29612 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740287 29612 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740291 29612 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740294 29612 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740298 29612 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740301 29612 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740305 29612 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740309 29612 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740313 29612 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740317 29612 feature_gate.go:330] unrecognized feature gate: 
RouteAdvertisements Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740321 29612 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740325 29612 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740329 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740333 29612 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740336 29612 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740340 29612 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740343 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740347 29612 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740352 29612 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 12:09:52.741055 master-0 kubenswrapper[29612]: W0319 12:09:52.740357 29612 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740361 29612 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740365 29612 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740368 29612 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740372 29612 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740376 29612 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740379 29612 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740383 29612 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740387 29612 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740391 29612 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740394 29612 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740399 29612 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740402 29612 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 
12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740406 29612 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740409 29612 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740413 29612 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740416 29612 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740420 29612 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740424 29612 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740427 29612 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 12:09:52.741568 master-0 kubenswrapper[29612]: W0319 12:09:52.740431 29612 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: W0319 12:09:52.740434 29612 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: W0319 12:09:52.740438 29612 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: W0319 12:09:52.740442 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: W0319 12:09:52.740446 29612 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: W0319 12:09:52.740450 29612 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 12:09:52.742121 master-0 
kubenswrapper[29612]: I0319 12:09:52.740539 29612 flags.go:64] FLAG: --address="0.0.0.0" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740550 29612 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740577 29612 flags.go:64] FLAG: --anonymous-auth="true" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740584 29612 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740589 29612 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740594 29612 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740600 29612 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740606 29612 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740610 29612 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740614 29612 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740620 29612 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740625 29612 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740650 29612 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740655 29612 flags.go:64] FLAG: --cgroup-root="" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740660 29612 flags.go:64] 
FLAG: --cgroups-per-qos="true" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740664 29612 flags.go:64] FLAG: --client-ca-file="" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740668 29612 flags.go:64] FLAG: --cloud-config="" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740673 29612 flags.go:64] FLAG: --cloud-provider="" Mar 19 12:09:52.742121 master-0 kubenswrapper[29612]: I0319 12:09:52.740678 29612 flags.go:64] FLAG: --cluster-dns="[]" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740683 29612 flags.go:64] FLAG: --cluster-domain="" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740688 29612 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740692 29612 flags.go:64] FLAG: --config-dir="" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740696 29612 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740701 29612 flags.go:64] FLAG: --container-log-max-files="5" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740706 29612 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740711 29612 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740715 29612 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740719 29612 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740723 29612 flags.go:64] FLAG: --contention-profiling="false" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740728 29612 flags.go:64] FLAG: 
--cpu-cfs-quota="true" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740733 29612 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740738 29612 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740742 29612 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740747 29612 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740751 29612 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740755 29612 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740760 29612 flags.go:64] FLAG: --enable-load-reader="false" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740764 29612 flags.go:64] FLAG: --enable-server="true" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740768 29612 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740775 29612 flags.go:64] FLAG: --event-burst="100" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740780 29612 flags.go:64] FLAG: --event-qps="50" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740783 29612 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740788 29612 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 19 12:09:52.742745 master-0 kubenswrapper[29612]: I0319 12:09:52.740792 29612 flags.go:64] FLAG: --eviction-hard="" Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740797 29612 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" 
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740802 29612 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740806 29612 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740810 29612 flags.go:64] FLAG: --eviction-soft=""
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740814 29612 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740834 29612 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740839 29612 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740844 29612 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740848 29612 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740852 29612 flags.go:64] FLAG: --fail-swap-on="true"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740867 29612 flags.go:64] FLAG: --feature-gates=""
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740879 29612 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740884 29612 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740889 29612 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740894 29612 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740899 29612 flags.go:64] FLAG: --healthz-port="10248"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740904 29612 flags.go:64] FLAG: --help="false"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740916 29612 flags.go:64] FLAG: --hostname-override=""
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740921 29612 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740927 29612 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740933 29612 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740939 29612 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740944 29612 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740950 29612 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 19 12:09:52.743585 master-0 kubenswrapper[29612]: I0319 12:09:52.740955 29612 flags.go:64] FLAG: --image-service-endpoint=""
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.740961 29612 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.740966 29612 flags.go:64] FLAG: --kube-api-burst="100"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.740972 29612 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.740978 29612 flags.go:64] FLAG: --kube-api-qps="50"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.740983 29612 flags.go:64] FLAG: --kube-reserved=""
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.740989 29612 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.740994 29612 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.740998 29612 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741003 29612 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741008 29612 flags.go:64] FLAG: --lock-file=""
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741013 29612 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741017 29612 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741022 29612 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741029 29612 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741033 29612 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741038 29612 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741042 29612 flags.go:64] FLAG: --logging-format="text"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741046 29612 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741051 29612 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741056 29612 flags.go:64] FLAG: --manifest-url=""
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741061 29612 flags.go:64] FLAG: --manifest-url-header=""
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741069 29612 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741074 29612 flags.go:64] FLAG: --max-open-files="1000000"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741080 29612 flags.go:64] FLAG: --max-pods="110"
Mar 19 12:09:52.744368 master-0 kubenswrapper[29612]: I0319 12:09:52.741087 29612 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741092 29612 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741097 29612 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741101 29612 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741105 29612 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741110 29612 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741114 29612 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741125 29612 flags.go:64] FLAG: --node-status-max-images="50"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741130 29612 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741134 29612 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741139 29612 flags.go:64] FLAG: --pod-cidr=""
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741143 29612 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:53d66d524ca3e787d8dbe30dbc4d9b8612c9cebd505ccb4375a8441814e85422"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741150 29612 flags.go:64] FLAG: --pod-manifest-path=""
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741155 29612 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741159 29612 flags.go:64] FLAG: --pods-per-core="0"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741164 29612 flags.go:64] FLAG: --port="10250"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741168 29612 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741172 29612 flags.go:64] FLAG: --provider-id=""
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741176 29612 flags.go:64] FLAG: --qos-reserved=""
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741181 29612 flags.go:64] FLAG: --read-only-port="10255"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741185 29612 flags.go:64] FLAG: --register-node="true"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741189 29612 flags.go:64] FLAG: --register-schedulable="true"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741193 29612 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 19 12:09:52.745159 master-0 kubenswrapper[29612]: I0319 12:09:52.741208 29612 flags.go:64] FLAG: --registry-burst="10"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741212 29612 flags.go:64] FLAG: --registry-qps="5"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741216 29612 flags.go:64] FLAG: --reserved-cpus=""
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741221 29612 flags.go:64] FLAG: --reserved-memory=""
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741227 29612 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741231 29612 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741236 29612 flags.go:64] FLAG: --rotate-certificates="false"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741241 29612 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741245 29612 flags.go:64] FLAG: --runonce="false"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741249 29612 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741255 29612 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741259 29612 flags.go:64] FLAG: --seccomp-default="false"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741264 29612 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741268 29612 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741272 29612 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741276 29612 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741281 29612 flags.go:64] FLAG: --storage-driver-password="root"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741285 29612 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741289 29612 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741294 29612 flags.go:64] FLAG: --storage-driver-user="root"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741297 29612 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741302 29612 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741306 29612 flags.go:64] FLAG: --system-cgroups=""
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741310 29612 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741317 29612 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 19 12:09:52.746322 master-0 kubenswrapper[29612]: I0319 12:09:52.741321 29612 flags.go:64] FLAG: --tls-cert-file=""
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741325 29612 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741330 29612 flags.go:64] FLAG: --tls-min-version=""
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741335 29612 flags.go:64] FLAG: --tls-private-key-file=""
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741339 29612 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741343 29612 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741347 29612 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741351 29612 flags.go:64] FLAG: --v="2"
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741357 29612 flags.go:64] FLAG: --version="false"
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741363 29612 flags.go:64] FLAG: --vmodule=""
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741368 29612 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: I0319 12:09:52.741372 29612 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.743960 29612 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.743993 29612 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.743998 29612 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.744003 29612 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.744008 29612 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.744012 29612 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.744016 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.744022 29612 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.744026 29612 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.744030 29612 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.744034 29612 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 12:09:52.747061 master-0 kubenswrapper[29612]: W0319 12:09:52.744037 29612 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744041 29612 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744044 29612 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744048 29612 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744052 29612 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744055 29612 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744062 29612 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744070 29612 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744074 29612 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744078 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744082 29612 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744085 29612 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744089 29612 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744093 29612 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744097 29612 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744101 29612 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744106 29612 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744110 29612 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744114 29612 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 12:09:52.747908 master-0 kubenswrapper[29612]: W0319 12:09:52.744118 29612 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744124 29612 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744128 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744133 29612 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744137 29612 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744141 29612 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744145 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744149 29612 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744153 29612 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744156 29612 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744160 29612 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744164 29612 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744168 29612 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744171 29612 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744175 29612 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744178 29612 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744183 29612 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744188 29612 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744192 29612 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 12:09:52.748710 master-0 kubenswrapper[29612]: W0319 12:09:52.744196 29612 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744200 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744205 29612 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744208 29612 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744212 29612 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744215 29612 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744219 29612 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744223 29612 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744226 29612 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744231 29612 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744235 29612 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744239 29612 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744242 29612 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744247 29612 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744251 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744254 29612 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744258 29612 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744262 29612 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744266 29612 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744269 29612 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744273 29612 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 12:09:52.749394 master-0 kubenswrapper[29612]: W0319 12:09:52.744276 29612 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.744280 29612 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: I0319 12:09:52.744287 29612 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: I0319 12:09:52.748711 29612 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: I0319 12:09:52.748735 29612 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748797 29612 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748804 29612 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748810 29612 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748815 29612 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748819 29612 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748860 29612 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748866 29612 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748872 29612 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748877 29612 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748882 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 19 12:09:52.750218 master-0 kubenswrapper[29612]: W0319 12:09:52.748885 29612 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748890 29612 feature_gate.go:330] unrecognized feature gate: Example
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748893 29612 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748897 29612 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748901 29612 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748904 29612 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748908 29612 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748912 29612 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748918 29612 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748924 29612 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748928 29612 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748932 29612 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748936 29612 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748941 29612 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748946 29612 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748950 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748954 29612 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748958 29612 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748962 29612 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 19 12:09:52.750728 master-0 kubenswrapper[29612]: W0319 12:09:52.748966 29612 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.748970 29612 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.748975 29612 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.748979 29612 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.748983 29612 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.748987 29612 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.748991 29612 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.748996 29612 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749000 29612 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749004 29612 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749007 29612 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749011 29612 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749015 29612 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749019 29612 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749023 29612 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749028 29612 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749032 29612 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749037 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749041 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749044 29612 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 19 12:09:52.751370 master-0 kubenswrapper[29612]: W0319 12:09:52.749048 29612 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749052 29612 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749056 29612 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749074 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749079 29612 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749083 29612 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749088 29612 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749092 29612 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749096 29612 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749100 29612 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749104 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749108 29612 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749112 29612 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749116 29612 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749120 29612 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749124 29612 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749128 29612 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749132 29612 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749135 29612 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749139 29612 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 12:09:52.751883 master-0 kubenswrapper[29612]: W0319 12:09:52.749143 29612 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749147 29612 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749152 29612 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: I0319 12:09:52.749160 29612 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749281 29612 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749288 29612 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749293 29612 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749297 29612 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749302 29612 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749307 29612 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749311 29612 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749315 29612 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749320 29612 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749323 29612 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 12:09:52.752465 master-0 kubenswrapper[29612]: W0319 12:09:52.749327 29612 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749331 29612 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749335 29612 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749340 29612 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749345 29612 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749349 29612 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749352 29612 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749357 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749361 29612 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749365 29612 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749369 29612 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749373 29612 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749377 29612 feature_gate.go:330] unrecognized feature gate: Example Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749381 29612 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749385 29612 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749389 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749393 29612 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 12:09:52.752854 master-0 
kubenswrapper[29612]: W0319 12:09:52.749397 29612 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749402 29612 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749406 29612 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 12:09:52.752854 master-0 kubenswrapper[29612]: W0319 12:09:52.749410 29612 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749415 29612 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749419 29612 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749423 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749427 29612 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749432 29612 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749436 29612 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749441 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749444 29612 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749448 29612 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: 
W0319 12:09:52.749452 29612 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749456 29612 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749460 29612 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749464 29612 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749468 29612 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749472 29612 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749476 29612 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749479 29612 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749483 29612 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749487 29612 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 12:09:52.753363 master-0 kubenswrapper[29612]: W0319 12:09:52.749491 29612 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749496 29612 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749500 29612 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749504 29612 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749509 29612 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749513 29612 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749517 29612 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749520 29612 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749524 29612 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749528 29612 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749532 29612 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749536 29612 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749540 29612 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749543 29612 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749547 29612 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 12:09:52.753866 master-0 
kubenswrapper[29612]: W0319 12:09:52.749551 29612 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749555 29612 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749560 29612 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749564 29612 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 12:09:52.753866 master-0 kubenswrapper[29612]: W0319 12:09:52.749569 29612 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: W0319 12:09:52.749573 29612 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: W0319 12:09:52.749577 29612 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: I0319 12:09:52.749583 29612 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: I0319 12:09:52.749746 29612 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: I0319 12:09:52.751339 29612 bootstrap.go:85] "Current kubeconfig file contents are still 
valid, no bootstrap necessary" Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: I0319 12:09:52.751409 29612 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: I0319 12:09:52.751595 29612 server.go:997] "Starting client certificate rotation" Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: I0319 12:09:52.751607 29612 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: I0319 12:09:52.751784 29612 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-20 11:43:57 +0000 UTC, rotation deadline is 2026-03-20 08:37:06.197908128 +0000 UTC Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: I0319 12:09:52.751831 29612 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h27m13.44607938s for next certificate rotation Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: I0319 12:09:52.752282 29612 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 12:09:52.755220 master-0 kubenswrapper[29612]: I0319 12:09:52.753626 29612 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 12:09:52.756063 master-0 kubenswrapper[29612]: I0319 12:09:52.755569 29612 log.go:25] "Validated CRI v1 runtime API" Mar 19 12:09:52.759102 master-0 kubenswrapper[29612]: I0319 12:09:52.759073 29612 log.go:25] "Validated CRI v1 image API" Mar 19 12:09:52.762026 master-0 kubenswrapper[29612]: I0319 12:09:52.760824 29612 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 19 12:09:52.771490 master-0 kubenswrapper[29612]: I0319 12:09:52.771405 29612 fs.go:135] Filesystem UUIDs: 
map[2be6375c-780c-4b64-a018-3b2c170f9318:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Mar 19 12:09:52.772879 master-0 kubenswrapper[29612]: I0319 12:09:52.771466 29612 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/05ec088c819ae1eeb5148aff6d70b93f3b441541316091762e2b98d617be204f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/05ec088c819ae1eeb5148aff6d70b93f3b441541316091762e2b98d617be204f/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/088cbe88ab52727a6e61abb80a2ff6d59de2b9b87fe822e87c7612a6a67923bc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/088cbe88ab52727a6e61abb80a2ff6d59de2b9b87fe822e87c7612a6a67923bc/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0c58fd5ea7dfc487f516ab9f02025ffdb09d90a9ab5d62ff3f2b99a062549935/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0c58fd5ea7dfc487f516ab9f02025ffdb09d90a9ab5d62ff3f2b99a062549935/userdata/shm major:0 minor:851 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0dda2db92925393742e73de4713061d6f57b6007ca6270c56cb0bd0c372a97ea/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0dda2db92925393742e73de4713061d6f57b6007ca6270c56cb0bd0c372a97ea/userdata/shm major:0 minor:937 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/107d14368724098585e35bbcdeba06ce6763b263ff68a1c0db81b1b11f0c59b6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/107d14368724098585e35bbcdeba06ce6763b263ff68a1c0db81b1b11f0c59b6/userdata/shm major:0 minor:648 fsType:tmpfs 
blockSize:0} /run/containers/storage/overlay-containers/153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e/userdata/shm major:0 minor:246 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/166b56b9c6c1434919d567d8df963a4c549223c203f79070beac456f5b57a166/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/166b56b9c6c1434919d567d8df963a4c549223c203f79070beac456f5b57a166/userdata/shm major:0 minor:100 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/27bfc2e7edbcc6c9da4d8e8c7defa74d634ff9325db83c99710558275c0105da/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/27bfc2e7edbcc6c9da4d8e8c7defa74d634ff9325db83c99710558275c0105da/userdata/shm major:0 minor:583 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/28c9e269179e8f2b2c8470e0dc5902c0174b162f0c6e40253ff15a0c8e960364/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/28c9e269179e8f2b2c8470e0dc5902c0174b162f0c6e40253ff15a0c8e960364/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/294f59c88ee9d0ef672e140bfaeda4c298629a073fa82f34bad6f87890b02c82/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/294f59c88ee9d0ef672e140bfaeda4c298629a073fa82f34bad6f87890b02c82/userdata/shm major:0 minor:788 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/2ccf8a3de86952995f531b01f4ffdc152c7bc1633eadc15477217fefaa028047/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2ccf8a3de86952995f531b01f4ffdc152c7bc1633eadc15477217fefaa028047/userdata/shm major:0 minor:848 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2eac945a64b009901141c6c6e11ad1de23d05d8f601554a1ef6d7d7a8ce9169c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2eac945a64b009901141c6c6e11ad1de23d05d8f601554a1ef6d7d7a8ce9169c/userdata/shm major:0 minor:1075 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3e041b219895e4a22c84b8979a6ff95eaead826c80bb84644fce5e3855731eba/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3e041b219895e4a22c84b8979a6ff95eaead826c80bb84644fce5e3855731eba/userdata/shm major:0 minor:411 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/40169c23a8e39dfd514592b517b187a7da1bb95cbece39d06e03420e9e9f21b3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/40169c23a8e39dfd514592b517b187a7da1bb95cbece39d06e03420e9e9f21b3/userdata/shm major:0 minor:1029 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/414a48d3cf2977fd972a56aeae2697291bc4403c105ce776bfadef125a18facb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/414a48d3cf2977fd972a56aeae2697291bc4403c105ce776bfadef125a18facb/userdata/shm major:0 minor:846 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/444bcf401c29be212027c38d60d3155ce14ee9b37c07f012dc81ebccfc8c5a7d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/444bcf401c29be212027c38d60d3155ce14ee9b37c07f012dc81ebccfc8c5a7d/userdata/shm major:0 minor:1027 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/46de87bd542fc48b4c2304420a056b910916afb6d58cd22606b6fcceab380367/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/46de87bd542fc48b4c2304420a056b910916afb6d58cd22606b6fcceab380367/userdata/shm major:0 minor:267 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b/userdata/shm major:0 minor:104 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4cea9c5260823fbd81d53a50c86a7838135f8c5ff405a05f4d12cdbb4fd9069e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4cea9c5260823fbd81d53a50c86a7838135f8c5ff405a05f4d12cdbb4fd9069e/userdata/shm major:0 minor:280 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4daf44a10b1f8c76d5c5e205f3a48dbc84e86cc93251b0495b68e80ef60f9d35/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4daf44a10b1f8c76d5c5e205f3a48dbc84e86cc93251b0495b68e80ef60f9d35/userdata/shm major:0 minor:1164 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4f20fa38da6728fdc70e1b66426e311b675e090aa8e87afaac1c37a5210cd1c8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4f20fa38da6728fdc70e1b66426e311b675e090aa8e87afaac1c37a5210cd1c8/userdata/shm major:0 minor:816 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/514e629483d9ce3a471489bf8ebd7e39de0e8072f7a78d8b93a7ef5b4897dfbc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/514e629483d9ce3a471489bf8ebd7e39de0e8072f7a78d8b93a7ef5b4897dfbc/userdata/shm major:0 minor:544 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5bea1d036c9faad13c1d14003487235e6d8ac60afd384e78842d83618960ce00/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5bea1d036c9faad13c1d14003487235e6d8ac60afd384e78842d83618960ce00/userdata/shm major:0 minor:529 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/64f8ff5be7bb724395a941290a38553d4d3a994e6e3322cf64e60c9441115924/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/64f8ff5be7bb724395a941290a38553d4d3a994e6e3322cf64e60c9441115924/userdata/shm major:0 minor:1123 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/67caa8f56a1602e423b6488bd40c30339516056ecee5a3793a01c82042f70327/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/67caa8f56a1602e423b6488bd40c30339516056ecee5a3793a01c82042f70327/userdata/shm major:0 minor:952 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6b5887fbc5e32ebc2f94544f4a59e3db7e9426484412d0d7e1662fd8761f75d2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6b5887fbc5e32ebc2f94544f4a59e3db7e9426484412d0d7e1662fd8761f75d2/userdata/shm major:0 minor:371 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6f9d7a4e397f58614971bf45c37c0758f1dd839eda26eeb6acda239cb4547bd3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6f9d7a4e397f58614971bf45c37c0758f1dd839eda26eeb6acda239cb4547bd3/userdata/shm major:0 minor:465 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/731fd6a0a7a380afb3c5fb06f8e5e559e768934564ab7fa113864f10ea901855/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/731fd6a0a7a380afb3c5fb06f8e5e559e768934564ab7fa113864f10ea901855/userdata/shm major:0 minor:368 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7478a36d7c10b347071336898ec1420b6e593642e73df5491b35d12f5dad7fb8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7478a36d7c10b347071336898ec1420b6e593642e73df5491b35d12f5dad7fb8/userdata/shm major:0 minor:240 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/768bbdfdb22a3faa73743433476fcca355077a35b64380dacae14d6ec58d72ab/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/768bbdfdb22a3faa73743433476fcca355077a35b64380dacae14d6ec58d72ab/userdata/shm major:0 minor:487 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/76d682fd002c8935fad25bd3dbe4a20bd96e06027fc3726ac0709746cb6820f3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/76d682fd002c8935fad25bd3dbe4a20bd96e06027fc3726ac0709746cb6820f3/userdata/shm major:0 minor:519 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43/userdata/shm major:0 minor:250 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7ccb1bded92273b709b711951d032cf4685c9f074ee7f4072031bba69f2eeb3f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7ccb1bded92273b709b711951d032cf4685c9f074ee7f4072031bba69f2eeb3f/userdata/shm major:0 minor:860 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/854a387c5ef79581e793cea9af478d1025e855bc5db72359436a67d2aec956ce/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/854a387c5ef79581e793cea9af478d1025e855bc5db72359436a67d2aec956ce/userdata/shm major:0 minor:850 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8db1413c53086a5c6db2b83441d84f9f315ce4cbb909fde99b5104ffbf2df8f1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8db1413c53086a5c6db2b83441d84f9f315ce4cbb909fde99b5104ffbf2df8f1/userdata/shm major:0 minor:471 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8dd5c45a98fabae2b43541f63682ae158d8af993ab1c5516b535075299ccb403/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8dd5c45a98fabae2b43541f63682ae158d8af993ab1c5516b535075299ccb403/userdata/shm major:0 minor:84 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/903b634267c4d03a793b4b69c05dfc5d8a92f5023abbef9a61cb574e83dc2bee/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/903b634267c4d03a793b4b69c05dfc5d8a92f5023abbef9a61cb574e83dc2bee/userdata/shm major:0 minor:94 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9246f0557504641b26756d5dcc3facbda0dfa6d1148c91b577840d8631f86b2e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9246f0557504641b26756d5dcc3facbda0dfa6d1148c91b577840d8631f86b2e/userdata/shm major:0 minor:243 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9bc0a922f5672d138d9b5c38908ff9271f1747dee23c3b328889da1f8a25634f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9bc0a922f5672d138d9b5c38908ff9271f1747dee23c3b328889da1f8a25634f/userdata/shm major:0 minor:587 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9dd3eeed986a63782457294b98f428fdf5ac5bd98b40cab991676824cbd680b6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9dd3eeed986a63782457294b98f428fdf5ac5bd98b40cab991676824cbd680b6/userdata/shm major:0 minor:109 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9f7852cce265f4731f8e5a50d5da937e705ba1159369968578b0f921316baf94/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9f7852cce265f4731f8e5a50d5da937e705ba1159369968578b0f921316baf94/userdata/shm major:0 minor:1105 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a0c37fe14e6ead412d8c2590ada12be7a2da79d1d1f7b5edc5d8494f1a42a084/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a0c37fe14e6ead412d8c2590ada12be7a2da79d1d1f7b5edc5d8494f1a42a084/userdata/shm major:0 minor:1063 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a297c2fb1d857d07bfeb6ac6ca57bb65e15ebd85864af6b5b40028961499f08b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a297c2fb1d857d07bfeb6ac6ca57bb65e15ebd85864af6b5b40028961499f08b/userdata/shm major:0 minor:589 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a77515f16c17046f407c9f07e5609964e9814a68e8f45a75bb351406b4dcfc20/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a77515f16c17046f407c9f07e5609964e9814a68e8f45a75bb351406b4dcfc20/userdata/shm major:0 minor:705 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/a938fec278d403193e581844106b17006b55efc9ca616fe3e7d8c448d817c442/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a938fec278d403193e581844106b17006b55efc9ca616fe3e7d8c448d817c442/userdata/shm major:0 minor:857 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a99d279c1652e2b6be126b985d6ebabfca8ecedc501010b1c82126f8611b9e4b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a99d279c1652e2b6be126b985d6ebabfca8ecedc501010b1c82126f8611b9e4b/userdata/shm major:0 minor:854 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b179219ef24884bfcb110bf155456a2ca0b90c733780b9ebce585f74be18971b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b179219ef24884bfcb110bf155456a2ca0b90c733780b9ebce585f74be18971b/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b1eb508b26c0e3a04c674bb8b166ab63a387c4e22fe8322ed85b0094f61f22e8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b1eb508b26c0e3a04c674bb8b166ab63a387c4e22fe8322ed85b0094f61f22e8/userdata/shm major:0 minor:438 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b8b47e7cae81e2e8fc5d1fdd94e1150be80c99269a023622d4d29934167c52d2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b8b47e7cae81e2e8fc5d1fdd94e1150be80c99269a023622d4d29934167c52d2/userdata/shm major:0 minor:475 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c30aed6efb2687e8c9c0787c5b85c3ecc5ac073733a122b5233bd897825e7a16/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c30aed6efb2687e8c9c0787c5b85c3ecc5ac073733a122b5233bd897825e7a16/userdata/shm major:0 minor:1100 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c88934bb050c8e715da2068469a1a7b6e480c5b3698e644706a803df72d12f5f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c88934bb050c8e715da2068469a1a7b6e480c5b3698e644706a803df72d12f5f/userdata/shm major:0 minor:480 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cad025d780875a4ff5ddd5b35a2f470eaade918be2cd73b0ab7727feccc93241/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cad025d780875a4ff5ddd5b35a2f470eaade918be2cd73b0ab7727feccc93241/userdata/shm major:0 minor:679 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cd89b54332e25652976ac12f30441cd06118b9d3055fbb57b6418f135e33839a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cd89b54332e25652976ac12f30441cd06118b9d3055fbb57b6418f135e33839a/userdata/shm major:0 minor:469 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ce92bb2deaa819fea113e3e46f2d6ce13e1b77b5e6309af9c4cd6c384874e217/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ce92bb2deaa819fea113e3e46f2d6ce13e1b77b5e6309af9c4cd6c384874e217/userdata/shm major:0 minor:448 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d0c18b672eeccf56227de74bfddbdb7bc277e8432c83dff62ae2a878becfbd41/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d0c18b672eeccf56227de74bfddbdb7bc277e8432c83dff62ae2a878becfbd41/userdata/shm major:0 minor:255 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/d318202a66572196e0681affd0ab554e8fa6ff748647c2d4a3aedba6fee2ec96/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d318202a66572196e0681affd0ab554e8fa6ff748647c2d4a3aedba6fee2ec96/userdata/shm major:0 minor:770 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d34c10b1c43979eaba12a772c86e5b04cdb550f06f5159184d7c1b781b1802f8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d34c10b1c43979eaba12a772c86e5b04cdb550f06f5159184d7c1b781b1802f8/userdata/shm major:0 minor:840 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d405214a0101e6f5383ea0ec9dd8d0328f92f63d0275413db715a759bd43a1b8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d405214a0101e6f5383ea0ec9dd8d0328f92f63d0275413db715a759bd43a1b8/userdata/shm major:0 minor:476 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d8f0e6a1ec121bbc7ed73597f1167b65da5617d6a824495c565577e16dc06d7d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d8f0e6a1ec121bbc7ed73597f1167b65da5617d6a824495c565577e16dc06d7d/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d9bb5c5486d0d4e5249742ebeb70024222a47fabc60cb40b1224fe9ecd95ab7a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d9bb5c5486d0d4e5249742ebeb70024222a47fabc60cb40b1224fe9ecd95ab7a/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/da05a78204256ccd4edc262bb9b930e6804a73e9ac544e9244d9c5d71173e2a9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/da05a78204256ccd4edc262bb9b930e6804a73e9ac544e9244d9c5d71173e2a9/userdata/shm major:0 minor:435 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e3d494447990668745e14695bfe1a5e74ab5c046360e769658a99bdf18dabcb5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e3d494447990668745e14695bfe1a5e74ab5c046360e769658a99bdf18dabcb5/userdata/shm major:0 minor:1021 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e4eb5ff21caa542685bdf5f202cb97c48c846bd09976e5d64727ae00a75371c6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e4eb5ff21caa542685bdf5f202cb97c48c846bd09976e5d64727ae00a75371c6/userdata/shm major:0 minor:1025 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e5b8532c133a52bef4f3c34e53a76d2b97e7bbb39e5937c556c2dbe0cf940780/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e5b8532c133a52bef4f3c34e53a76d2b97e7bbb39e5937c556c2dbe0cf940780/userdata/shm major:0 minor:474 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ea9e9d1680b5aa838fb75a46daa08b37fa0baf3eb1a7aa70313d288a46e0ae6c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ea9e9d1680b5aa838fb75a46daa08b37fa0baf3eb1a7aa70313d288a46e0ae6c/userdata/shm major:0 minor:464 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ef597ecc3cb239a3f4a10a1a4b99299b44a320ef76f5f894771c1233b3937cda/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ef597ecc3cb239a3f4a10a1a4b99299b44a320ef76f5f894771c1233b3937cda/userdata/shm major:0 minor:309 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f233927f2993af718960afb2ae363b672383e871d2174bdf0f0ce3e81e24474f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f233927f2993af718960afb2ae363b672383e871d2174bdf0f0ce3e81e24474f/userdata/shm major:0 minor:853 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc/userdata/shm major:0 minor:264 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fb94aad616f605f0c5b2b89174012856582d923dd7eb37448d29cf2ee0f920af/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fb94aad616f605f0c5b2b89174012856582d923dd7eb37448d29cf2ee0f920af/userdata/shm major:0 minor:479 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768/userdata/shm major:0 minor:282 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fed06e56f643611f5a7965ed4f8766bb3618f49ec311e6c29d1c5641de5af6fe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fed06e56f643611f5a7965ed4f8766bb3618f49ec311e6c29d1c5641de5af6fe/userdata/shm major:0 minor:689 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ff5132ef0b49543e2892994f5f8254a3ce97fc30f20a77cbc390ce279601737b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ff5132ef0b49543e2892994f5f8254a3ce97fc30f20a77cbc390ce279601737b/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/003e840c-1b07-4624-8225-e4ffdab74213/volumes/kubernetes.io~projected/kube-api-access-hmbbm:{mountpoint:/var/lib/kubelet/pods/003e840c-1b07-4624-8225-e4ffdab74213/volumes/kubernetes.io~projected/kube-api-access-hmbbm major:0 minor:655 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/kube-api-access-bwp24:{mountpoint:/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/kube-api-access-bwp24 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:453 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/07eef031-edcc-415c-91d7-f9f3bd0a45e3/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/07eef031-edcc-415c-91d7-f9f3bd0a45e3/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:701 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/07eef031-edcc-415c-91d7-f9f3bd0a45e3/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/07eef031-edcc-415c-91d7-f9f3bd0a45e3/volumes/kubernetes.io~empty-dir/tmp major:0 minor:556 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/07eef031-edcc-415c-91d7-f9f3bd0a45e3/volumes/kubernetes.io~projected/kube-api-access-6frf6:{mountpoint:/var/lib/kubelet/pods/07eef031-edcc-415c-91d7-f9f3bd0a45e3/volumes/kubernetes.io~projected/kube-api-access-6frf6 major:0 minor:683 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/09282e98-3140-41c9-863f-67e0e9f942cc/volumes/kubernetes.io~projected/kube-api-access-x9hnq:{mountpoint:/var/lib/kubelet/pods/09282e98-3140-41c9-863f-67e0e9f942cc/volumes/kubernetes.io~projected/kube-api-access-x9hnq major:0 minor:1099 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/09282e98-3140-41c9-863f-67e0e9f942cc/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/09282e98-3140-41c9-863f-67e0e9f942cc/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config major:0 minor:1092 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/09282e98-3140-41c9-863f-67e0e9f942cc/volumes/kubernetes.io~secret/openshift-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/09282e98-3140-41c9-863f-67e0e9f942cc/volumes/kubernetes.io~secret/openshift-state-metrics-tls major:0 minor:1096 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0c804ea0-3483-4506-b138-3bf30016d8d8/volumes/kubernetes.io~projected/kube-api-access-pxjkt:{mountpoint:/var/lib/kubelet/pods/0c804ea0-3483-4506-b138-3bf30016d8d8/volumes/kubernetes.io~projected/kube-api-access-pxjkt major:0 minor:367 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0f2278c1-05cf-469c-a13c-762b7a790b38/volumes/kubernetes.io~projected/kube-api-access-rztqj:{mountpoint:/var/lib/kubelet/pods/0f2278c1-05cf-469c-a13c-762b7a790b38/volumes/kubernetes.io~projected/kube-api-access-rztqj major:0 minor:835 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0f2278c1-05cf-469c-a13c-762b7a790b38/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/0f2278c1-05cf-469c-a13c-762b7a790b38/volumes/kubernetes.io~secret/cert major:0 minor:828 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc/volumes/kubernetes.io~projected/kube-api-access-9kf8c:{mountpoint:/var/lib/kubelet/pods/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc/volumes/kubernetes.io~projected/kube-api-access-9kf8c major:0 minor:95 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc/volumes/kubernetes.io~projected/kube-api-access-nwmkf:{mountpoint:/var/lib/kubelet/pods/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc/volumes/kubernetes.io~projected/kube-api-access-nwmkf major:0 minor:528 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc/volumes/kubernetes.io~secret/serving-cert major:0 minor:561 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/131e2573-ef74-4963-bdcf-901310e7a486/volumes/kubernetes.io~projected/kube-api-access-4sprd:{mountpoint:/var/lib/kubelet/pods/131e2573-ef74-4963-bdcf-901310e7a486/volumes/kubernetes.io~projected/kube-api-access-4sprd major:0 minor:248 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/131e2573-ef74-4963-bdcf-901310e7a486/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/131e2573-ef74-4963-bdcf-901310e7a486/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:461 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~projected/kube-api-access-h95pm:{mountpoint:/var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~projected/kube-api-access-h95pm major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6/volumes/kubernetes.io~projected/kube-api-access-pxt2h:{mountpoint:/var/lib/kubelet/pods/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6/volumes/kubernetes.io~projected/kube-api-access-pxt2h major:0 minor:402 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6/volumes/kubernetes.io~secret/proxy-tls major:0 minor:258 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24bd6a9b-9934-4cf9-b591-1ab892c8014c/volumes/kubernetes.io~projected/kube-api-access-jd74j:{mountpoint:/var/lib/kubelet/pods/24bd6a9b-9934-4cf9-b591-1ab892c8014c/volumes/kubernetes.io~projected/kube-api-access-jd74j major:0 minor:434 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33f6e54f-a750-4c91-b4f5-049275d33cfb/volumes/kubernetes.io~projected/kube-api-access-f7qsg:{mountpoint:/var/lib/kubelet/pods/33f6e54f-a750-4c91-b4f5-049275d33cfb/volumes/kubernetes.io~projected/kube-api-access-f7qsg major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/33f6e54f-a750-4c91-b4f5-049275d33cfb/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/33f6e54f-a750-4c91-b4f5-049275d33cfb/volumes/kubernetes.io~secret/metrics-tls major:0 minor:458 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~projected/kube-api-access major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~secret/serving-cert major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39f479b3-45ff-43f9-ae85-083554a54918/volumes/kubernetes.io~projected/kube-api-access-79fmb:{mountpoint:/var/lib/kubelet/pods/39f479b3-45ff-43f9-ae85-083554a54918/volumes/kubernetes.io~projected/kube-api-access-79fmb major:0 minor:234 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/39f479b3-45ff-43f9-ae85-083554a54918/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/39f479b3-45ff-43f9-ae85-083554a54918/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:463 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3a185e40-e97b-49e9-aea3-5f170401a9e3/volumes/kubernetes.io~projected/kube-api-access-2v4xb:{mountpoint:/var/lib/kubelet/pods/3a185e40-e97b-49e9-aea3-5f170401a9e3/volumes/kubernetes.io~projected/kube-api-access-2v4xb major:0 minor:818 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3a185e40-e97b-49e9-aea3-5f170401a9e3/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/3a185e40-e97b-49e9-aea3-5f170401a9e3/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:815 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3e1b71a7-c9fd-4f03-8d0c-f092dac80989/volumes/kubernetes.io~projected/kube-api-access-mgr4k:{mountpoint:/var/lib/kubelet/pods/3e1b71a7-c9fd-4f03-8d0c-f092dac80989/volumes/kubernetes.io~projected/kube-api-access-mgr4k major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3e1b71a7-c9fd-4f03-8d0c-f092dac80989/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/3e1b71a7-c9fd-4f03-8d0c-f092dac80989/volumes/kubernetes.io~secret/metrics-certs major:0 minor:460 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~projected/kube-api-access-8cvpt:{mountpoint:/var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~projected/kube-api-access-8cvpt major:0 minor:257 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~secret/serving-cert major:0 minor:238 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~projected/kube-api-access-wgsm4:{mountpoint:/var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~projected/kube-api-access-wgsm4 major:0 minor:746 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~secret/default-certificate major:0 minor:744 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~secret/metrics-certs major:0 minor:656 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~secret/stats-auth major:0 minor:737 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~projected/kube-api-access-bssz9:{mountpoint:/var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~projected/kube-api-access-bssz9 major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~projected/kube-api-access-hpdhv:{mountpoint:/var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~projected/kube-api-access-hpdhv major:0 minor:1161 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~secret/client-ca-bundle:{mountpoint:/var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~secret/client-ca-bundle major:0 minor:1159 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~secret/secret-metrics-client-certs:{mountpoint:/var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~secret/secret-metrics-client-certs major:0 minor:1160 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~secret/secret-metrics-server-tls:{mountpoint:/var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~secret/secret-metrics-server-tls major:0 minor:1158 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ca74427-9df7-44ca-ae91-0162f402644b/volumes/kubernetes.io~projected/kube-api-access-69pp5:{mountpoint:/var/lib/kubelet/pods/4ca74427-9df7-44ca-ae91-0162f402644b/volumes/kubernetes.io~projected/kube-api-access-69pp5 major:0 minor:1104 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ca74427-9df7-44ca-ae91-0162f402644b/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/4ca74427-9df7-44ca-ae91-0162f402644b/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config major:0 minor:1102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4ca74427-9df7-44ca-ae91-0162f402644b/volumes/kubernetes.io~secret/kube-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/4ca74427-9df7-44ca-ae91-0162f402644b/volumes/kubernetes.io~secret/kube-state-metrics-tls major:0 minor:1103 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4d625e92-793e-4942-a826-d30d21e3ca0c/volumes/kubernetes.io~projected/kube-api-access-gczvq:{mountpoint:/var/lib/kubelet/pods/4d625e92-793e-4942-a826-d30d21e3ca0c/volumes/kubernetes.io~projected/kube-api-access-gczvq major:0 minor:450 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/4d790b42-2182-4b9c-8391-285e938715cc/volumes/kubernetes.io~projected/kube-api-access-zh5cs:{mountpoint:/var/lib/kubelet/pods/4d790b42-2182-4b9c-8391-285e938715cc/volumes/kubernetes.io~projected/kube-api-access-zh5cs major:0 minor:370 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4e7b04dc-e85d-41b8-959c-87b1b24731a1/volumes/kubernetes.io~projected/kube-api-access-9h4cz:{mountpoint:/var/lib/kubelet/pods/4e7b04dc-e85d-41b8-959c-87b1b24731a1/volumes/kubernetes.io~projected/kube-api-access-9h4cz major:0 minor:410 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4e7b04dc-e85d-41b8-959c-87b1b24731a1/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/4e7b04dc-e85d-41b8-959c-87b1b24731a1/volumes/kubernetes.io~secret/signing-key major:0 minor:405 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~projected/kube-api-access-f7dp9:{mountpoint:/var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~projected/kube-api-access-f7dp9 major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~projected/kube-api-access-4khlg:{mountpoint:/var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~projected/kube-api-access-4khlg major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~projected/kube-api-access-j8mds:{mountpoint:/var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~projected/kube-api-access-j8mds major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~secret/webhook-cert major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/57d19c68-b469-4a6f-abe0-3a0eff32639c/volumes/kubernetes.io~projected/kube-api-access-v8qjv:{mountpoint:/var/lib/kubelet/pods/57d19c68-b469-4a6f-abe0-3a0eff32639c/volumes/kubernetes.io~projected/kube-api-access-v8qjv major:0 minor:1020 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/57d19c68-b469-4a6f-abe0-3a0eff32639c/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/57d19c68-b469-4a6f-abe0-3a0eff32639c/volumes/kubernetes.io~secret/cert major:0 minor:1015 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58a4e3af-c5e1-4487-92a2-81dfb696695e/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/58a4e3af-c5e1-4487-92a2-81dfb696695e/volumes/kubernetes.io~projected/ca-certs major:0 minor:540 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58a4e3af-c5e1-4487-92a2-81dfb696695e/volumes/kubernetes.io~projected/kube-api-access-gjj2t:{mountpoint:/var/lib/kubelet/pods/58a4e3af-c5e1-4487-92a2-81dfb696695e/volumes/kubernetes.io~projected/kube-api-access-gjj2t major:0 minor:542 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58a4e3af-c5e1-4487-92a2-81dfb696695e/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/58a4e3af-c5e1-4487-92a2-81dfb696695e/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:541 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5b46c6b4-f701-4a7f-bf74-90afe14ad89a/volumes/kubernetes.io~projected/kube-api-access-s98qv:{mountpoint:/var/lib/kubelet/pods/5b46c6b4-f701-4a7f-bf74-90afe14ad89a/volumes/kubernetes.io~projected/kube-api-access-s98qv major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b46c6b4-f701-4a7f-bf74-90afe14ad89a/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/5b46c6b4-f701-4a7f-bf74-90afe14ad89a/volumes/kubernetes.io~secret/srv-cert major:0 minor:406 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f2d1d03-7427-45a0-a917-173bf9df9902/volumes/kubernetes.io~projected/kube-api-access-z7cz7:{mountpoint:/var/lib/kubelet/pods/5f2d1d03-7427-45a0-a917-173bf9df9902/volumes/kubernetes.io~projected/kube-api-access-z7cz7 major:0 minor:1098 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f2d1d03-7427-45a0-a917-173bf9df9902/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/5f2d1d03-7427-45a0-a917-173bf9df9902/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config major:0 minor:1097 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f2d1d03-7427-45a0-a917-173bf9df9902/volumes/kubernetes.io~secret/node-exporter-tls:{mountpoint:/var/lib/kubelet/pods/5f2d1d03-7427-45a0-a917-173bf9df9902/volumes/kubernetes.io~secret/node-exporter-tls major:0 minor:1117 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/60081e3b-4fe5-42a3-bfae-0b07853c0e36/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/60081e3b-4fe5-42a3-bfae-0b07853c0e36/volumes/kubernetes.io~projected/ca-certs major:0 minor:539 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/60081e3b-4fe5-42a3-bfae-0b07853c0e36/volumes/kubernetes.io~projected/kube-api-access-ctghm:{mountpoint:/var/lib/kubelet/pods/60081e3b-4fe5-42a3-bfae-0b07853c0e36/volumes/kubernetes.io~projected/kube-api-access-ctghm major:0 minor:543 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/63e81c6a-b723-4c9a-b471-226fefa9c7a7/volumes/kubernetes.io~projected/kube-api-access-r97ml:{mountpoint:/var/lib/kubelet/pods/63e81c6a-b723-4c9a-b471-226fefa9c7a7/volumes/kubernetes.io~projected/kube-api-access-r97ml major:0 minor:822 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/63e81c6a-b723-4c9a-b471-226fefa9c7a7/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/63e81c6a-b723-4c9a-b471-226fefa9c7a7/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:820 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/64136215-8539-4349-977f-6d59deb132ea/volumes/kubernetes.io~projected/kube-api-access-cgtrh:{mountpoint:/var/lib/kubelet/pods/64136215-8539-4349-977f-6d59deb132ea/volumes/kubernetes.io~projected/kube-api-access-cgtrh major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/65b629a3-ea10-4517-b762-b18b22fab2a9/volumes/kubernetes.io~projected/kube-api-access-q9qkq:{mountpoint:/var/lib/kubelet/pods/65b629a3-ea10-4517-b762-b18b22fab2a9/volumes/kubernetes.io~projected/kube-api-access-q9qkq major:0 minor:304 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~projected/kube-api-access-8ntj9:{mountpoint:/var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~projected/kube-api-access-8ntj9 major:0 minor:444 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~secret/encryption-config major:0 minor:440 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~secret/etcd-client major:0 minor:442 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~secret/serving-cert major:0 minor:443 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~projected/kube-api-access-f99wt:{mountpoint:/var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~projected/kube-api-access-f99wt major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e8c4676-05af-4478-a44b-1bdcbefdea0f/volumes/kubernetes.io~projected/kube-api-access-pdmj8:{mountpoint:/var/lib/kubelet/pods/6e8c4676-05af-4478-a44b-1bdcbefdea0f/volumes/kubernetes.io~projected/kube-api-access-pdmj8 major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e8c4676-05af-4478-a44b-1bdcbefdea0f/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/6e8c4676-05af-4478-a44b-1bdcbefdea0f/volumes/kubernetes.io~secret/srv-cert major:0 minor:433 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~projected/kube-api-access-mm4kn:{mountpoint:/var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~projected/kube-api-access-mm4kn major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:457 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:456 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~projected/kube-api-access-2m4gg:{mountpoint:/var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~projected/kube-api-access-2m4gg major:0 minor:495 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~secret/encryption-config major:0 minor:494 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~secret/etcd-client major:0 minor:493 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~secret/serving-cert major:0 minor:492 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8117f6b0-0b9f-4a1d-9807-bf73c19f388b/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/8117f6b0-0b9f-4a1d-9807-bf73c19f388b/volumes/kubernetes.io~secret/tls-certificates major:0 minor:747 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/819c8a76-019b-4124-a67f-fb5838cbec5d/volumes/kubernetes.io~projected/kube-api-access-64sr2:{mountpoint:/var/lib/kubelet/pods/819c8a76-019b-4124-a67f-fb5838cbec5d/volumes/kubernetes.io~projected/kube-api-access-64sr2 major:0 minor:580 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/819c8a76-019b-4124-a67f-fb5838cbec5d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/819c8a76-019b-4124-a67f-fb5838cbec5d/volumes/kubernetes.io~secret/serving-cert major:0 minor:562 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a/volumes/kubernetes.io~projected/kube-api-access-v5cc5:{mountpoint:/var/lib/kubelet/pods/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a/volumes/kubernetes.io~projected/kube-api-access-v5cc5 major:0 minor:805 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:804 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~projected/kube-api-access-k9vzl:{mountpoint:/var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~projected/kube-api-access-k9vzl major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf/volumes/kubernetes.io~projected/kube-api-access-x9g5b:{mountpoint:/var/lib/kubelet/pods/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf/volumes/kubernetes.io~projected/kube-api-access-x9g5b major:0 minor:832 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf/volumes/kubernetes.io~secret/serving-cert major:0 minor:830 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0/volumes/kubernetes.io~projected/kube-api-access-rn94k:{mountpoint:/var/lib/kubelet/pods/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0/volumes/kubernetes.io~projected/kube-api-access-rn94k major:0 minor:936 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0/volumes/kubernetes.io~secret/proxy-tls major:0 minor:682 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/94fa5656-0561-45ab-9ab1-a6b96b8a47d4/volumes/kubernetes.io~projected/kube-api-access-m6rhj:{mountpoint:/var/lib/kubelet/pods/94fa5656-0561-45ab-9ab1-a6b96b8a47d4/volumes/kubernetes.io~projected/kube-api-access-m6rhj major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/94fa5656-0561-45ab-9ab1-a6b96b8a47d4/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/94fa5656-0561-45ab-9ab1-a6b96b8a47d4/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:761 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/99ffcc9a-4b01-439a-93e0-0674bdfd32ec/volumes/kubernetes.io~projected/kube-api-access-jd875:{mountpoint:/var/lib/kubelet/pods/99ffcc9a-4b01-439a-93e0-0674bdfd32ec/volumes/kubernetes.io~projected/kube-api-access-jd875 major:0 minor:266 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a49a8a5e-23fb-46fb-b184-e09da4817028/volumes/kubernetes.io~projected/kube-api-access-k5z4r:{mountpoint:/var/lib/kubelet/pods/a49a8a5e-23fb-46fb-b184-e09da4817028/volumes/kubernetes.io~projected/kube-api-access-k5z4r major:0 minor:1062 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a49a8a5e-23fb-46fb-b184-e09da4817028/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/a49a8a5e-23fb-46fb-b184-e09da4817028/volumes/kubernetes.io~secret/certs major:0 minor:1054 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a49a8a5e-23fb-46fb-b184-e09da4817028/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/a49a8a5e-23fb-46fb-b184-e09da4817028/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:1053 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a5cf6404-1640-43d3-8018-03c62c8338c5/volumes/kubernetes.io~projected/kube-api-access-b48q2:{mountpoint:/var/lib/kubelet/pods/a5cf6404-1640-43d3-8018-03c62c8338c5/volumes/kubernetes.io~projected/kube-api-access-b48q2 major:0 minor:696 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a5cf6404-1640-43d3-8018-03c62c8338c5/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/a5cf6404-1640-43d3-8018-03c62c8338c5/volumes/kubernetes.io~secret/metrics-tls major:0 minor:702 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a5d009a3-5ff2-49f9-9cea-004b99729b77/volumes/kubernetes.io~projected/kube-api-access-z4khz:{mountpoint:/var/lib/kubelet/pods/a5d009a3-5ff2-49f9-9cea-004b99729b77/volumes/kubernetes.io~projected/kube-api-access-z4khz major:0 minor:817 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a5d009a3-5ff2-49f9-9cea-004b99729b77/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/a5d009a3-5ff2-49f9-9cea-004b99729b77/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:814 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~projected/kube-api-access-scp9p:{mountpoint:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~projected/kube-api-access-scp9p major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/etcd-client major:0 minor:216 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~projected/kube-api-access major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~projected/kube-api-access-w4ms4:{mountpoint:/var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~projected/kube-api-access-w4ms4 major:0 minor:99 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b69a314d-4884-4f2c-a48e-1cf68c5aaef4/volumes/kubernetes.io~projected/kube-api-access-8765q:{mountpoint:/var/lib/kubelet/pods/b69a314d-4884-4f2c-a48e-1cf68c5aaef4/volumes/kubernetes.io~projected/kube-api-access-8765q major:0 minor:929 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b69a314d-4884-4f2c-a48e-1cf68c5aaef4/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/b69a314d-4884-4f2c-a48e-1cf68c5aaef4/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:924 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b69a314d-4884-4f2c-a48e-1cf68c5aaef4/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/b69a314d-4884-4f2c-a48e-1cf68c5aaef4/volumes/kubernetes.io~secret/webhook-cert major:0 minor:928 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b92c4c09-a796-4e00-821a-32fc9c8ca214/volumes/kubernetes.io~projected/kube-api-access-6h6qg:{mountpoint:/var/lib/kubelet/pods/b92c4c09-a796-4e00-821a-32fc9c8ca214/volumes/kubernetes.io~projected/kube-api-access-6h6qg major:0 minor:1074 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b92c4c09-a796-4e00-821a-32fc9c8ca214/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/b92c4c09-a796-4e00-821a-32fc9c8ca214/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config major:0 minor:1069 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b92c4c09-a796-4e00-821a-32fc9c8ca214/volumes/kubernetes.io~secret/prometheus-operator-tls:{mountpoint:/var/lib/kubelet/pods/b92c4c09-a796-4e00-821a-32fc9c8ca214/volumes/kubernetes.io~secret/prometheus-operator-tls major:0 minor:1073 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bcc25a4a-521a-4139-a10c-d5b8bba6e359/volumes/kubernetes.io~projected/kube-api-access-2vhrb:{mountpoint:/var/lib/kubelet/pods/bcc25a4a-521a-4139-a10c-d5b8bba6e359/volumes/kubernetes.io~projected/kube-api-access-2vhrb major:0 minor:845 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bcc25a4a-521a-4139-a10c-d5b8bba6e359/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/bcc25a4a-521a-4139-a10c-d5b8bba6e359/volumes/kubernetes.io~secret/webhook-certs major:0 minor:842 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~projected/kube-api-access major:0 minor:272 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~secret/serving-cert major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c479beae-6d34-4d49-81c8-39f6a53ff6ca/volumes/kubernetes.io~projected/kube-api-access-xxm2c:{mountpoint:/var/lib/kubelet/pods/c479beae-6d34-4d49-81c8-39f6a53ff6ca/volumes/kubernetes.io~projected/kube-api-access-xxm2c major:0 minor:112 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~projected/kube-api-access-rbwcn:{mountpoint:/var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~projected/kube-api-access-rbwcn major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0/volumes/kubernetes.io~projected/kube-api-access-rvp77:{mountpoint:/var/lib/kubelet/pods/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0/volumes/kubernetes.io~projected/kube-api-access-rvp77 major:0 minor:833 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:831 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d979be56-f948-4252-ab6b-9bfc48dc4187/volumes/kubernetes.io~projected/kube-api-access-w7bl2:{mountpoint:/var/lib/kubelet/pods/d979be56-f948-4252-ab6b-9bfc48dc4187/volumes/kubernetes.io~projected/kube-api-access-w7bl2 major:0 minor:749 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e47d3935-d96e-4fd2-abfc-99bff7eac5e0/volumes/kubernetes.io~projected/kube-api-access-t85jj:{mountpoint:/var/lib/kubelet/pods/e47d3935-d96e-4fd2-abfc-99bff7eac5e0/volumes/kubernetes.io~projected/kube-api-access-t85jj major:0 minor:951 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e47d3935-d96e-4fd2-abfc-99bff7eac5e0/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/e47d3935-d96e-4fd2-abfc-99bff7eac5e0/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:844 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~projected/kube-api-access-bptbc:{mountpoint:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~projected/kube-api-access-bptbc major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~projected/kube-api-access-r2knz:{mountpoint:/var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~projected/kube-api-access-r2knz major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~secret/cert major:0 minor:455 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:432 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/eb274dd5-e98e-4485-a050-7bec560a0daa/volumes/kubernetes.io~projected/kube-api-access-jsvpz:{mountpoint:/var/lib/kubelet/pods/eb274dd5-e98e-4485-a050-7bec560a0daa/volumes/kubernetes.io~projected/kube-api-access-jsvpz major:0 minor:827 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/eb274dd5-e98e-4485-a050-7bec560a0daa/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/eb274dd5-e98e-4485-a050-7bec560a0daa/volumes/kubernetes.io~secret/proxy-tls major:0 minor:821 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/kube-api-access-g4qb9:{mountpoint:/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/kube-api-access-g4qb9 major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~secret/metrics-tls major:0 minor:462 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f5148a50-4332-4ab4-af63-449ef7f16333/volumes/kubernetes.io~projected/kube-api-access-p9zjb:{mountpoint:/var/lib/kubelet/pods/f5148a50-4332-4ab4-af63-449ef7f16333/volumes/kubernetes.io~projected/kube-api-access-p9zjb major:0 minor:490 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f65be54c-d851-4daa-93a0-8a3947da5e26/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/f65be54c-d851-4daa-93a0-8a3947da5e26/volumes/kubernetes.io~projected/kube-api-access major:0 minor:326 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f65be54c-d851-4daa-93a0-8a3947da5e26/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f65be54c-d851-4daa-93a0-8a3947da5e26/volumes/kubernetes.io~secret/serving-cert major:0 minor:325 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f93ee7db-f0ed-470f-a458-86c54e6d064a/volumes/kubernetes.io~projected/kube-api-access-jstrq:{mountpoint:/var/lib/kubelet/pods/f93ee7db-f0ed-470f-a458-86c54e6d064a/volumes/kubernetes.io~projected/kube-api-access-jstrq major:0 minor:1019 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fbde1a69-b56c-467e-a291-9a83c448346f/volumes/kubernetes.io~projected/kube-api-access-j5f65:{mountpoint:/var/lib/kubelet/pods/fbde1a69-b56c-467e-a291-9a83c448346f/volumes/kubernetes.io~projected/kube-api-access-j5f65 major:0 minor:834 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fbde1a69-b56c-467e-a291-9a83c448346f/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/fbde1a69-b56c-467e-a291-9a83c448346f/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:829 fsType:tmpfs blockSize:0} overlay_0-1000:{mountpoint:/var/lib/containers/storage/overlay/04b9f43da28271b10ec4b1d0c94d4ad00050e432f1b52a64b8b4a63e7ed76f2d/merged major:0 minor:1000 fsType:overlay blockSize:0} overlay_0-1010:{mountpoint:/var/lib/containers/storage/overlay/92348cd935f9fd206a4ac92c5304b93c8123ee9714fc66531164fc438ab9762e/merged major:0 minor:1010 fsType:overlay blockSize:0} overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/d917e377248c30f62f7da9ba62c5fc95eb75fd02efc0e9ced1feff4e6935c296/merged major:0 minor:102 fsType:overlay blockSize:0} 
overlay_0-1023:{mountpoint:/var/lib/containers/storage/overlay/fe69647b8498ba9d6e749671e0d3cf3e7d1d100b97fa51f0978b40a62bb0487c/merged major:0 minor:1023 fsType:overlay blockSize:0} overlay_0-1031:{mountpoint:/var/lib/containers/storage/overlay/cb3f473d320081d015b36fe3aa075a73af630f26a0fef68c9b8be52e433dbf31/merged major:0 minor:1031 fsType:overlay blockSize:0} overlay_0-1033:{mountpoint:/var/lib/containers/storage/overlay/25fa534c6d4d0bee95d5ccbd0f5227d41f4a8ac9d0bfe22059a79cce68108309/merged major:0 minor:1033 fsType:overlay blockSize:0} overlay_0-1035:{mountpoint:/var/lib/containers/storage/overlay/f575fedc7dd87ab7b7e33fc43f4a8733dd5fcd37de1bd662654edc48bb73df65/merged major:0 minor:1035 fsType:overlay blockSize:0} overlay_0-1036:{mountpoint:/var/lib/containers/storage/overlay/c93d9fdf944add15cc0ca0e12529c5a50c54dd99d3916a5cd0be51c8a80bbb24/merged major:0 minor:1036 fsType:overlay blockSize:0} overlay_0-1038:{mountpoint:/var/lib/containers/storage/overlay/91f318f074fcbbc1a86cd812d1806f3a3d3f831d3de0caac14cc347c8d9ea9b7/merged major:0 minor:1038 fsType:overlay blockSize:0} overlay_0-1044:{mountpoint:/var/lib/containers/storage/overlay/3833fffad07c0f67dbb37e1c314bd9e999547ab535a778f3831d44db4b264611/merged major:0 minor:1044 fsType:overlay blockSize:0} overlay_0-1051:{mountpoint:/var/lib/containers/storage/overlay/879294a68e8812737a4cf1b860befd271ff38e8d2e0dc53d4db19fffa3bae026/merged major:0 minor:1051 fsType:overlay blockSize:0} overlay_0-106:{mountpoint:/var/lib/containers/storage/overlay/58c9a980766cae75ec98614bc5da33db1e02e496a0a4c2a5900718b4b9cf5fa0/merged major:0 minor:106 fsType:overlay blockSize:0} overlay_0-1065:{mountpoint:/var/lib/containers/storage/overlay/f606ad4334fb12546917687bbaf6afde6c406c346b249280a355b07880548120/merged major:0 minor:1065 fsType:overlay blockSize:0} overlay_0-1067:{mountpoint:/var/lib/containers/storage/overlay/83e29b71f7293ad60047d3d4ab324e56f577c4ac609ec8659817885817d268ee/merged major:0 minor:1067 fsType:overlay blockSize:0} 
overlay_0-1077:{mountpoint:/var/lib/containers/storage/overlay/50b72d3b12c009cc4435764124e7bdff459ba43e32944b543415293499b26dc6/merged major:0 minor:1077 fsType:overlay blockSize:0} overlay_0-1079:{mountpoint:/var/lib/containers/storage/overlay/3118edcd394f77d6ea4b95a431176b5c4df29a74e298181153a1312e20dbd995/merged major:0 minor:1079 fsType:overlay blockSize:0} overlay_0-1081:{mountpoint:/var/lib/containers/storage/overlay/d2597ba07186edda40caa401742f92a158faabe9d9748d29de3af432b7aae48d/merged major:0 minor:1081 fsType:overlay blockSize:0} overlay_0-1086:{mountpoint:/var/lib/containers/storage/overlay/9f16728b0431d9b3db6a3cb2540b1fd0d13993f9895a01fec69bfd7401ad57cb/merged major:0 minor:1086 fsType:overlay blockSize:0} overlay_0-110:{mountpoint:/var/lib/containers/storage/overlay/d7bf2c6bdb9caf036748211c2602e34322b65d26334b1622420475fe24e7ce04/merged major:0 minor:110 fsType:overlay blockSize:0} overlay_0-1107:{mountpoint:/var/lib/containers/storage/overlay/b12daddceff7c5c8aaacd46df1405c0f8ed9da06100e38fe642277b2e005204c/merged major:0 minor:1107 fsType:overlay blockSize:0} overlay_0-1109:{mountpoint:/var/lib/containers/storage/overlay/4dbd748cbe89ec804225a07a51574f5a95ccd6d87bfab911898c10b769496b13/merged major:0 minor:1109 fsType:overlay blockSize:0} overlay_0-1111:{mountpoint:/var/lib/containers/storage/overlay/0ac643adcd72566f10b6f435d8d5667754e136aa10d77c2d1176dc3df0252109/merged major:0 minor:1111 fsType:overlay blockSize:0} overlay_0-1121:{mountpoint:/var/lib/containers/storage/overlay/3b9c49b6db6e1a68ebdc86a98d6be23542661d7bbdbb3409cac0cc808fe7fc12/merged major:0 minor:1121 fsType:overlay blockSize:0} overlay_0-1125:{mountpoint:/var/lib/containers/storage/overlay/8b9c426de56ee1247a3af3e22cc4834c19d54256e40500412088638ec66d2d40/merged major:0 minor:1125 fsType:overlay blockSize:0} overlay_0-1127:{mountpoint:/var/lib/containers/storage/overlay/998d2fac59462bdac34934a8ee9ba73868a58ab599428a081e28310bc7237597/merged major:0 minor:1127 fsType:overlay blockSize:0} 
overlay_0-1129:{mountpoint:/var/lib/containers/storage/overlay/9a58406ea0e5316cc534d5a7a47d94cff395e7e5dc15d95e5f7ed92f716e859c/merged major:0 minor:1129 fsType:overlay blockSize:0} overlay_0-1131:{mountpoint:/var/lib/containers/storage/overlay/f2de0b0f1021e1edefbf81cd14688889f19c0a0981e8ff7148c2d690c0e6bc79/merged major:0 minor:1131 fsType:overlay blockSize:0} overlay_0-1134:{mountpoint:/var/lib/containers/storage/overlay/57ca3983049daee3cae9997f3d658b0b0ec618c0bcc0fcd12242ede22575f217/merged major:0 minor:1134 fsType:overlay blockSize:0} overlay_0-114:{mountpoint:/var/lib/containers/storage/overlay/fd35b566b11da65ad643429215ab8cecf8eca61495521fc99fd9762ae3fbf2b9/merged major:0 minor:114 fsType:overlay blockSize:0} overlay_0-1145:{mountpoint:/var/lib/containers/storage/overlay/f16906a1d2c9cf5e1b1ac8088aac93992671e8c69e786c8867c49635c0ff9bf4/merged major:0 minor:1145 fsType:overlay blockSize:0} overlay_0-1162:{mountpoint:/var/lib/containers/storage/overlay/e83cde721623f2b2867a80dbd712b83dca24eaea9196cc794d86050685c5eb84/merged major:0 minor:1162 fsType:overlay blockSize:0} overlay_0-1173:{mountpoint:/var/lib/containers/storage/overlay/c55cf874c6019b74d02504a5b47dd9f7c144d723a4a0970e0448df97f4e0896d/merged major:0 minor:1173 fsType:overlay blockSize:0} overlay_0-1175:{mountpoint:/var/lib/containers/storage/overlay/fe4566fdf8579a8be63cbe1cad404257b6d3c1f202ebbe067fed4b77b29884f4/merged major:0 minor:1175 fsType:overlay blockSize:0} overlay_0-1177:{mountpoint:/var/lib/containers/storage/overlay/2bcc440489014df6e1a6f59f48dd72d4a2371aa3b4c642685afb80463037706a/merged major:0 minor:1177 fsType:overlay blockSize:0} overlay_0-1186:{mountpoint:/var/lib/containers/storage/overlay/38d0bf8fcaa6d943206e82a28167af55614b8b3614932c37e4340a1b12adf01e/merged major:0 minor:1186 fsType:overlay blockSize:0} overlay_0-1188:{mountpoint:/var/lib/containers/storage/overlay/f172d03bdd1c34896bfa4271eea02ccea708ae8b9acc24755a37da2836ae1fec/merged major:0 minor:1188 fsType:overlay blockSize:0} 
overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/ae842a80668b607b2b1d8a6d646dba1b2ff61472b4fbceec573c1789ae7746c7/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-132:{mountpoint:/var/lib/containers/storage/overlay/c364db9b5ec2e23a7ca51fd1a45ca28b58092c3f75089af91fc2ea75e25c6865/merged major:0 minor:132 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/6a08985844c171dd83a137ccdcd2445ade44cb0466c3350af2729cbb0d36e5fe/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/4506c3c2463571055478e81b7e23c733eddc2a262491d1b8139b95082423d7cb/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/51bfdd0036c8f3b785db4a475c8b291d36eb4da16166e28fd94f78e03de156e4/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/0af64f21e44fbc8bc6cd6d0ffadeb1f71f78f1d33abfd3bb076167a88aa7c362/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/6b522640855d5949c032946e3f9e84fb20bfdf5e730db90d04aa19a775de4c1c/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-155:{mountpoint:/var/lib/containers/storage/overlay/1c44e7425e6e5c18defe18654248ff5e7415ef657cdd12c568f25c53aa43974b/merged major:0 minor:155 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/9c9660e4788d8dbd8141c36ead481b824e93a8a30d42d08de9d647a8159f36c4/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-161:{mountpoint:/var/lib/containers/storage/overlay/f80b92ba257e5cfd960adfdfbe2c695a4ac6d59449e0b7426565a4a9026b7d94/merged major:0 minor:161 fsType:overlay blockSize:0} overlay_0-163:{mountpoint:/var/lib/containers/storage/overlay/4dd02f3ac1d9e08a6ee5fbcf4097699c5848973d7c3f02914d96b56038183375/merged major:0 minor:163 fsType:overlay blockSize:0} 
overlay_0-164:{mountpoint:/var/lib/containers/storage/overlay/b4d22d66b3f688b30bd49b5431350e3275a0b6d3396643a17769fb69563ce364/merged major:0 minor:164 fsType:overlay blockSize:0} overlay_0-169:{mountpoint:/var/lib/containers/storage/overlay/a7b889d8fcfac359660511099579b38b486f85c74a56c0a490ffd3ddd1b941ca/merged major:0 minor:169 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/86bcfb54d0ebebfcbea84507b3fd849be7f29404f9f3153891ae362329302178/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/62b57305bec267610d814b22089e9af9cebdf78a3be111b2e8c890d3f0fb0a10/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/580bbdbac1fa2341244890abe2ed7ea25ec11cb5b4885d5d984433a580a72c5f/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/6c2fffb27650c94cc1e40da7add4512c301aadfe8e4d39e6d83637f470fe7841/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-190:{mountpoint:/var/lib/containers/storage/overlay/7c31b1e3c0b48b9fc53c43991516321945e9c687c989c580d4a23528a43d8f84/merged major:0 minor:190 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/dd6ad5a77649dd26869152c630b639d6de573f816ec5222888ae35635d58f6ed/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/cc6d5a7b3d82ce13e99da48656f89325509bf72b20512d629a92f0ca9109c8b0/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-269:{mountpoint:/var/lib/containers/storage/overlay/dd65a49e2ad06abfedb89a9e28c0ae78ca7093ca2804e5d616f7643467b7cce7/merged major:0 minor:269 fsType:overlay blockSize:0} overlay_0-274:{mountpoint:/var/lib/containers/storage/overlay/239d5dcdfcb236056a8fe4c5e78c36e670676fd531f3d21c298046bb8716a606/merged major:0 minor:274 fsType:overlay blockSize:0} 
overlay_0-276:{mountpoint:/var/lib/containers/storage/overlay/56ab318dc593586f9bacd502746bb9a308126da7b6ec35fa1a977fbd0e17c0a6/merged major:0 minor:276 fsType:overlay blockSize:0} overlay_0-278:{mountpoint:/var/lib/containers/storage/overlay/6c4f67cfa152e398959eacda2c282a56756c8041135a5eac37531bd5a941b905/merged major:0 minor:278 fsType:overlay blockSize:0} overlay_0-284:{mountpoint:/var/lib/containers/storage/overlay/bc67427313d433eba83faf5562a070bb0a493518e9f8af00c6dcc17a19b4021b/merged major:0 minor:284 fsType:overlay blockSize:0} overlay_0-286:{mountpoint:/var/lib/containers/storage/overlay/e509617f123eaa94cb4b2d2431aeba2971cb9d9d05f755477395c05625ef1ece/merged major:0 minor:286 fsType:overlay blockSize:0} overlay_0-288:{mountpoint:/var/lib/containers/storage/overlay/a5f6933afcb7994ae9a7c2fb192cad0cf94747d1914805fb0030e87d52323f93/merged major:0 minor:288 fsType:overlay blockSize:0} overlay_0-290:{mountpoint:/var/lib/containers/storage/overlay/bb5c22f5faee6990a15350904223425aa1afffe899a815721902bdb9ff602bd9/merged major:0 minor:290 fsType:overlay blockSize:0} overlay_0-292:{mountpoint:/var/lib/containers/storage/overlay/3dfbc0973edfc5f12eaa4cd97919b66e3ce90e9da0147a0254d94971ec3359a8/merged major:0 minor:292 fsType:overlay blockSize:0} overlay_0-294:{mountpoint:/var/lib/containers/storage/overlay/edab676349d6c24c4d041e7f04d1efbf7e5f70dd1a338d0934556525b98e4122/merged major:0 minor:294 fsType:overlay blockSize:0} overlay_0-298:{mountpoint:/var/lib/containers/storage/overlay/3ffc052528896d9a3fb508202bb5311b057562a94985655e80d69eaaf20e3588/merged major:0 minor:298 fsType:overlay blockSize:0} overlay_0-300:{mountpoint:/var/lib/containers/storage/overlay/45b5535f80449197995fc6f7b130eeaedbb7e13b67f5baa50c1b36da6c088a25/merged major:0 minor:300 fsType:overlay blockSize:0} overlay_0-302:{mountpoint:/var/lib/containers/storage/overlay/d08e47a6c14577bb9e091d973518c756075e89d1ea4a66f0d01502b1420dc84b/merged major:0 minor:302 fsType:overlay blockSize:0} 
overlay_0-306:{mountpoint:/var/lib/containers/storage/overlay/a70290c9c90ec01afc830c47df485938e2b2052b4f6bfd0f3bc7c0e861a3e118/merged major:0 minor:306 fsType:overlay blockSize:0} overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/837747c92019458fb33e3c6e52578632665327293dca4874475bfe3768bfbaad/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-314:{mountpoint:/var/lib/containers/storage/overlay/dad34059db8aba5de319fc4fa68a853b4c4e3c2530655c5ebeae37611009cc97/merged major:0 minor:314 fsType:overlay blockSize:0} overlay_0-316:{mountpoint:/var/lib/containers/storage/overlay/4dbefe635f4926bfd2275bd589996f848efe85c6fd1a8f6471d1e8af9bacbfb1/merged major:0 minor:316 fsType:overlay blockSize:0} overlay_0-317:{mountpoint:/var/lib/containers/storage/overlay/12d6fb0b59d3b10ae58b212b2a1f03fb14415f5dfc245d5a47e85753dda0f750/merged major:0 minor:317 fsType:overlay blockSize:0} overlay_0-319:{mountpoint:/var/lib/containers/storage/overlay/88276a4fbded47975e9fe1ba4bfc68398c8a588af1ec07f29df6b60ce2337e87/merged major:0 minor:319 fsType:overlay blockSize:0} overlay_0-322:{mountpoint:/var/lib/containers/storage/overlay/e3ad1eea8942a4b3eae23f51bdb2e31cf991d89498dc85fa14201686223a3c26/merged major:0 minor:322 fsType:overlay blockSize:0} overlay_0-328:{mountpoint:/var/lib/containers/storage/overlay/92a6e5b946d6552c35e12fa20f43be151098af567e41d55ba3a01b9d20135415/merged major:0 minor:328 fsType:overlay blockSize:0} overlay_0-331:{mountpoint:/var/lib/containers/storage/overlay/6fc01a926225a532fe3b415609713c7cee65a7f62d6e5f43bcf27c31954ff485/merged major:0 minor:331 fsType:overlay blockSize:0} overlay_0-332:{mountpoint:/var/lib/containers/storage/overlay/61fce53fb495953fbf48c3a3966964c0301174e82df27a3ca9d1869e36036f04/merged major:0 minor:332 fsType:overlay blockSize:0} overlay_0-336:{mountpoint:/var/lib/containers/storage/overlay/75e9dff68e5366c4e19cc1ebfbf58745417066192b86f5d465e3105053cf29ef/merged major:0 minor:336 fsType:overlay blockSize:0} 
overlay_0-343:{mountpoint:/var/lib/containers/storage/overlay/8aee09eab928580e0c163cc0fd9b105bd9d1d8c1345a8abe291bbaaa3b205cec/merged major:0 minor:343 fsType:overlay blockSize:0} overlay_0-345:{mountpoint:/var/lib/containers/storage/overlay/1f1e82e4563a5814c21e28e624c429b323092aeeecce7b5745bc96855f641c1a/merged major:0 minor:345 fsType:overlay blockSize:0} overlay_0-346:{mountpoint:/var/lib/containers/storage/overlay/010ff84d1f80058afa586125e1b11c12cf4a5de53cc0f9dbcb0195ab37743648/merged major:0 minor:346 fsType:overlay blockSize:0} overlay_0-350:{mountpoint:/var/lib/containers/storage/overlay/e126aa32fdf0f4a24527e2d637ff685ead80134dc0fac04fef6a1f8972bd5c9f/merged major:0 minor:350 fsType:overlay blockSize:0} overlay_0-352:{mountpoint:/var/lib/containers/storage/overlay/8e4eabfafaa5edc084b3a8bf2258c694628cef545859d8a103887b8527564254/merged major:0 minor:352 fsType:overlay blockSize:0} overlay_0-353:{mountpoint:/var/lib/containers/storage/overlay/c075bf9746d05b16e7f09cbf92e9d8ea00870eb3ee2641176bb4c2710478e0cb/merged major:0 minor:353 fsType:overlay blockSize:0} overlay_0-354:{mountpoint:/var/lib/containers/storage/overlay/81b219ab14d481dabedb22405e0cc7d119ff34b499ee0ee84d1fd80b340222b2/merged major:0 minor:354 fsType:overlay blockSize:0} overlay_0-364:{mountpoint:/var/lib/containers/storage/overlay/e6b62fba8f98d268c025ec6267399778d2474aac4fd27720f86335b8cd507a30/merged major:0 minor:364 fsType:overlay blockSize:0} overlay_0-373:{mountpoint:/var/lib/containers/storage/overlay/231155c5e264309f4844a609f361302d2c129fe10150bc285c37b95d8ee687d8/merged major:0 minor:373 fsType:overlay blockSize:0} overlay_0-375:{mountpoint:/var/lib/containers/storage/overlay/f14e6b639cc0b862889f99b6a2f63794772f598f11ab5bf11ff163f6f4024e97/merged major:0 minor:375 fsType:overlay blockSize:0} overlay_0-377:{mountpoint:/var/lib/containers/storage/overlay/d0905b80fc9758f26de1b6173609d54d273a26dedea07bc28b81cb0b05b88edf/merged major:0 minor:377 fsType:overlay blockSize:0} 
overlay_0-378:{mountpoint:/var/lib/containers/storage/overlay/6183c64c959e42aab8776f61884c0f6d01a28b4f0fc3cdfaa128b4659d5aa57b/merged major:0 minor:378 fsType:overlay blockSize:0} overlay_0-380:{mountpoint:/var/lib/containers/storage/overlay/f1da5c4bcbba77de296613379304f575b57f43b9058f19e5fcf913d7b92339b5/merged major:0 minor:380 fsType:overlay blockSize:0} overlay_0-383:{mountpoint:/var/lib/containers/storage/overlay/ec7c93f5672822caea82ee451783794b22837bbf11974910c9902e11e2dff1cd/merged major:0 minor:383 fsType:overlay blockSize:0} overlay_0-384:{mountpoint:/var/lib/containers/storage/overlay/87a28506d40fb523b9d3abd71f8a0424e9b88ddac464f1b227e7349ab503f3a4/merged major:0 minor:384 fsType:overlay blockSize:0} overlay_0-386:{mountpoint:/var/lib/containers/storage/overlay/e3c5486594b6ad08ec97bb253fc7992df244f160c5e0304ba72cd01231c35913/merged major:0 minor:386 fsType:overlay blockSize:0} overlay_0-400:{mountpoint:/var/lib/containers/storage/overlay/9c52d7210adba177d935880d297a6972f11066235ba5bb7a42fa4535c6dc590d/merged major:0 minor:400 fsType:overlay blockSize:0} overlay_0-403:{mountpoint:/var/lib/containers/storage/overlay/cfa44dbeb8d93193ab49b5d3a536b6083f5f0ffde2df8f6e21f37e7517316568/merged major:0 minor:403 fsType:overlay blockSize:0} overlay_0-413:{mountpoint:/var/lib/containers/storage/overlay/302a2f4a463ea323891fd6e5ea4a6ad5bf8f4a3cefa8d9ee58144c6dcf30a2a8/merged major:0 minor:413 fsType:overlay blockSize:0} overlay_0-415:{mountpoint:/var/lib/containers/storage/overlay/b6a6937cdc66c96bae9cc7d5f7cc34a6a693d9ef2f9c2d4825bb97ed8557127e/merged major:0 minor:415 fsType:overlay blockSize:0} overlay_0-417:{mountpoint:/var/lib/containers/storage/overlay/ae9eaf1298a1f7a7cba762b5e2bc952ce62a07fd3b9178c1d5a71ff771f16d97/merged major:0 minor:417 fsType:overlay blockSize:0} overlay_0-42:{mountpoint:/var/lib/containers/storage/overlay/5d10c71a7a2cf5bda84bf2b1f11b37740cf87f4ccbb72d136b956d28fd643fb0/merged major:0 minor:42 fsType:overlay blockSize:0} 
overlay_0-424:{mountpoint:/var/lib/containers/storage/overlay/1de151b17b11390e949207cc7073aa83d978f3fbf17d4b639bb7969c7740e34b/merged major:0 minor:424 fsType:overlay blockSize:0} overlay_0-427:{mountpoint:/var/lib/containers/storage/overlay/6c1b79d730c69b4e26ee29391c0d7683e73a2dc980551f1322c205aac9eee6e6/merged major:0 minor:427 fsType:overlay blockSize:0} overlay_0-446:{mountpoint:/var/lib/containers/storage/overlay/98c5269fe921cbe81b50934e1e1b9ef7aed24c5d745e19de0416d9e7e951fc8a/merged major:0 minor:446 fsType:overlay blockSize:0} overlay_0-447:{mountpoint:/var/lib/containers/storage/overlay/0b378b84deff07e4ab2d3fdd7022bafa28acad42ccf51e8cbdd0a8316a554ee2/merged major:0 minor:447 fsType:overlay blockSize:0} overlay_0-451:{mountpoint:/var/lib/containers/storage/overlay/a75ec3384b6a5dc6995be6704d818ca9aaf1c20e668c123c5698e237c75514c4/merged major:0 minor:451 fsType:overlay blockSize:0} overlay_0-459:{mountpoint:/var/lib/containers/storage/overlay/a6432ccdfea353c4bceef75648af9d8eefe0c9c1a459ef5ed6e0fc6d676abc58/merged major:0 minor:459 fsType:overlay blockSize:0} overlay_0-497:{mountpoint:/var/lib/containers/storage/overlay/e89a008fd6df80d4f4e31a0a43b31c5ebc37d1fe57a10adefa9ee4aa6d41ba54/merged major:0 minor:497 fsType:overlay blockSize:0} overlay_0-498:{mountpoint:/var/lib/containers/storage/overlay/40bd9cd99dc6fe8a963e9557ca7fe5e536cbbfc0f8228e79fc48637db882da18/merged major:0 minor:498 fsType:overlay blockSize:0} overlay_0-500:{mountpoint:/var/lib/containers/storage/overlay/30f90021df0bc822b1b80dc6c827bc5f0d5f7dd743c03569c4e868de2df65a34/merged major:0 minor:500 fsType:overlay blockSize:0} overlay_0-502:{mountpoint:/var/lib/containers/storage/overlay/9b3e0a913dde26abd10787ae911e6fd9d3e85c353c84441e9111d78d5a1a64fe/merged major:0 minor:502 fsType:overlay blockSize:0} overlay_0-504:{mountpoint:/var/lib/containers/storage/overlay/96dec715b736ef293303d976782a89047b1e07b6119d752e61e6fd952d8801a1/merged major:0 minor:504 fsType:overlay blockSize:0} 
overlay_0-506:{mountpoint:/var/lib/containers/storage/overlay/31d226901634e5020a2b3df7bee5468728f982cf0753e71714df384ad32d8e21/merged major:0 minor:506 fsType:overlay blockSize:0} overlay_0-508:{mountpoint:/var/lib/containers/storage/overlay/96ad038a42a3d5b8c84244eaeb586686bb3607713357e7da6b4d143f802ddccf/merged major:0 minor:508 fsType:overlay blockSize:0} overlay_0-510:{mountpoint:/var/lib/containers/storage/overlay/84063fd69e5a1e5f1058c682490ea6976a97266700f09c153079e7b0ba33b43a/merged major:0 minor:510 fsType:overlay blockSize:0} overlay_0-512:{mountpoint:/var/lib/containers/storage/overlay/39b3ac0c7553f12b393132727ea1770eac3c1c416e9c68f66aae4a5a5b1700a6/merged major:0 minor:512 fsType:overlay blockSize:0} overlay_0-514:{mountpoint:/var/lib/containers/storage/overlay/d30af54df813d56c84a6127b15b21da4899f44a2e54f67d990db21288513cba3/merged major:0 minor:514 fsType:overlay blockSize:0} overlay_0-516:{mountpoint:/var/lib/containers/storage/overlay/7f9a77abfa7bd9a60453769c7e8dc55969c54ed5ac39e1e79160e3ca96ab5d0f/merged major:0 minor:516 fsType:overlay blockSize:0} overlay_0-518:{mountpoint:/var/lib/containers/storage/overlay/2e2c07c409b74c8bff95f161226761daa802e79d9221a2627d9fd94b254b0910/merged major:0 minor:518 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/fdb0146dc3d40db67beaf73c7cf14c36c3912aa50521ca38ceefd4305daef3c7/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-524:{mountpoint:/var/lib/containers/storage/overlay/37f069eaa00edf29532bc2a91b1d939bdbf48b06c4995f40c4ce18f4981e659b/merged major:0 minor:524 fsType:overlay blockSize:0} overlay_0-527:{mountpoint:/var/lib/containers/storage/overlay/700f4c4c13f7edbe90524cc83fcf0be792f2b623f637b088ee5bf9ecfb68166d/merged major:0 minor:527 fsType:overlay blockSize:0} overlay_0-534:{mountpoint:/var/lib/containers/storage/overlay/5136667fd1330053ee00037ca585fac0210bda1bd7b61a4265e4a1da59ddffdb/merged major:0 minor:534 fsType:overlay blockSize:0} 
overlay_0-538:{mountpoint:/var/lib/containers/storage/overlay/fe3d848aa62acd66971fdadb655b4ee1e235f47b603e4213145c5a6081c8b18f/merged major:0 minor:538 fsType:overlay blockSize:0} overlay_0-550:{mountpoint:/var/lib/containers/storage/overlay/58f4afcf39fa02081e2ad56b1f4f8a896502c05794151bb0dba24e9162c387d9/merged major:0 minor:550 fsType:overlay blockSize:0} overlay_0-552:{mountpoint:/var/lib/containers/storage/overlay/3170d57f9fca7f011657730bfc667d89b02d77163caf168afcca3668f8fdac59/merged major:0 minor:552 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/6d7fb23a37bc4106026e3becdb5da963f255cd603d365b7fd27ab75b3394f71e/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-565:{mountpoint:/var/lib/containers/storage/overlay/06a056d6892ff4b0a551e78b7fa634b454efd41d8fdcf27f5e17fb5eabdeb4f0/merged major:0 minor:565 fsType:overlay blockSize:0} overlay_0-581:{mountpoint:/var/lib/containers/storage/overlay/e253f007a4d69c922252793fee2e21432531cf046f7918709696e400fda0d9de/merged major:0 minor:581 fsType:overlay blockSize:0} overlay_0-591:{mountpoint:/var/lib/containers/storage/overlay/10ba0c443f3aa59e9b454f8fda00cd090602481a3f59472572e851d4c62898ea/merged major:0 minor:591 fsType:overlay blockSize:0} overlay_0-600:{mountpoint:/var/lib/containers/storage/overlay/ee00b2bff89f3702cf5888d253783f2203449bcb852fcafa0325e9b3e04eca04/merged major:0 minor:600 fsType:overlay blockSize:0} overlay_0-607:{mountpoint:/var/lib/containers/storage/overlay/ea91acd11e8732a7228b653764a5087c2abf83ee4dff09798e66142940a4264c/merged major:0 minor:607 fsType:overlay blockSize:0} overlay_0-608:{mountpoint:/var/lib/containers/storage/overlay/ff7cad0c4038d54d66723b8ca8a2d7657f906c37013493a30debe22e338a3cd4/merged major:0 minor:608 fsType:overlay blockSize:0} overlay_0-610:{mountpoint:/var/lib/containers/storage/overlay/b17799989dd8662d10113be78d76ea2d8e52e26c4f92562acc692bfd1ca86434/merged major:0 minor:610 fsType:overlay blockSize:0} 
overlay_0-618:{mountpoint:/var/lib/containers/storage/overlay/1bcd0f432b6b03ea20d652b75bafd19a2fca04d148193b02d2ad7664efce4af0/merged major:0 minor:618 fsType:overlay blockSize:0} overlay_0-621:{mountpoint:/var/lib/containers/storage/overlay/0e389c68e3f7ae7bc5e78374763c71418b95e1f047eebc261c545812776e01cc/merged major:0 minor:621 fsType:overlay blockSize:0} overlay_0-623:{mountpoint:/var/lib/containers/storage/overlay/4e9a7a086bcb612a832af39df9b09e457dbb3cfa61c83150c69df861edeadeb8/merged major:0 minor:623 fsType:overlay blockSize:0} overlay_0-626:{mountpoint:/var/lib/containers/storage/overlay/048919ed04c3a96c0c2213159d0d0ab07b56ff0559c507944e7285a361779dbc/merged major:0 minor:626 fsType:overlay blockSize:0} overlay_0-628:{mountpoint:/var/lib/containers/storage/overlay/574eced7b653aa4673662116f3beda37071758af812f8ab255457d593913b5c6/merged major:0 minor:628 fsType:overlay blockSize:0} overlay_0-649:{mountpoint:/var/lib/containers/storage/overlay/a58150ed1eeeddb6a73f35f7b5caecefa9574dcbaf0959f3c4988d7753470ba4/merged major:0 minor:649 fsType:overlay blockSize:0} overlay_0-65:{mountpoint:/var/lib/containers/storage/overlay/8e2ae53156954fe32bbf93968f5dab062fd5b63ecd9d561c24e9313b49e76983/merged major:0 minor:65 fsType:overlay blockSize:0} overlay_0-659:{mountpoint:/var/lib/containers/storage/overlay/dab5039775968e9e0cc8f1655cfc3b80d1e70ec5b8e36c681d641d250e1590c9/merged major:0 minor:659 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/5cb5ec39cf1bb9177fc3311839916df0c7f900372794135242d5e04b29842299/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-668:{mountpoint:/var/lib/containers/storage/overlay/91caf24587e2d714533c82b14c686dd263349bbbd4fb697627d7a46b567fced3/merged major:0 minor:668 fsType:overlay blockSize:0} overlay_0-67:{mountpoint:/var/lib/containers/storage/overlay/51c878e5aa475cb27dd02d18bb0d05604ade0233c72134b678c29a2682fe940e/merged major:0 minor:67 fsType:overlay blockSize:0} 
overlay_0-680:{mountpoint:/var/lib/containers/storage/overlay/e1a3d28ea3a0a0e9c210f3a2dc513f35d5fad9af162b11eb3a58db271015b0da/merged major:0 minor:680 fsType:overlay blockSize:0} overlay_0-684:{mountpoint:/var/lib/containers/storage/overlay/6a059d72b428ea9233f8e24e63e1a09d0483e21a92662d66526d44c43f6bdec8/merged major:0 minor:684 fsType:overlay blockSize:0} overlay_0-69:{mountpoint:/var/lib/containers/storage/overlay/40fd61dabcfd032d15f10c34ad1800d159c8a30f30126b393006ad28e7eba440/merged major:0 minor:69 fsType:overlay blockSize:0} overlay_0-697:{mountpoint:/var/lib/containers/storage/overlay/9870b2cb0092d043d18b8ee28d3db3174e299111f85f596b89321fe546087ee7/merged major:0 minor:697 fsType:overlay blockSize:0} overlay_0-703:{mountpoint:/var/lib/containers/storage/overlay/df01817b2a54b7a5b6a4a5d562444c2af8a14b665017f16a5235815c9fc36021/merged major:0 minor:703 fsType:overlay blockSize:0} overlay_0-704:{mountpoint:/var/lib/containers/storage/overlay/9e96f21b0a723c98373c330e7a562d03b9ad0d3a42e0b6d701de97c537134e70/merged major:0 minor:704 fsType:overlay blockSize:0} overlay_0-717:{mountpoint:/var/lib/containers/storage/overlay/4f599c72ba9f28671be7b55d0a0bf92d9a78bf80fd7e4a6781b6b787ee75afde/merged major:0 minor:717 fsType:overlay blockSize:0} overlay_0-721:{mountpoint:/var/lib/containers/storage/overlay/fea2e9c34543a6cf43991650ff32b83ae623350c8de6b25f10ac0b4dde3d7931/merged major:0 minor:721 fsType:overlay blockSize:0} overlay_0-730:{mountpoint:/var/lib/containers/storage/overlay/755ef254b8ebf97a1e1902eb94b4e10f6fa89760d3e1dacac02eb910fcdc0d4d/merged major:0 minor:730 fsType:overlay blockSize:0} overlay_0-734:{mountpoint:/var/lib/containers/storage/overlay/7db1836b1e3cfba849710fc6940a106e658c3a5a3cce224a5aa480ab11f584b0/merged major:0 minor:734 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/7e54b259cace08b76b8dfec801d2483a87bf6df4c9915e9bede6835df602e96e/merged major:0 minor:74 fsType:overlay blockSize:0} 
overlay_0-750:{mountpoint:/var/lib/containers/storage/overlay/b1a41347b69f09082e7f9d610bf1e3a5c9d48388b91cbce9c1e6b0d3c69658c7/merged major:0 minor:750 fsType:overlay blockSize:0} overlay_0-754:{mountpoint:/var/lib/containers/storage/overlay/1da011b3b03982e62a052bd4cecd52e7a1cd7f6772e3dd55029e402d7fce9390/merged major:0 minor:754 fsType:overlay blockSize:0} overlay_0-757:{mountpoint:/var/lib/containers/storage/overlay/94cb907e00db04ef01970ce8ed17ffb6bfcafb0ad883dd92dd50244153c08c86/merged major:0 minor:757 fsType:overlay blockSize:0} overlay_0-758:{mountpoint:/var/lib/containers/storage/overlay/4f1432628bbf970a2205a0131ff7f0e395a21f6bd7dae5aa2ba5fd10a350b046/merged major:0 minor:758 fsType:overlay blockSize:0} overlay_0-77:{mountpoint:/var/lib/containers/storage/overlay/97a82b92fd74e1499d8451f55253cf728d39dc806a6d62433e03d6471a6141e4/merged major:0 minor:77 fsType:overlay blockSize:0} overlay_0-796:{mountpoint:/var/lib/containers/storage/overlay/3c111a13ff242264cf657f75bc2794e68115cd599fea2e1c096e3b86ea4fbb34/merged major:0 minor:796 fsType:overlay blockSize:0} overlay_0-80:{mountpoint:/var/lib/containers/storage/overlay/02ed34960d7958b136f25f7d7f18ce365ee02c6a41c9d7a12db63be48ff92c86/merged major:0 minor:80 fsType:overlay blockSize:0} overlay_0-81:{mountpoint:/var/lib/containers/storage/overlay/9eb09809524e5e73f07a8aa1f44b011a55f6d75154a43843c9d74cb1dcb2eae6/merged major:0 minor:81 fsType:overlay blockSize:0} overlay_0-823:{mountpoint:/var/lib/containers/storage/overlay/9ebe6716a5bbfe5f1c064425a7eb94a983b3ca68485322e4aaf2cc11add3a2fa/merged major:0 minor:823 fsType:overlay blockSize:0} overlay_0-825:{mountpoint:/var/lib/containers/storage/overlay/ccc67a78521ee228f68f707b2b0e175bcd4ca3de9ad2b623502a54645c818968/merged major:0 minor:825 fsType:overlay blockSize:0} overlay_0-862:{mountpoint:/var/lib/containers/storage/overlay/a3d6a25f09a602098c0b72fbc01b0319f141779c421a2bf28e123eaf963c1db2/merged major:0 minor:862 fsType:overlay blockSize:0} 
overlay_0-864:{mountpoint:/var/lib/containers/storage/overlay/ae11e14f1e68cb6aefeaf83bcbfeac9c04044da021a2178d17ac07c599f46756/merged major:0 minor:864 fsType:overlay blockSize:0} overlay_0-867:{mountpoint:/var/lib/containers/storage/overlay/264dd6be46e61c66063fee62ac36a96b672705edc4c235df2df8d966c77cf11d/merged major:0 minor:867 fsType:overlay blockSize:0} overlay_0-869:{mountpoint:/var/lib/containers/storage/overlay/39b77b7ff2a558cc41c2f6628ed190f3d0048663e3dfffa1d54bded5934c51e3/merged major:0 minor:869 fsType:overlay blockSize:0} overlay_0-871:{mountpoint:/var/lib/containers/storage/overlay/18114c270589bee8dd1dcd6add7834b01c2130c202a515190fa2aa8e9b77b196/merged major:0 minor:871 fsType:overlay blockSize:0} overlay_0-873:{mountpoint:/var/lib/containers/storage/overlay/54fecd6b23f3fcb320268dd454d04d9c610d45a3d646ceedc4584704733db81c/merged major:0 minor:873 fsType:overlay blockSize:0} overlay_0-875:{mountpoint:/var/lib/containers/storage/overlay/8fbf1695c964cbdeb6716b724ccd74107e4fb248e3595dae0815c19aaf288ad2/merged major:0 minor:875 fsType:overlay blockSize:0} overlay_0-877:{mountpoint:/var/lib/containers/storage/overlay/e5f3613e677b02c607fa4330b5726f11c591134812ac95b485d9ff095841738b/merged major:0 minor:877 fsType:overlay blockSize:0} overlay_0-879:{mountpoint:/var/lib/containers/storage/overlay/6761d57e672d0ce7e9dc707df20803136d2eba26c2c253a91cb0fd522a2e7159/merged major:0 minor:879 fsType:overlay blockSize:0} overlay_0-883:{mountpoint:/var/lib/containers/storage/overlay/374a468792c77077c2a2993452a31573d7a0ddbf08f5a6f4b5623e0a113bfbb3/merged major:0 minor:883 fsType:overlay blockSize:0} overlay_0-884:{mountpoint:/var/lib/containers/storage/overlay/01dc4aa8f79e3b44797fabe97dac473fc88d5b0a2f21d77e7c972de0d1453243/merged major:0 minor:884 fsType:overlay blockSize:0} overlay_0-885:{mountpoint:/var/lib/containers/storage/overlay/3e4082de92811b0d32ba33df4d3e093b63e08296ef37b0d1149c1dcdf73edcb3/merged major:0 minor:885 fsType:overlay blockSize:0} 
overlay_0-888:{mountpoint:/var/lib/containers/storage/overlay/fcf3fd3ccde152a1a7c7beb02d46e92cfea464c74ca1be5eff075da0a64dad58/merged major:0 minor:888 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/7026cceefacf8ed9a1dfb3ff29a0bd33cf0d07e38dac3258b59979c97299578b/merged major:0 minor:89 fsType:overlay blockSize:0} overlay_0-893:{mountpoint:/var/lib/containers/storage/overlay/1bd64d6e6f985165915ed5c4e6c228150e1d96ffed03518475a08b2769e02a88/merged major:0 minor:893 fsType:overlay blockSize:0} overlay_0-895:{mountpoint:/var/lib/containers/storage/overlay/ebc3b69cdf2d472fd9f9abe0800c7bca4ff4ec01d7233b8a7fe8e17f7c80bb23/merged major:0 minor:895 fsType:overlay blockSize:0} overlay_0-911:{mountpoint:/var/lib/containers/storage/overlay/6782c74f004e10fe19d24f7f497c9c75d0d8e7caf64af9f66ce2a9aa11e4f7f4/merged major:0 minor:911 fsType:overlay blockSize:0} overlay_0-916:{mountpoint:/var/lib/containers/storage/overlay/831b4d98aae8a81160a02de51554c23421d3ae8358b40840a4fb21f64d979e28/merged major:0 minor:916 fsType:overlay blockSize:0} overlay_0-92:{mountpoint:/var/lib/containers/storage/overlay/e73ead7748d25ac743c9f6af4909dc6680ca8609a7949391fcfb45502b67d3d1/merged major:0 minor:92 fsType:overlay blockSize:0} overlay_0-930:{mountpoint:/var/lib/containers/storage/overlay/8f6fa0a26f9d8a737d9659d415495c62e19bb39ad520241c957a102fe46d0be8/merged major:0 minor:930 fsType:overlay blockSize:0} overlay_0-939:{mountpoint:/var/lib/containers/storage/overlay/535a1b6f42592d35aa9588c9a639192ffa172df7e2ce89bd389cc57ba3950f70/merged major:0 minor:939 fsType:overlay blockSize:0} overlay_0-941:{mountpoint:/var/lib/containers/storage/overlay/c573edd34cda877ba80df87c2a742ca684f33622fa7187ef45d8b4c7ce114cc7/merged major:0 minor:941 fsType:overlay blockSize:0} overlay_0-943:{mountpoint:/var/lib/containers/storage/overlay/5eaf6075260a1e0af139453f5a6b02d9658579161c135089d580d30ec5eda435/merged major:0 
minor:943 fsType:overlay blockSize:0} overlay_0-945:{mountpoint:/var/lib/containers/storage/overlay/186410f31802bac7f85d582de18990c0cd84dbef8631208bbb48105d5b4fdc46/merged major:0 minor:945 fsType:overlay blockSize:0} overlay_0-949:{mountpoint:/var/lib/containers/storage/overlay/b0b2e81065870c08612f0424f905025646032760df9466110e593a9fd3da04d4/merged major:0 minor:949 fsType:overlay blockSize:0} overlay_0-953:{mountpoint:/var/lib/containers/storage/overlay/9c1bb6400f9708f54c0cf42aaf4358c3786f654128d1092d76fe54a4ab77553b/merged major:0 minor:953 fsType:overlay blockSize:0} overlay_0-956:{mountpoint:/var/lib/containers/storage/overlay/2fd229b1f5434203baf8750a9fdb94787c45ba7dd32ece467f5d5c6246f0d65f/merged major:0 minor:956 fsType:overlay blockSize:0} overlay_0-959:{mountpoint:/var/lib/containers/storage/overlay/163e62d6b1c2c75ada50ec0c74282d66b6e6eb2d24e5d178dd136b7a04088150/merged major:0 minor:959 fsType:overlay blockSize:0} overlay_0-96:{mountpoint:/var/lib/containers/storage/overlay/5e47284095f059f1a9d7824f8acd6dd7841076df3308eeb83d028bee75959eae/merged major:0 minor:96 fsType:overlay blockSize:0} overlay_0-961:{mountpoint:/var/lib/containers/storage/overlay/64115b42cad5065d9274cfc82f4bfde1cb2ccaff22ba1e381077f7efa330b08a/merged major:0 minor:961 fsType:overlay blockSize:0} overlay_0-968:{mountpoint:/var/lib/containers/storage/overlay/5222056f194d1315b14512e7714382910469dc6b0c583f0a2a27ecd42069870c/merged major:0 minor:968 fsType:overlay blockSize:0} overlay_0-972:{mountpoint:/var/lib/containers/storage/overlay/757a4f7bab0408a7f912aaf8cab1a90990885950d40e4687d1b0064a3d573d90/merged major:0 minor:972 fsType:overlay blockSize:0} overlay_0-98:{mountpoint:/var/lib/containers/storage/overlay/6cad31e1029895fad442f2d693330acc01e5416b2ffad59524729bc42c1d8bdf/merged major:0 minor:98 fsType:overlay blockSize:0} overlay_0-986:{mountpoint:/var/lib/containers/storage/overlay/85632ec8813bdea4ea04462322e78f964e250415857edf79c1449ef7f8055748/merged major:0 minor:986 
fsType:overlay blockSize:0} overlay_0-997:{mountpoint:/var/lib/containers/storage/overlay/26e41b89aa026f60602aedadf06d86f99e46d08e00df1f064b8cee7533320d99/merged major:0 minor:997 fsType:overlay blockSize:0}] Mar 19 12:09:52.816003 master-0 kubenswrapper[29612]: I0319 12:09:52.812773 29612 manager.go:217] Machine: {Timestamp:2026-03-19 12:09:52.811900947 +0000 UTC m=+0.126069988 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:d9abaffc4eed451b8d7e70f7431f4a5e SystemUUID:d9abaffc-4eed-451b-8d7e-70f7431f4a5e BootID:35523968-247e-452c-b4d2-5f84e7547d23 Filesystems:[{Device:overlay_0-1035 DeviceMajor:0 DeviceMinor:1035 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~projected/kube-api-access-hpdhv DeviceMajor:0 DeviceMinor:1161 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9246f0557504641b26756d5dcc3facbda0dfa6d1148c91b577840d8631f86b2e/userdata/shm DeviceMajor:0 DeviceMinor:243 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/131e2573-ef74-4963-bdcf-901310e7a486/volumes/kubernetes.io~projected/kube-api-access-4sprd DeviceMajor:0 DeviceMinor:248 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c88934bb050c8e715da2068469a1a7b6e480c5b3698e644706a803df72d12f5f/userdata/shm DeviceMajor:0 DeviceMinor:480 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-704 DeviceMajor:0 DeviceMinor:704 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1031 DeviceMajor:0 DeviceMinor:1031 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:432 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1033 DeviceMajor:0 DeviceMinor:1033 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f65be54c-d851-4daa-93a0-8a3947da5e26/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:325 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-446 DeviceMajor:0 DeviceMinor:446 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1121 DeviceMajor:0 DeviceMinor:1121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-298 DeviceMajor:0 DeviceMinor:298 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:494 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-502 DeviceMajor:0 DeviceMinor:502 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/05ec088c819ae1eeb5148aff6d70b93f3b441541316091762e2b98d617be204f/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/7ccb1bded92273b709b711951d032cf4685c9f074ee7f4072031bba69f2eeb3f/userdata/shm DeviceMajor:0 DeviceMinor:860 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-956 DeviceMajor:0 DeviceMinor:956 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-565 DeviceMajor:0 DeviceMinor:565 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf/volumes/kubernetes.io~projected/kube-api-access-x9g5b DeviceMajor:0 DeviceMinor:832 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-77 DeviceMajor:0 DeviceMinor:77 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/65b629a3-ea10-4517-b762-b18b22fab2a9/volumes/kubernetes.io~projected/kube-api-access-q9qkq DeviceMajor:0 DeviceMinor:304 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~projected/kube-api-access-bssz9 DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/kube-api-access-g4qb9 DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-684 DeviceMajor:0 DeviceMinor:684 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b69a314d-4884-4f2c-a48e-1cf68c5aaef4/volumes/kubernetes.io~projected/kube-api-access-8765q DeviceMajor:0 DeviceMinor:929 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b92c4c09-a796-4e00-821a-32fc9c8ca214/volumes/kubernetes.io~secret/prometheus-operator-tls DeviceMajor:0 DeviceMinor:1073 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/09282e98-3140-41c9-863f-67e0e9f942cc/volumes/kubernetes.io~projected/kube-api-access-x9hnq DeviceMajor:0 DeviceMinor:1099 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/07eef031-edcc-415c-91d7-f9f3bd0a45e3/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:701 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~projected/kube-api-access-mm4kn DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/bea42722-d7af-4938-9f3a-da57bd056f96/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:272 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-516 DeviceMajor:0 DeviceMinor:516 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-867 DeviceMajor:0 DeviceMinor:867 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/40169c23a8e39dfd514592b517b187a7da1bb95cbece39d06e03420e9e9f21b3/userdata/shm DeviceMajor:0 DeviceMinor:1029 Capacity:67108864 
Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1109 DeviceMajor:0 DeviceMinor:1109 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3a185e40-e97b-49e9-aea3-5f170401a9e3/volumes/kubernetes.io~projected/kube-api-access-2v4xb DeviceMajor:0 DeviceMinor:818 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4e7b04dc-e85d-41b8-959c-87b1b24731a1/volumes/kubernetes.io~projected/kube-api-access-9h4cz DeviceMajor:0 DeviceMinor:410 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-649 DeviceMajor:0 DeviceMinor:649 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/09282e98-3140-41c9-863f-67e0e9f942cc/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1092 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9f7852cce265f4731f8e5a50d5da937e705ba1159369968578b0f921316baf94/userdata/shm DeviceMajor:0 DeviceMinor:1105 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0f2278c1-05cf-469c-a13c-762b7a790b38/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:828 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b69a314d-4884-4f2c-a48e-1cf68c5aaef4/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:928 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~projected/kube-api-access-scp9p 
DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-322 DeviceMajor:0 DeviceMinor:322 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f2d1d03-7427-45a0-a917-173bf9df9902/volumes/kubernetes.io~projected/kube-api-access-z7cz7 DeviceMajor:0 DeviceMinor:1098 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e/userdata/shm DeviceMajor:0 DeviceMinor:246 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc/userdata/shm DeviceMajor:0 DeviceMinor:264 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-274 DeviceMajor:0 DeviceMinor:274 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-403 DeviceMajor:0 DeviceMinor:403 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~projected/kube-api-access-h95pm DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-972 DeviceMajor:0 DeviceMinor:972 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/57d19c68-b469-4a6f-abe0-3a0eff32639c/volumes/kubernetes.io~projected/kube-api-access-v8qjv DeviceMajor:0 DeviceMinor:1020 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-527 DeviceMajor:0 DeviceMinor:527 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-538 DeviceMajor:0 DeviceMinor:538 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-986 
DeviceMajor:0 DeviceMinor:986 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0/volumes/kubernetes.io~projected/kube-api-access-rvp77 DeviceMajor:0 DeviceMinor:833 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-883 DeviceMajor:0 DeviceMinor:883 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-621 DeviceMajor:0 DeviceMinor:621 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bcc25a4a-521a-4139-a10c-d5b8bba6e359/volumes/kubernetes.io~projected/kube-api-access-2vhrb DeviceMajor:0 DeviceMinor:845 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-997 DeviceMajor:0 DeviceMinor:997 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3e1b71a7-c9fd-4f03-8d0c-f092dac80989/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:460 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-427 DeviceMajor:0 DeviceMinor:427 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d0c18b672eeccf56227de74bfddbdb7bc277e8432c83dff62ae2a878becfbd41/userdata/shm DeviceMajor:0 DeviceMinor:255 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/768bbdfdb22a3faa73743433476fcca355077a35b64380dacae14d6ec58d72ab/userdata/shm DeviceMajor:0 DeviceMinor:487 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/4f20fa38da6728fdc70e1b66426e311b675e090aa8e87afaac1c37a5210cd1c8/userdata/shm DeviceMajor:0 DeviceMinor:816 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-417 DeviceMajor:0 DeviceMinor:417 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-114 DeviceMajor:0 DeviceMinor:114 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-350 DeviceMajor:0 DeviceMinor:350 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ef597ecc3cb239a3f4a10a1a4b99299b44a320ef76f5f894771c1233b3937cda/userdata/shm DeviceMajor:0 DeviceMinor:309 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6e8c4676-05af-4478-a44b-1bdcbefdea0f/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:433 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a297c2fb1d857d07bfeb6ac6ca57bb65e15ebd85864af6b5b40028961499f08b/userdata/shm DeviceMajor:0 DeviceMinor:589 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0/volumes/kubernetes.io~projected/kube-api-access-rn94k DeviceMajor:0 DeviceMinor:936 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e4eb5ff21caa542685bdf5f202cb97c48c846bd09976e5d64727ae00a75371c6/userdata/shm DeviceMajor:0 DeviceMinor:1025 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1107 DeviceMajor:0 DeviceMinor:1107 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/eb274dd5-e98e-4485-a050-7bec560a0daa/volumes/kubernetes.io~projected/kube-api-access-jsvpz DeviceMajor:0 DeviceMinor:827 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1051 DeviceMajor:0 
DeviceMinor:1051 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/94fa5656-0561-45ab-9ab1-a6b96b8a47d4/volumes/kubernetes.io~projected/kube-api-access-m6rhj DeviceMajor:0 DeviceMinor:249 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/76d682fd002c8935fad25bd3dbe4a20bd96e06027fc3726ac0709746cb6820f3/userdata/shm DeviceMajor:0 DeviceMinor:519 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a77515f16c17046f407c9f07e5609964e9814a68e8f45a75bb351406b4dcfc20/userdata/shm DeviceMajor:0 DeviceMinor:705 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:455 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-941 DeviceMajor:0 DeviceMinor:941 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-945 DeviceMajor:0 DeviceMinor:945 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-346 DeviceMajor:0 DeviceMinor:346 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f2d1d03-7427-45a0-a917-173bf9df9902/volumes/kubernetes.io~secret/node-exporter-tls DeviceMajor:0 DeviceMinor:1117 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3e041b219895e4a22c84b8979a6ff95eaead826c80bb84644fce5e3855731eba/userdata/shm DeviceMajor:0 DeviceMinor:411 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-383 DeviceMajor:0 DeviceMinor:383 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:492 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~projected/kube-api-access-2m4gg DeviceMajor:0 DeviceMinor:495 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/294f59c88ee9d0ef672e140bfaeda4c298629a073fa82f34bad6f87890b02c82/userdata/shm DeviceMajor:0 DeviceMinor:788 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5b46c6b4-f701-4a7f-bf74-90afe14ad89a/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:406 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-895 DeviceMajor:0 DeviceMinor:895 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-968 DeviceMajor:0 DeviceMinor:968 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc/volumes/kubernetes.io~projected/kube-api-access-9kf8c DeviceMajor:0 DeviceMinor:95 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a5cf6404-1640-43d3-8018-03c62c8338c5/volumes/kubernetes.io~projected/kube-api-access-b48q2 DeviceMajor:0 DeviceMinor:696 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a99d279c1652e2b6be126b985d6ebabfca8ecedc501010b1c82126f8611b9e4b/userdata/shm DeviceMajor:0 DeviceMinor:854 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a/volumes/kubernetes.io~projected/kube-api-access-v5cc5 DeviceMajor:0 DeviceMinor:805 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-65 DeviceMajor:0 
DeviceMinor:65 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-81 DeviceMajor:0 DeviceMinor:81 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/63e81c6a-b723-4c9a-b471-226fefa9c7a7/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:820 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d8f0e6a1ec121bbc7ed73597f1167b65da5617d6a824495c565577e16dc06d7d/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-608 DeviceMajor:0 DeviceMinor:608 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-352 DeviceMajor:0 DeviceMinor:352 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/da05a78204256ccd4edc262bb9b930e6804a73e9ac544e9244d9c5d71173e2a9/userdata/shm DeviceMajor:0 DeviceMinor:435 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-668 DeviceMajor:0 DeviceMinor:668 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-864 DeviceMajor:0 DeviceMinor:864 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b69a314d-4884-4f2c-a48e-1cf68c5aaef4/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:924 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 
Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/d979be56-f948-4252-ab6b-9bfc48dc4187/volumes/kubernetes.io~projected/kube-api-access-w7bl2 DeviceMajor:0 DeviceMinor:749 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c30aed6efb2687e8c9c0787c5b85c3ecc5ac073733a122b5233bd897825e7a16/userdata/shm DeviceMajor:0 DeviceMinor:1100 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6f9d7a4e397f58614971bf45c37c0758f1dd839eda26eeb6acda239cb4547bd3/userdata/shm DeviceMajor:0 DeviceMinor:465 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-155 DeviceMajor:0 DeviceMinor:155 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-316 DeviceMajor:0 DeviceMinor:316 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1044 DeviceMajor:0 DeviceMinor:1044 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1077 DeviceMajor:0 DeviceMinor:1077 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-581 DeviceMajor:0 DeviceMinor:581 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-703 DeviceMajor:0 DeviceMinor:703 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1036 DeviceMajor:0 DeviceMinor:1036 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/33f6e54f-a750-4c91-b4f5-049275d33cfb/volumes/kubernetes.io~projected/kube-api-access-f7qsg DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-451 DeviceMajor:0 DeviceMinor:451 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/854a387c5ef79581e793cea9af478d1025e855bc5db72359436a67d2aec956ce/userdata/shm DeviceMajor:0 
DeviceMinor:850 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-879 DeviceMajor:0 DeviceMinor:879 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-42 DeviceMajor:0 DeviceMinor:42 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43/userdata/shm DeviceMajor:0 DeviceMinor:250 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-498 DeviceMajor:0 DeviceMinor:498 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:453 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4ca74427-9df7-44ca-ae91-0162f402644b/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1102 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4daf44a10b1f8c76d5c5e205f3a48dbc84e86cc93251b0495b68e80ef60f9d35/userdata/shm DeviceMajor:0 DeviceMinor:1164 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d9bb5c5486d0d4e5249742ebeb70024222a47fabc60cb40b1224fe9ecd95ab7a/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/903b634267c4d03a793b4b69c05dfc5d8a92f5023abbef9a61cb574e83dc2bee/userdata/shm DeviceMajor:0 DeviceMinor:94 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-659 DeviceMajor:0 DeviceMinor:659 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/64f8ff5be7bb724395a941290a38553d4d3a994e6e3322cf64e60c9441115924/userdata/shm DeviceMajor:0 
DeviceMinor:1123 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8dd5c45a98fabae2b43541f63682ae158d8af993ab1c5516b535075299ccb403/userdata/shm DeviceMajor:0 DeviceMinor:84 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5f2d1d03-7427-45a0-a917-173bf9df9902/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1097 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-888 DeviceMajor:0 DeviceMinor:888 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-161 DeviceMajor:0 DeviceMinor:161 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/58a4e3af-c5e1-4487-92a2-81dfb696695e/volumes/kubernetes.io~projected/kube-api-access-gjj2t DeviceMajor:0 DeviceMinor:542 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-373 DeviceMajor:0 DeviceMinor:373 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e47d3935-d96e-4fd2-abfc-99bff7eac5e0/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:844 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-98 DeviceMajor:0 DeviceMinor:98 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6e8c4676-05af-4478-a44b-1bdcbefdea0f/volumes/kubernetes.io~projected/kube-api-access-pdmj8 DeviceMajor:0 DeviceMinor:245 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b/userdata/shm DeviceMajor:0 DeviceMinor:104 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/58a4e3af-c5e1-4487-92a2-81dfb696695e/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:540 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8117f6b0-0b9f-4a1d-9807-bf73c19f388b/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:747 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a49a8a5e-23fb-46fb-b184-e09da4817028/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:1053 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/58a4e3af-c5e1-4487-92a2-81dfb696695e/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:541 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/819c8a76-019b-4124-a67f-fb5838cbec5d/volumes/kubernetes.io~projected/kube-api-access-64sr2 DeviceMajor:0 DeviceMinor:580 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-862 DeviceMajor:0 DeviceMinor:862 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/444bcf401c29be212027c38d60d3155ce14ee9b37c07f012dc81ebccfc8c5a7d/userdata/shm DeviceMajor:0 DeviceMinor:1027 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1111 DeviceMajor:0 DeviceMinor:1111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-80 DeviceMajor:0 DeviceMinor:80 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-288 DeviceMajor:0 DeviceMinor:288 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:443 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/ce92bb2deaa819fea113e3e46f2d6ce13e1b77b5e6309af9c4cd6c384874e217/userdata/shm DeviceMajor:0 DeviceMinor:448 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-552 DeviceMajor:0 DeviceMinor:552 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fbde1a69-b56c-467e-a291-9a83c448346f/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:829 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2ccf8a3de86952995f531b01f4ffdc152c7bc1633eadc15477217fefaa028047/userdata/shm DeviceMajor:0 DeviceMinor:848 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 DeviceMinor:737 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1175 DeviceMajor:0 DeviceMinor:1175 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1177 DeviceMajor:0 DeviceMinor:1177 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~projected/kube-api-access-f99wt DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-514 DeviceMajor:0 DeviceMinor:514 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-734 DeviceMajor:0 DeviceMinor:734 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fbde1a69-b56c-467e-a291-9a83c448346f/volumes/kubernetes.io~projected/kube-api-access-j5f65 DeviceMajor:0 DeviceMinor:834 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4cea9c5260823fbd81d53a50c86a7838135f8c5ff405a05f4d12cdbb4fd9069e/userdata/shm DeviceMajor:0 
DeviceMinor:280 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-332 DeviceMajor:0 DeviceMinor:332 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/60081e3b-4fe5-42a3-bfae-0b07853c0e36/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:539 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a5d009a3-5ff2-49f9-9cea-004b99729b77/volumes/kubernetes.io~projected/kube-api-access-z4khz DeviceMajor:0 DeviceMinor:817 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-875 DeviceMajor:0 DeviceMinor:875 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:682 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~projected/kube-api-access-4khlg DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-721 DeviceMajor:0 DeviceMinor:721 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0c804ea0-3483-4506-b138-3bf30016d8d8/volumes/kubernetes.io~projected/kube-api-access-pxjkt DeviceMajor:0 DeviceMinor:367 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/94fa5656-0561-45ab-9ab1-a6b96b8a47d4/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:761 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/60081e3b-4fe5-42a3-bfae-0b07853c0e36/volumes/kubernetes.io~projected/kube-api-access-ctghm DeviceMajor:0 DeviceMinor:543 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~projected/kube-api-access-wgsm4 DeviceMajor:0 DeviceMinor:746 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-911 DeviceMajor:0 DeviceMinor:911 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1173 DeviceMajor:0 DeviceMinor:1173 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-286 DeviceMajor:0 DeviceMinor:286 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/819c8a76-019b-4124-a67f-fb5838cbec5d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:562 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-750 DeviceMajor:0 DeviceMinor:750 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-943 DeviceMajor:0 DeviceMinor:943 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~secret/client-ca-bundle DeviceMajor:0 DeviceMinor:1159 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-377 DeviceMajor:0 DeviceMinor:377 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/003e840c-1b07-4624-8225-e4ffdab74213/volumes/kubernetes.io~projected/kube-api-access-hmbbm DeviceMajor:0 DeviceMinor:655 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f93ee7db-f0ed-470f-a458-86c54e6d064a/volumes/kubernetes.io~projected/kube-api-access-jstrq DeviceMajor:0 
DeviceMinor:1019 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4705c8f3-89d4-42f2-ad14-e591c5380b9e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/162b513f-2f67-4946-816b-48f6232dfca0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1186 DeviceMajor:0 DeviceMinor:1186 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-757 DeviceMajor:0 DeviceMinor:757 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b179219ef24884bfcb110bf155456a2ca0b90c733780b9ebce585f74be18971b/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-269 DeviceMajor:0 DeviceMinor:269 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-510 DeviceMajor:0 DeviceMinor:510 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-680 DeviceMajor:0 DeviceMinor:680 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/414a48d3cf2977fd972a56aeae2697291bc4403c105ce776bfadef125a18facb/userdata/shm DeviceMajor:0 DeviceMinor:846 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4ca74427-9df7-44ca-ae91-0162f402644b/volumes/kubernetes.io~secret/kube-state-metrics-tls DeviceMajor:0 DeviceMinor:1103 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-132 DeviceMajor:0 DeviceMinor:132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~projected/kube-api-access-rbwcn DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 
Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-610 DeviceMajor:0 DeviceMinor:610 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-328 DeviceMajor:0 DeviceMinor:328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f38df702-a256-4f83-90bb-0c0e477a2ce6/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:462 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-500 DeviceMajor:0 DeviceMinor:500 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-916 DeviceMajor:0 DeviceMinor:916 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a49a8a5e-23fb-46fb-b184-e09da4817028/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:1054 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-306 DeviceMajor:0 DeviceMinor:306 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d405214a0101e6f5383ea0ec9dd8d0328f92f63d0275413db715a759bd43a1b8/userdata/shm DeviceMajor:0 DeviceMinor:476 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-600 DeviceMajor:0 DeviceMinor:600 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-319 DeviceMajor:0 DeviceMinor:319 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-873 DeviceMajor:0 DeviceMinor:873 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-930 DeviceMajor:0 DeviceMinor:930 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3e1b71a7-c9fd-4f03-8d0c-f092dac80989/volumes/kubernetes.io~projected/kube-api-access-mgr4k DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-877 DeviceMajor:0 DeviceMinor:877 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768/userdata/shm DeviceMajor:0 DeviceMinor:282 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-534 DeviceMajor:0 DeviceMinor:534 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5b46c6b4-f701-4a7f-bf74-90afe14ad89a/volumes/kubernetes.io~projected/kube-api-access-s98qv DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f233927f2993af718960afb2ae363b672383e871d2174bdf0f0ce3e81e24474f/userdata/shm DeviceMajor:0 DeviceMinor:853 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-871 DeviceMajor:0 DeviceMinor:871 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-754 DeviceMajor:0 DeviceMinor:754 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:258 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/57d19c68-b469-4a6f-abe0-3a0eff32639c/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:1015 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-317 DeviceMajor:0 DeviceMinor:317 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/07eef031-edcc-415c-91d7-f9f3bd0a45e3/volumes/kubernetes.io~projected/kube-api-access-6frf6 DeviceMajor:0 DeviceMinor:683 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-885 DeviceMajor:0 DeviceMinor:885 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1162 DeviceMajor:0 DeviceMinor:1162 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e3d494447990668745e14695bfe1a5e74ab5c046360e769658a99bdf18dabcb5/userdata/shm DeviceMajor:0 DeviceMinor:1021 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-364 DeviceMajor:0 DeviceMinor:364 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~projected/kube-api-access-w4ms4 DeviceMajor:0 DeviceMinor:99 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e617e515-a899-41e8-9961-df68e49da4d3/volumes/kubernetes.io~projected/kube-api-access-bptbc DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-623 DeviceMajor:0 DeviceMinor:623 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-953 DeviceMajor:0 DeviceMinor:953 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1081 DeviceMajor:0 DeviceMinor:1081 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1134 DeviceMajor:0 DeviceMinor:1134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~projected/kube-api-access-8ntj9 DeviceMajor:0 DeviceMinor:444 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/b0133cb5-e425-40af-a051-71b2362a89c3/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-110 DeviceMajor:0 DeviceMinor:110 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-415 DeviceMajor:0 DeviceMinor:415 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0c58fd5ea7dfc487f516ab9f02025ffdb09d90a9ab5d62ff3f2b99a062549935/userdata/shm DeviceMajor:0 DeviceMinor:851 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-459 DeviceMajor:0 DeviceMinor:459 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~secret/secret-metrics-server-tls DeviceMajor:0 DeviceMinor:1158 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6/volumes/kubernetes.io~projected/kube-api-access-pxt2h DeviceMajor:0 DeviceMinor:402 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/33f6e54f-a750-4c91-b4f5-049275d33cfb/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:458 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-314 DeviceMajor:0 DeviceMinor:314 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1023 DeviceMajor:0 DeviceMinor:1023 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/64136215-8539-4349-977f-6d59deb132ea/volumes/kubernetes.io~projected/kube-api-access-cgtrh DeviceMajor:0 DeviceMinor:260 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-961 DeviceMajor:0 DeviceMinor:961 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/518e1c6f-7f46-4dda-84a3-49fec9fa1abe/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a5cf6404-1640-43d3-8018-03c62c8338c5/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:702 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-796 DeviceMajor:0 DeviceMinor:796 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/67caa8f56a1602e423b6488bd40c30339516056ecee5a3793a01c82042f70327/userdata/shm DeviceMajor:0 DeviceMinor:952 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:804 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-353 DeviceMajor:0 DeviceMinor:353 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-628 DeviceMajor:0 DeviceMinor:628 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-345 DeviceMajor:0 DeviceMinor:345 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/28c9e269179e8f2b2c8470e0dc5902c0174b162f0c6e40253ff15a0c8e960364/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-294 DeviceMajor:0 DeviceMinor:294 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-290 DeviceMajor:0 DeviceMinor:290 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-278 DeviceMajor:0 DeviceMinor:278 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e90c10f6-36d9-44bd-a4c6-174f12135434/volumes/kubernetes.io~projected/kube-api-access-r2knz DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6b5887fbc5e32ebc2f94544f4a59e3db7e9426484412d0d7e1662fd8761f75d2/userdata/shm DeviceMajor:0 DeviceMinor:371 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1065 DeviceMajor:0 DeviceMinor:1065 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ac946049-4b93-4d8c-bfa6-eacf83160ed1/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-276 DeviceMajor:0 DeviceMinor:276 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6d017409-a362-4e58-87af-d02b3834fdd4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/kube-api-access-bwp24 DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-497 DeviceMajor:0 DeviceMinor:497 Capacity:214143315968 Type:vfs Inodes:104594880
HasInodes:true} {Device:/var/lib/kubelet/pods/4e7b04dc-e85d-41b8-959c-87b1b24731a1/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:405 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3a185e40-e97b-49e9-aea3-5f170401a9e3/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:815 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-292 DeviceMajor:0 DeviceMinor:292 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b1eb508b26c0e3a04c674bb8b166ab63a387c4e22fe8322ed85b0094f61f22e8/userdata/shm DeviceMajor:0 DeviceMinor:438 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ff5132ef0b49543e2892994f5f8254a3ce97fc30f20a77cbc390ce279601737b/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4d790b42-2182-4b9c-8391-285e938715cc/volumes/kubernetes.io~projected/kube-api-access-zh5cs DeviceMajor:0 DeviceMinor:370 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-591 DeviceMajor:0 DeviceMinor:591 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/46de87bd542fc48b4c2304420a056b910916afb6d58cd22606b6fcceab380367/userdata/shm DeviceMajor:0 
DeviceMinor:267 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-884 DeviceMajor:0 DeviceMinor:884 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-163 DeviceMajor:0 DeviceMinor:163 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:442 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e47d3935-d96e-4fd2-abfc-99bff7eac5e0/volumes/kubernetes.io~projected/kube-api-access-t85jj DeviceMajor:0 DeviceMinor:951 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:656 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fb94aad616f605f0c5b2b89174012856582d923dd7eb37448d29cf2ee0f920af/userdata/shm DeviceMajor:0 DeviceMinor:479 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-506 DeviceMajor:0 DeviceMinor:506 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-378 DeviceMajor:0 DeviceMinor:378 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-730 DeviceMajor:0 DeviceMinor:730 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-524 DeviceMajor:0 DeviceMinor:524 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-92 DeviceMajor:0 DeviceMinor:92 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1188 DeviceMajor:0 DeviceMinor:1188 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/367ea4e8-8172-4730-8f26-499ed7c167bb/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:253 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5bea1d036c9faad13c1d14003487235e6d8ac60afd384e78842d83618960ce00/userdata/shm DeviceMajor:0 DeviceMinor:529 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2eac945a64b009901141c6c6e11ad1de23d05d8f601554a1ef6d7d7a8ce9169c/userdata/shm DeviceMajor:0 DeviceMinor:1075 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1129 DeviceMajor:0 DeviceMinor:1129 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1000 DeviceMajor:0 DeviceMinor:1000 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/eb274dd5-e98e-4485-a050-7bec560a0daa/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:821 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/166b56b9c6c1434919d567d8df963a4c549223c203f79070beac456f5b57a166/userdata/shm DeviceMajor:0 DeviceMinor:100 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3f3c0234-248a-4d6f-9aca-6582a481a911/volumes/kubernetes.io~projected/kube-api-access-8cvpt DeviceMajor:0 DeviceMinor:257 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/ea9e9d1680b5aa838fb75a46daa08b37fa0baf3eb1a7aa70313d288a46e0ae6c/userdata/shm DeviceMajor:0 DeviceMinor:464 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4d625e92-793e-4942-a826-d30d21e3ca0c/volumes/kubernetes.io~projected/kube-api-access-gczvq DeviceMajor:0 DeviceMinor:450 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-164 DeviceMajor:0 DeviceMinor:164 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-893 DeviceMajor:0 DeviceMinor:893 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/39f479b3-45ff-43f9-ae85-083554a54918/volumes/kubernetes.io~projected/kube-api-access-79fmb DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/07eef031-edcc-415c-91d7-f9f3bd0a45e3/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:556 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-823 DeviceMajor:0 DeviceMinor:823 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-825 DeviceMajor:0 DeviceMinor:825 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7478a36d7c10b347071336898ec1420b6e593642e73df5491b35d12f5dad7fb8/userdata/shm DeviceMajor:0 DeviceMinor:240 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-869 DeviceMajor:0 DeviceMinor:869 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-959 DeviceMajor:0 DeviceMinor:959 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1067 DeviceMajor:0 DeviceMinor:1067 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-508 DeviceMajor:0 DeviceMinor:508 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0dda2db92925393742e73de4713061d6f57b6007ca6270c56cb0bd0c372a97ea/userdata/shm DeviceMajor:0 DeviceMinor:937 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7af28c70-3ee0-438b-8655-95bc57deaa05/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:493 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-717 DeviceMajor:0 DeviceMinor:717 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a49a8a5e-23fb-46fb-b184-e09da4817028/volumes/kubernetes.io~projected/kube-api-access-k5z4r DeviceMajor:0 DeviceMinor:1062 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b92c4c09-a796-4e00-821a-32fc9c8ca214/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1069 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4ca74427-9df7-44ca-ae91-0162f402644b/volumes/kubernetes.io~projected/kube-api-access-69pp5 DeviceMajor:0 DeviceMinor:1104 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-69 DeviceMajor:0 DeviceMinor:69 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b92c4c09-a796-4e00-821a-32fc9c8ca214/volumes/kubernetes.io~projected/kube-api-access-6h6qg DeviceMajor:0 DeviceMinor:1074 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/07804be1-d37c-474e-a179-01ac541d9705/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/84e5577e-df8b-46a2-aff7-0bf90b5010d9/volumes/kubernetes.io~projected/kube-api-access-k9vzl DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8db1413c53086a5c6db2b83441d84f9f315ce4cbb909fde99b5104ffbf2df8f1/userdata/shm DeviceMajor:0 DeviceMinor:471 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9bc0a922f5672d138d9b5c38908ff9271f1747dee23c3b328889da1f8a25634f/userdata/shm DeviceMajor:0 DeviceMinor:587 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a5d009a3-5ff2-49f9-9cea-004b99729b77/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:814 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a0c37fe14e6ead412d8c2590ada12be7a2da79d1d1f7b5edc5d8494f1a42a084/userdata/shm DeviceMajor:0 DeviceMinor:1063 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c479beae-6d34-4d49-81c8-39f6a53ff6ca/volumes/kubernetes.io~projected/kube-api-access-xxm2c DeviceMajor:0 DeviceMinor:112 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-413 DeviceMajor:0 DeviceMinor:413 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/27bfc2e7edbcc6c9da4d8e8c7defa74d634ff9325db83c99710558275c0105da/userdata/shm DeviceMajor:0 DeviceMinor:583 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/63e81c6a-b723-4c9a-b471-226fefa9c7a7/volumes/kubernetes.io~projected/kube-api-access-r97ml DeviceMajor:0 DeviceMinor:822 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1086 DeviceMajor:0 DeviceMinor:1086 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-354 DeviceMajor:0 DeviceMinor:354 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6/volumes/kubernetes.io~projected/kube-api-access-f7dp9 DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-447 DeviceMajor:0 DeviceMinor:447 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-607 DeviceMajor:0 DeviceMinor:607 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fed06e56f643611f5a7965ed4f8766bb3618f49ec311e6c29d1c5641de5af6fe/userdata/shm DeviceMajor:0 DeviceMinor:689 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-697 DeviceMajor:0 DeviceMinor:697 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-67 DeviceMajor:0 DeviceMinor:67 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f65be54c-d851-4daa-93a0-8a3947da5e26/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:326 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1038 DeviceMajor:0 DeviceMinor:1038 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-380 DeviceMajor:0 DeviceMinor:380 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-302 DeviceMajor:0 DeviceMinor:302 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-518 DeviceMajor:0 DeviceMinor:518 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:561 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc/volumes/kubernetes.io~projected/kube-api-access-nwmkf DeviceMajor:0 DeviceMinor:528 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/46108693-3c9e-4804-beb0-d6b8f8df7625/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:744 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/088cbe88ab52727a6e61abb80a2ff6d59de2b9b87fe822e87c7612a6a67923bc/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d263ddac-1ede-474f-b3bb-1f1f4f7846a3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/39f479b3-45ff-43f9-ae85-083554a54918/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:463 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d318202a66572196e0681affd0ab554e8fa6ff748647c2d4a3aedba6fee2ec96/userdata/shm DeviceMajor:0 DeviceMinor:770 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-331 DeviceMajor:0 DeviceMinor:331 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1125 DeviceMajor:0 DeviceMinor:1125 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1079 DeviceMajor:0 DeviceMinor:1079 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/kubelet/pods/99ffcc9a-4b01-439a-93e0-0674bdfd32ec/volumes/kubernetes.io~projected/kube-api-access-jd875 DeviceMajor:0 DeviceMinor:266 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/731fd6a0a7a380afb3c5fb06f8e5e559e768934564ab7fa113864f10ea901855/userdata/shm DeviceMajor:0 DeviceMinor:368 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9dd3eeed986a63782457294b98f428fdf5ac5bd98b40cab991676824cbd680b6/userdata/shm DeviceMajor:0 DeviceMinor:109 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:830 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f5148a50-4332-4ab4-af63-449ef7f16333/volumes/kubernetes.io~projected/kube-api-access-p9zjb DeviceMajor:0 DeviceMinor:490 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-96 DeviceMajor:0 DeviceMinor:96 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-375 DeviceMajor:0 DeviceMinor:375 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:440 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-618 DeviceMajor:0 DeviceMinor:618 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-386 DeviceMajor:0 DeviceMinor:386 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-512 DeviceMajor:0 DeviceMinor:512 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-626 DeviceMajor:0 DeviceMinor:626 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a938fec278d403193e581844106b17006b55efc9ca616fe3e7d8c448d817c442/userdata/shm DeviceMajor:0 DeviceMinor:857 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1145 DeviceMajor:0 
DeviceMinor:1145 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-169 DeviceMajor:0 DeviceMinor:169 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-550 DeviceMajor:0 DeviceMinor:550 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-400 DeviceMajor:0 DeviceMinor:400 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:831 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1127 DeviceMajor:0 DeviceMinor:1127 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/aab47a39-a18b-4350-b141-4d5de2d8e96f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-343 DeviceMajor:0 DeviceMinor:343 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/24bd6a9b-9934-4cf9-b591-1ab892c8014c/volumes/kubernetes.io~projected/kube-api-access-jd74j DeviceMajor:0 DeviceMinor:434 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-949 DeviceMajor:0 DeviceMinor:949 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1131 DeviceMajor:0 DeviceMinor:1131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1010 DeviceMajor:0 DeviceMinor:1010 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/514e629483d9ce3a471489bf8ebd7e39de0e8072f7a78d8b93a7ef5b4897dfbc/userdata/shm DeviceMajor:0 DeviceMinor:544 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-106 DeviceMajor:0 DeviceMinor:106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-284 DeviceMajor:0 DeviceMinor:284 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b8b47e7cae81e2e8fc5d1fdd94e1150be80c99269a023622d4d29934167c52d2/userdata/shm DeviceMajor:0 DeviceMinor:475 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-424 DeviceMajor:0 DeviceMinor:424 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/131e2573-ef74-4963-bdcf-901310e7a486/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:461 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-504 DeviceMajor:0 DeviceMinor:504 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0f2278c1-05cf-469c-a13c-762b7a790b38/volumes/kubernetes.io~projected/kube-api-access-rztqj DeviceMajor:0 DeviceMinor:835 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/09282e98-3140-41c9-863f-67e0e9f942cc/volumes/kubernetes.io~secret/openshift-state-metrics-tls DeviceMajor:0 DeviceMinor:1096 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e5b8532c133a52bef4f3c34e53a76d2b97e7bbb39e5937c556c2dbe0cf940780/userdata/shm DeviceMajor:0 DeviceMinor:474 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d34c10b1c43979eaba12a772c86e5b04cdb550f06f5159184d7c1b781b1802f8/userdata/shm DeviceMajor:0 DeviceMinor:840 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-939 DeviceMajor:0 DeviceMinor:939 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-190 DeviceMajor:0 DeviceMinor:190 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/cd89b54332e25652976ac12f30441cd06118b9d3055fbb57b6418f135e33839a/userdata/shm DeviceMajor:0 DeviceMinor:469 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-758 DeviceMajor:0 DeviceMinor:758 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-336 DeviceMajor:0 DeviceMinor:336 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cad025d780875a4ff5ddd5b35a2f470eaade918be2cd73b0ab7727feccc93241/userdata/shm DeviceMajor:0 DeviceMinor:679 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/bcc25a4a-521a-4139-a10c-d5b8bba6e359/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:842 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/51e2c1d1-a50f-4a0d-8273-440e699b3907/volumes/kubernetes.io~projected/kube-api-access-j8mds DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:456 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/107d14368724098585e35bbcdeba06ce6763b263ff68a1c0db81b1b11f0c59b6/userdata/shm DeviceMajor:0 DeviceMinor:648 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-300 DeviceMajor:0 DeviceMinor:300 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/78d3b2df-3d81-413e-b039-80dc20111eae/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:457 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes/kubernetes.io~secret/secret-metrics-client-certs DeviceMajor:0 DeviceMinor:1160 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-384 DeviceMajor:0 DeviceMinor:384 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0c58fd5ea7dfc48 MacAddress:0e:36:cf:bd:cb:61 Speed:10000 Mtu:8900} {Name:107d14368724098 MacAddress:36:d0:7a:5a:a8:f0 Speed:10000 Mtu:8900} {Name:153273eefe31c48 MacAddress:e6:e4:95:17:0b:db Speed:10000 Mtu:8900} {Name:28c9e269179e8f2 MacAddress:be:f0:95:e6:83:08 Speed:10000 Mtu:8900} {Name:2ccf8a3de869529 MacAddress:b2:4e:7e:bc:f2:82 Speed:10000 Mtu:8900} {Name:2eac945a64b0099 MacAddress:02:e0:7e:08:37:65 Speed:10000 Mtu:8900} {Name:3e041b219895e4a MacAddress:8a:4a:d7:f2:ac:8b Speed:10000 Mtu:8900} {Name:40169c23a8e39df MacAddress:ae:91:eb:a1:34:28 Speed:10000 Mtu:8900} {Name:414a48d3cf2977f MacAddress:0a:ed:7d:56:ac:31 Speed:10000 Mtu:8900} {Name:444bcf401c29be2 MacAddress:56:fb:d5:dc:5f:40 Speed:10000 Mtu:8900} {Name:4cea9c5260823fb MacAddress:fa:f0:04:57:72:5f Speed:10000 Mtu:8900} {Name:4daf44a10b1f8c7 MacAddress:46:9a:61:35:b1:16 Speed:10000 Mtu:8900} {Name:514e629483d9ce3 MacAddress:6e:74:98:2d:64:84 Speed:10000 Mtu:8900} {Name:5bea1d036c9faad MacAddress:c6:f9:ee:b5:bf:7e Speed:10000 Mtu:8900} {Name:6b5887fbc5e32eb MacAddress:5e:bd:db:5a:03:60 Speed:10000 Mtu:8900} {Name:6f9d7a4e397f586 MacAddress:9a:48:d6:fb:cc:f5 Speed:10000 Mtu:8900} {Name:731fd6a0a7a380a MacAddress:4a:05:90:02:e4:c6 Speed:10000 Mtu:8900} {Name:7478a36d7c10b34 MacAddress:0e:20:8f:2a:0c:c8 Speed:10000 Mtu:8900} {Name:768bbdfdb22a3fa MacAddress:c6:02:b0:f7:a5:e5 Speed:10000 Mtu:8900} {Name:76d682fd002c893 MacAddress:aa:44:1c:79:8c:90 
Speed:10000 Mtu:8900} {Name:781654899d52ad0 MacAddress:b6:ed:81:c6:6f:6b Speed:10000 Mtu:8900} {Name:7ccb1bded92273b MacAddress:0a:0f:40:62:4d:00 Speed:10000 Mtu:8900} {Name:854a387c5ef7958 MacAddress:b6:c4:1b:a5:a7:a9 Speed:10000 Mtu:8900} {Name:8db1413c53086a5 MacAddress:32:27:16:c2:55:c0 Speed:10000 Mtu:8900} {Name:9246f0557504641 MacAddress:2a:bd:55:61:2b:62 Speed:10000 Mtu:8900} {Name:9bc0a922f5672d1 MacAddress:a6:e7:1b:fc:29:4b Speed:10000 Mtu:8900} {Name:9dd3eeed986a637 MacAddress:f2:56:cd:66:6b:bc Speed:10000 Mtu:8900} {Name:9f7852cce265f47 MacAddress:ae:9c:7a:da:9b:84 Speed:10000 Mtu:8900} {Name:a297c2fb1d857d0 MacAddress:06:dc:f9:f6:3a:1e Speed:10000 Mtu:8900} {Name:a77515f16c17046 MacAddress:7e:ed:a8:45:86:f7 Speed:10000 Mtu:8900} {Name:a938fec278d4031 MacAddress:d2:77:9f:f5:db:d7 Speed:10000 Mtu:8900} {Name:a99d279c1652e2b MacAddress:86:26:d1:8d:b6:63 Speed:10000 Mtu:8900} {Name:b179219ef24884b MacAddress:da:62:64:19:6d:13 Speed:10000 Mtu:8900} {Name:b1eb508b26c0e3a MacAddress:5e:6e:7a:1c:8c:9b Speed:10000 Mtu:8900} {Name:b8b47e7cae81e2e MacAddress:e6:b5:a1:41:0a:f7 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:f2:de:5d:0a:11:40 Speed:0 Mtu:8900} {Name:c30aed6efb2687e MacAddress:62:76:48:7f:b8:66 Speed:10000 Mtu:8900} {Name:c88934bb050c8e7 MacAddress:d6:8d:4e:7d:dd:e2 Speed:10000 Mtu:8900} {Name:cad025d780875a4 MacAddress:6a:fb:86:f2:a4:2e Speed:10000 Mtu:8900} {Name:cd89b54332e2565 MacAddress:8e:a7:a6:04:26:dc Speed:10000 Mtu:8900} {Name:ce92bb2deaa819f MacAddress:8a:52:de:c9:e9:ec Speed:10000 Mtu:8900} {Name:d0c18b672eeccf5 MacAddress:56:70:41:75:96:db Speed:10000 Mtu:8900} {Name:d318202a6657219 MacAddress:1a:8d:79:13:ef:17 Speed:10000 Mtu:8900} {Name:d34c10b1c43979e MacAddress:72:b2:1b:e2:e2:23 Speed:10000 Mtu:8900} {Name:d405214a0101e6f MacAddress:1a:7b:17:c1:7c:1a Speed:10000 Mtu:8900} {Name:d8f0e6a1ec121bb MacAddress:86:3c:51:f2:fc:6c Speed:10000 Mtu:8900} {Name:da05a78204256cc 
MacAddress:8e:68:da:cd:35:95 Speed:10000 Mtu:8900} {Name:e4eb5ff21caa542 MacAddress:f2:1e:81:ab:2a:52 Speed:10000 Mtu:8900} {Name:e5b8532c133a52b MacAddress:9a:d0:b4:36:a8:e5 Speed:10000 Mtu:8900} {Name:ea9e9d1680b5aa8 MacAddress:96:26:54:bc:29:53 Speed:10000 Mtu:8900} {Name:ef597ecc3cb239a MacAddress:a2:5e:ce:c3:f4:46 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:a8:4b:cf Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:ea:ac:db Speed:-1 Mtu:9000} {Name:f233927f2993af7 MacAddress:4e:04:77:20:d1:55 Speed:10000 Mtu:8900} {Name:f66b3d91451352d MacAddress:8e:db:73:e8:d5:76 Speed:10000 Mtu:8900} {Name:f69cbe55fe655a8 MacAddress:d2:b6:07:b1:6f:3d Speed:10000 Mtu:8900} {Name:fb94aad616f605f MacAddress:82:7e:8f:6f:77:96 Speed:10000 Mtu:8900} {Name:fc7608745c820a5 MacAddress:16:27:30:38:ba:65 Speed:10000 Mtu:8900} {Name:fed06e56f643611 MacAddress:26:b1:ab:9b:3c:9a Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:4a:5a:35:ca:89:5d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 19 12:09:52.816474 master-0 kubenswrapper[29612]: I0319 12:09:52.815938 29612 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 19 12:09:52.816474 master-0 kubenswrapper[29612]: I0319 12:09:52.816083 29612 manager.go:233] Version: {KernelVersion:5.14.0-427.113.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202603021444-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 19 12:09:52.816474 master-0 kubenswrapper[29612]: I0319 12:09:52.816395 29612 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 19 12:09:52.816950 master-0 kubenswrapper[29612]: I0319 12:09:52.816621 29612 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 19 12:09:52.817508 master-0 kubenswrapper[29612]: I0319 12:09:52.816692 29612 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 19 12:09:52.817578 master-0 kubenswrapper[29612]: I0319 12:09:52.817566 29612 topology_manager.go:138] "Creating topology manager with none policy"
Mar 19 12:09:52.817617 master-0 kubenswrapper[29612]: I0319 12:09:52.817584 29612 container_manager_linux.go:303] "Creating device plugin manager"
Mar 19 12:09:52.817617 master-0 kubenswrapper[29612]: I0319 12:09:52.817596 29612 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 12:09:52.817692 master-0 kubenswrapper[29612]: I0319 12:09:52.817626 29612 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 12:09:52.817692 master-0 kubenswrapper[29612]: I0319 12:09:52.817687 29612 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 12:09:52.817803 master-0 kubenswrapper[29612]: I0319 12:09:52.817783 29612 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 19 12:09:52.817882 master-0 kubenswrapper[29612]: I0319 12:09:52.817865 29612 kubelet.go:418] "Attempting to sync node with API server"
Mar 19 12:09:52.817944 master-0 kubenswrapper[29612]: I0319 12:09:52.817887 29612 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 19 12:09:52.817944 master-0 kubenswrapper[29612]: I0319 12:09:52.817904 29612 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 19 12:09:52.817944 master-0 kubenswrapper[29612]: I0319 12:09:52.817919 29612 kubelet.go:324] "Adding apiserver pod source"
Mar 19 12:09:52.817944 master-0 kubenswrapper[29612]: I0319 12:09:52.817940 29612 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 19 12:09:52.819354 master-0 kubenswrapper[29612]: I0319 12:09:52.819312 29612 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 19 12:09:52.819830 master-0 kubenswrapper[29612]: I0319 12:09:52.819802 29612 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 19 12:09:52.820167 master-0 kubenswrapper[29612]: I0319 12:09:52.820140 29612 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 19 12:09:52.820336 master-0 kubenswrapper[29612]: I0319 12:09:52.820306 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 19 12:09:52.820395 master-0 kubenswrapper[29612]: I0319 12:09:52.820343 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 19 12:09:52.820395 master-0 kubenswrapper[29612]: I0319 12:09:52.820356 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 19 12:09:52.820395 master-0 kubenswrapper[29612]: I0319 12:09:52.820366 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 19 12:09:52.820395 master-0 kubenswrapper[29612]: I0319 12:09:52.820376 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 19 12:09:52.820395 master-0 kubenswrapper[29612]: I0319 12:09:52.820386 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 19 12:09:52.820395 master-0 kubenswrapper[29612]: I0319 12:09:52.820398 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 19 12:09:52.820666 master-0 kubenswrapper[29612]: I0319 12:09:52.820409 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 19 12:09:52.820666 master-0 kubenswrapper[29612]: I0319 12:09:52.820422 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 19 12:09:52.820666 master-0 kubenswrapper[29612]: I0319 12:09:52.820432 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 19 12:09:52.820666 master-0 kubenswrapper[29612]: I0319 12:09:52.820453 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 19 12:09:52.820666 master-0 kubenswrapper[29612]: I0319 12:09:52.820471 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 19 12:09:52.820666 master-0 kubenswrapper[29612]: I0319 12:09:52.820525 29612 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 19 12:09:52.821488 master-0 kubenswrapper[29612]: I0319 12:09:52.821465 29612 server.go:1280] "Started kubelet"
Mar 19 12:09:52.822114 master-0 kubenswrapper[29612]: I0319 12:09:52.822061 29612 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 19 12:09:52.822235 master-0 kubenswrapper[29612]: I0319 12:09:52.822084 29612 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 19 12:09:52.822235 master-0 kubenswrapper[29612]: I0319 12:09:52.822166 29612 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 19 12:09:52.822592 master-0 kubenswrapper[29612]: I0319 12:09:52.822564 29612 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 19 12:09:52.824224 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 19 12:09:52.824709 master-0 kubenswrapper[29612]: I0319 12:09:52.824221 29612 server.go:449] "Adding debug handlers to kubelet server"
Mar 19 12:09:52.844803 master-0 kubenswrapper[29612]: I0319 12:09:52.844770 29612 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 12:09:52.846513 master-0 kubenswrapper[29612]: I0319 12:09:52.846461 29612 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 12:09:52.859159 master-0 kubenswrapper[29612]: I0319 12:09:52.859120 29612 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 19 12:09:52.859382 master-0 kubenswrapper[29612]: I0319 12:09:52.859372 29612 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 19 12:09:52.862205 master-0 kubenswrapper[29612]: I0319 12:09:52.862117 29612 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-20 11:43:57 +0000 UTC, rotation deadline is 2026-03-20 08:54:03.502663044 +0000 UTC
Mar 19 12:09:52.874767 master-0 kubenswrapper[29612]: I0319 12:09:52.862400 29612 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 19 12:09:52.874767 master-0 kubenswrapper[29612]: I0319 12:09:52.861909 29612 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 19 12:09:52.875105 master-0 kubenswrapper[29612]: I0319 12:09:52.874796 29612 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 19 12:09:52.875105 master-0 kubenswrapper[29612]: I0319 12:09:52.863692 29612 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 12:09:52.875226 master-0 kubenswrapper[29612]: I0319 12:09:52.874743 29612 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h44m10.62794006s for next certificate rotation
Mar 19 12:09:52.875277 master-0 kubenswrapper[29612]: I0319 12:09:52.864290 29612 factory.go:55] Registering systemd factory
Mar 19 12:09:52.875342 master-0 kubenswrapper[29612]: I0319 12:09:52.875333 29612 factory.go:221] Registration of the systemd container factory successfully
Mar 19 12:09:52.877985 master-0 kubenswrapper[29612]: I0319 12:09:52.877872 29612 factory.go:153] Registering CRI-O factory
Mar 19 12:09:52.878043 master-0 kubenswrapper[29612]: I0319 12:09:52.877988 29612 factory.go:221] Registration of the crio container factory successfully
Mar 19 12:09:52.878212 master-0 kubenswrapper[29612]: I0319 12:09:52.878184 29612 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 19 12:09:52.878262 master-0 kubenswrapper[29612]: I0319 12:09:52.878222 29612 factory.go:103] Registering Raw factory
Mar 19 12:09:52.878262 master-0 kubenswrapper[29612]: I0319 12:09:52.878247 29612 manager.go:1196] Started watching for new ooms in manager
Mar 19 12:09:52.882185 master-0 kubenswrapper[29612]: I0319 12:09:52.882144 29612 manager.go:319] Starting recovery of all containers
Mar 19 12:09:52.883332 master-0 kubenswrapper[29612]: E0319 12:09:52.883296 29612 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 19 12:09:52.890280 master-0 kubenswrapper[29612]: I0319 12:09:52.890212 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" volumeName="kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-policies" seLinuxMountContext=""
Mar 19 12:09:52.890280 master-0 kubenswrapper[29612]: I0319 12:09:52.890278 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84e5577e-df8b-46a2-aff7-0bf90b5010d9" volumeName="kubernetes.io/configmap/84e5577e-df8b-46a2-aff7-0bf90b5010d9-config" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890296 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a49a8a5e-23fb-46fb-b184-e09da4817028" volumeName="kubernetes.io/projected/a49a8a5e-23fb-46fb-b184-e09da4817028-kube-api-access-k5z4r" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890311 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b0133cb5-e425-40af-a051-71b2362a89c3" volumeName="kubernetes.io/secret/b0133cb5-e425-40af-a051-71b2362a89c3-metrics-tls" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890324 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d52981e0-0bc4-4dfa-81d5-aeab63fa23e0" volumeName="kubernetes.io/projected/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-kube-api-access-rvp77" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890336 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64136215-8539-4349-977f-6d59deb132ea" volumeName="kubernetes.io/configmap/64136215-8539-4349-977f-6d59deb132ea-iptables-alerter-script" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890353 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f2d1d03-7427-45a0-a917-173bf9df9902" volumeName="kubernetes.io/empty-dir/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-textfile" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890363 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e8c4676-05af-4478-a44b-1bdcbefdea0f" volumeName="kubernetes.io/projected/6e8c4676-05af-4478-a44b-1bdcbefdea0f-kube-api-access-pdmj8" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890388 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d263ddac-1ede-474f-b3bb-1f1f4f7846a3" volumeName="kubernetes.io/secret/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890398 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="131e2573-ef74-4963-bdcf-901310e7a486" volumeName="kubernetes.io/projected/131e2573-ef74-4963-bdcf-901310e7a486-kube-api-access-4sprd" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890407 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d625e92-793e-4942-a826-d30d21e3ca0c" volumeName="kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-utilities" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890416 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="518e1c6f-7f46-4dda-84a3-49fec9fa1abe" volumeName="kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-env-overrides" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890424 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="162b513f-2f67-4946-816b-48f6232dfca0" volumeName="kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-service-ca-bundle" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890435 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0c804ea0-3483-4506-b138-3bf30016d8d8" volumeName="kubernetes.io/projected/0c804ea0-3483-4506-b138-3bf30016d8d8-kube-api-access-pxjkt" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890445 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" volumeName="kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890453 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6" volumeName="kubernetes.io/configmap/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890462 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a49a8a5e-23fb-46fb-b184-e09da4817028" volumeName="kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-certs" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890471 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="003e840c-1b07-4624-8225-e4ffdab74213" volumeName="kubernetes.io/projected/003e840c-1b07-4624-8225-e4ffdab74213-kube-api-access-hmbbm" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890496 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d3b2df-3d81-413e-b039-80dc20111eae" volumeName="kubernetes.io/configmap/78d3b2df-3d81-413e-b039-80dc20111eae-trusted-ca" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890508 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7af28c70-3ee0-438b-8655-95bc57deaa05" volumeName="kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-image-import-ca" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890516 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58a4e3af-c5e1-4487-92a2-81dfb696695e" volumeName="kubernetes.io/empty-dir/58a4e3af-c5e1-4487-92a2-81dfb696695e-cache" seLinuxMountContext=""
Mar 19 12:09:52.890500 master-0 kubenswrapper[29612]: I0319 12:09:52.890525 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="60081e3b-4fe5-42a3-bfae-0b07853c0e36" volumeName="kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-kube-api-access-ctghm" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890534 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a49a8a5e-23fb-46fb-b184-e09da4817028" volumeName="kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-node-bootstrap-token" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890543 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac946049-4b93-4d8c-bfa6-eacf83160ed1" volumeName="kubernetes.io/configmap/ac946049-4b93-4d8c-bfa6-eacf83160ed1-config" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890551 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b92c4c09-a796-4e00-821a-32fc9c8ca214" volumeName="kubernetes.io/projected/b92c4c09-a796-4e00-821a-32fc9c8ca214-kube-api-access-6h6qg" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890560 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bea42722-d7af-4938-9f3a-da57bd056f96" volumeName="kubernetes.io/secret/bea42722-d7af-4938-9f3a-da57bd056f96-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890570 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d979be56-f948-4252-ab6b-9bfc48dc4187" volumeName="kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-utilities" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890580 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f38df702-a256-4f83-90bb-0c0e477a2ce6" volumeName="kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-bound-sa-token" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890591 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46108693-3c9e-4804-beb0-d6b8f8df7625" volumeName="kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-stats-auth" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890599 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a" volumeName="kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-auth-proxy-config" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890617 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f3c0234-248a-4d6f-9aca-6582a481a911" volumeName="kubernetes.io/secret/3f3c0234-248a-4d6f-9aca-6582a481a911-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890627 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="367ea4e8-8172-4730-8f26-499ed7c167bb" volumeName="kubernetes.io/secret/367ea4e8-8172-4730-8f26-499ed7c167bb-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890652 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7af28c70-3ee0-438b-8655-95bc57deaa05" volumeName="kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-encryption-config" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890661 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/projected/aab47a39-a18b-4350-b141-4d5de2d8e96f-kube-api-access-scp9p" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890671 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c479beae-6d34-4d49-81c8-39f6a53ff6ca" volumeName="kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-binary-copy" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890680 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e617e515-a899-41e8-9961-df68e49da4d3" volumeName="kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-env-overrides" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890689 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0fed1dcc-6fc0-41ef-8236-e095dca4ddcc" volumeName="kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cni-binary-copy" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890698 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57d19c68-b469-4a6f-abe0-3a0eff32639c" volumeName="kubernetes.io/projected/57d19c68-b469-4a6f-abe0-3a0eff32639c-kube-api-access-v8qjv" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890706 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="63e81c6a-b723-4c9a-b471-226fefa9c7a7" volumeName="kubernetes.io/configmap/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cco-trusted-ca" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890714 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" volumeName="kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890723 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5cf6404-1640-43d3-8018-03c62c8338c5" volumeName="kubernetes.io/secret/a5cf6404-1640-43d3-8018-03c62c8338c5-metrics-tls" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890731 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4705c8f3-89d4-42f2-ad14-e591c5380b9e" volumeName="kubernetes.io/secret/4705c8f3-89d4-42f2-ad14-e591c5380b9e-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890740 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b4a2e47-e85a-4474-a1b7-3eb85283dfcf" volumeName="kubernetes.io/secret/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890749 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="367ea4e8-8172-4730-8f26-499ed7c167bb" volumeName="kubernetes.io/projected/367ea4e8-8172-4730-8f26-499ed7c167bb-kube-api-access" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890758 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="07eef031-edcc-415c-91d7-f9f3bd0a45e3" volumeName="kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-tuned" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890768 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7af28c70-3ee0-438b-8655-95bc57deaa05" volumeName="kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890776 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a" volumeName="kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-config" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890785 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-config" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890794 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-client" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890803 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="003e840c-1b07-4624-8225-e4ffdab74213" volumeName="kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-catalog-content" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890812 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46108693-3c9e-4804-beb0-d6b8f8df7625" volumeName="kubernetes.io/projected/46108693-3c9e-4804-beb0-d6b8f8df7625-kube-api-access-wgsm4" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890820 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="60081e3b-4fe5-42a3-bfae-0b07853c0e36" volumeName="kubernetes.io/empty-dir/60081e3b-4fe5-42a3-bfae-0b07853c0e36-cache" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890832 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="60081e3b-4fe5-42a3-bfae-0b07853c0e36" volumeName="kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-ca-certs" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890841 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="65b629a3-ea10-4517-b762-b18b22fab2a9" volumeName="kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890852 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" volumeName="kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890862 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b46c6b4-f701-4a7f-bf74-90afe14ad89a" volumeName="kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890872 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7af28c70-3ee0-438b-8655-95bc57deaa05" volumeName="kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890881 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c479beae-6d34-4d49-81c8-39f6a53ff6ca" volumeName="kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890889 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f5148a50-4332-4ab4-af63-449ef7f16333" volumeName="kubernetes.io/projected/f5148a50-4332-4ab4-af63-449ef7f16333-kube-api-access-p9zjb" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890897 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" volumeName="kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890906 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24bd6a9b-9934-4cf9-b591-1ab892c8014c" volumeName="kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-utilities" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890916 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="367ea4e8-8172-4730-8f26-499ed7c167bb" volumeName="kubernetes.io/configmap/367ea4e8-8172-4730-8f26-499ed7c167bb-config" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890924 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65be54c-d851-4daa-93a0-8a3947da5e26" volumeName="kubernetes.io/projected/f65be54c-d851-4daa-93a0-8a3947da5e26-kube-api-access" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890931 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" volumeName="kubernetes.io/projected/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-kube-api-access-nwmkf" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890940 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51e2c1d1-a50f-4a0d-8273-440e699b3907" volumeName="kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-ovnkube-identity-cm" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890948 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90c10f6-36d9-44bd-a4c6-174f12135434" volumeName="kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890958 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e47d3935-d96e-4fd2-abfc-99bff7eac5e0" volumeName="kubernetes.io/secret/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-cloud-controller-manager-operator-tls" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890968 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51e2c1d1-a50f-4a0d-8273-440e699b3907" volumeName="kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-env-overrides" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890976 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bea42722-d7af-4938-9f3a-da57bd056f96" volumeName="kubernetes.io/projected/bea42722-d7af-4938-9f3a-da57bd056f96-kube-api-access" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890984 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d979be56-f948-4252-ab6b-9bfc48dc4187" volumeName="kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-catalog-content" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.890993 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90c10f6-36d9-44bd-a4c6-174f12135434" volumeName="kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-images" seLinuxMountContext=""
Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891001 29612 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="eb274dd5-e98e-4485-a050-7bec560a0daa" volumeName="kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-images" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891009 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3e1b71a7-c9fd-4f03-8d0c-f092dac80989" volumeName="kubernetes.io/projected/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-kube-api-access-mgr4k" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891018 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bcc25a4a-521a-4139-a10c-d5b8bba6e359" volumeName="kubernetes.io/projected/bcc25a4a-521a-4139-a10c-d5b8bba6e359-kube-api-access-2vhrb" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891027 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fbde1a69-b56c-467e-a291-9a83c448346f" volumeName="kubernetes.io/projected/fbde1a69-b56c-467e-a291-9a83c448346f-kube-api-access-j5f65" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891035 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94fa5656-0561-45ab-9ab1-a6b96b8a47d4" volumeName="kubernetes.io/projected/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-kube-api-access-m6rhj" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891044 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0fed1dcc-6fc0-41ef-8236-e095dca4ddcc" volumeName="kubernetes.io/projected/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-kube-api-access-9kf8c" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891053 29612 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6" volumeName="kubernetes.io/secret/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-proxy-tls" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891061 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d017409-a362-4e58-87af-d02b3834fdd4" volumeName="kubernetes.io/configmap/6d017409-a362-4e58-87af-d02b3834fdd4-config" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891069 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d3b2df-3d81-413e-b039-80dc20111eae" volumeName="kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891079 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b4a2e47-e85a-4474-a1b7-3eb85283dfcf" volumeName="kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-trusted-ca-bundle" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891087 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b4a2e47-e85a-4474-a1b7-3eb85283dfcf" volumeName="kubernetes.io/empty-dir/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-snapshots" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891096 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b69a314d-4884-4f2c-a48e-1cf68c5aaef4" volumeName="kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-webhook-cert" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891106 29612 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="07eef031-edcc-415c-91d7-f9f3bd0a45e3" volumeName="kubernetes.io/projected/07eef031-edcc-415c-91d7-f9f3bd0a45e3-kube-api-access-6frf6" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891115 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d625e92-793e-4942-a826-d30d21e3ca0c" volumeName="kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-catalog-content" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891124 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e7b04dc-e85d-41b8-959c-87b1b24731a1" volumeName="kubernetes.io/secret/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-key" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891134 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d52981e0-0bc4-4dfa-81d5-aeab63fa23e0" volumeName="kubernetes.io/secret/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891147 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e47d3935-d96e-4fd2-abfc-99bff7eac5e0" volumeName="kubernetes.io/projected/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-kube-api-access-t85jj" seLinuxMountContext="" Mar 19 12:09:52.891146 master-0 kubenswrapper[29612]: I0319 12:09:52.891203 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09282e98-3140-41c9-863f-67e0e9f942cc" volumeName="kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-tls" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891217 29612 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" volumeName="kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-serving-cert" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891226 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" volumeName="kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-client" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891235 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fbde1a69-b56c-467e-a291-9a83c448346f" volumeName="kubernetes.io/secret/fbde1a69-b56c-467e-a291-9a83c448346f-machine-api-operator-tls" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891244 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="003e840c-1b07-4624-8225-e4ffdab74213" volumeName="kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-utilities" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891283 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" volumeName="kubernetes.io/secret/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891291 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="518e1c6f-7f46-4dda-84a3-49fec9fa1abe" volumeName="kubernetes.io/secret/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891300 29612 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" volumeName="kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-serving-ca" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891310 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bcc25a4a-521a-4139-a10c-d5b8bba6e359" volumeName="kubernetes.io/secret/bcc25a4a-521a-4139-a10c-d5b8bba6e359-webhook-certs" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891319 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f38df702-a256-4f83-90bb-0c0e477a2ce6" volumeName="kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891354 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46108693-3c9e-4804-beb0-d6b8f8df7625" volumeName="kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-default-certificate" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891366 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e617e515-a899-41e8-9961-df68e49da4d3" volumeName="kubernetes.io/projected/e617e515-a899-41e8-9961-df68e49da4d3-kube-api-access-bptbc" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891377 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ca74427-9df7-44ca-ae91-0162f402644b" volumeName="kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-metrics-client-ca" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891455 29612 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ca74427-9df7-44ca-ae91-0162f402644b" volumeName="kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891474 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ca74427-9df7-44ca-ae91-0162f402644b" volumeName="kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891485 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51e2c1d1-a50f-4a0d-8273-440e699b3907" volumeName="kubernetes.io/projected/51e2c1d1-a50f-4a0d-8273-440e699b3907-kube-api-access-j8mds" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891528 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac946049-4b93-4d8c-bfa6-eacf83160ed1" volumeName="kubernetes.io/secret/ac946049-4b93-4d8c-bfa6-eacf83160ed1-serving-cert" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891542 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="162b513f-2f67-4946-816b-48f6232dfca0" volumeName="kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-config" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891552 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d625e92-793e-4942-a826-d30d21e3ca0c" volumeName="kubernetes.io/projected/4d625e92-793e-4942-a826-d30d21e3ca0c-kube-api-access-gczvq" seLinuxMountContext="" Mar 19 12:09:52.893714 
master-0 kubenswrapper[29612]: I0319 12:09:52.891562 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac946049-4b93-4d8c-bfa6-eacf83160ed1" volumeName="kubernetes.io/projected/ac946049-4b93-4d8c-bfa6-eacf83160ed1-kube-api-access" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891572 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b92c4c09-a796-4e00-821a-32fc9c8ca214" volumeName="kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-kube-rbac-proxy-config" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891606 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bea42722-d7af-4938-9f3a-da57bd056f96" volumeName="kubernetes.io/configmap/bea42722-d7af-4938-9f3a-da57bd056f96-config" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891620 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f3c0234-248a-4d6f-9aca-6582a481a911" volumeName="kubernetes.io/configmap/3f3c0234-248a-4d6f-9aca-6582a481a911-config" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891658 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="131e2573-ef74-4963-bdcf-901310e7a486" volumeName="kubernetes.io/configmap/131e2573-ef74-4963-bdcf-901310e7a486-telemetry-config" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891673 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b69a314d-4884-4f2c-a48e-1cf68c5aaef4" volumeName="kubernetes.io/empty-dir/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-tmpfs" seLinuxMountContext="" Mar 19 12:09:52.893714 
master-0 kubenswrapper[29612]: I0319 12:09:52.891682 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0fed1dcc-6fc0-41ef-8236-e095dca4ddcc" volumeName="kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-daemon-config" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891691 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" volumeName="kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891699 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0f2278c1-05cf-469c-a13c-762b7a790b38" volumeName="kubernetes.io/configmap/0f2278c1-05cf-469c-a13c-762b7a790b38-auth-proxy-config" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891708 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24bd6a9b-9934-4cf9-b591-1ab892c8014c" volumeName="kubernetes.io/projected/24bd6a9b-9934-4cf9-b591-1ab892c8014c-kube-api-access-jd74j" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891749 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" volumeName="kubernetes.io/projected/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-kube-api-access-8ntj9" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891759 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="819c8a76-019b-4124-a67f-fb5838cbec5d" volumeName="kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config" seLinuxMountContext="" Mar 19 12:09:52.893714 
master-0 kubenswrapper[29612]: I0319 12:09:52.891769 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b4a2e47-e85a-4474-a1b7-3eb85283dfcf" volumeName="kubernetes.io/projected/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-kube-api-access-x9g5b" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891778 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0" volumeName="kubernetes.io/projected/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-kube-api-access-rn94k" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891787 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5cf6404-1640-43d3-8018-03c62c8338c5" volumeName="kubernetes.io/projected/a5cf6404-1640-43d3-8018-03c62c8338c5-kube-api-access-b48q2" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891817 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" volumeName="kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891827 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="eb274dd5-e98e-4485-a050-7bec560a0daa" volumeName="kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-auth-proxy-config" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891837 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65be54c-d851-4daa-93a0-8a3947da5e26" volumeName="kubernetes.io/secret/f65be54c-d851-4daa-93a0-8a3947da5e26-serving-cert" seLinuxMountContext="" Mar 19 
12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891845 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d017409-a362-4e58-87af-d02b3834fdd4" volumeName="kubernetes.io/projected/6d017409-a362-4e58-87af-d02b3834fdd4-kube-api-access-f99wt" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891854 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="819c8a76-019b-4124-a67f-fb5838cbec5d" volumeName="kubernetes.io/projected/819c8a76-019b-4124-a67f-fb5838cbec5d-kube-api-access-64sr2" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891861 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" volumeName="kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891869 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e7b04dc-e85d-41b8-959c-87b1b24731a1" volumeName="kubernetes.io/projected/4e7b04dc-e85d-41b8-959c-87b1b24731a1-kube-api-access-9h4cz" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891898 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d263ddac-1ede-474f-b3bb-1f1f4f7846a3" volumeName="kubernetes.io/projected/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-kube-api-access-rbwcn" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891910 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65be54c-d851-4daa-93a0-8a3947da5e26" volumeName="kubernetes.io/configmap/f65be54c-d851-4daa-93a0-8a3947da5e26-service-ca" 
seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891919 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f93ee7db-f0ed-470f-a458-86c54e6d064a" volumeName="kubernetes.io/projected/f93ee7db-f0ed-470f-a458-86c54e6d064a-kube-api-access-jstrq" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891928 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" volumeName="kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891936 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b69a314d-4884-4f2c-a48e-1cf68c5aaef4" volumeName="kubernetes.io/projected/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-kube-api-access-8765q" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.891945 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e617e515-a899-41e8-9961-df68e49da4d3" volumeName="kubernetes.io/secret/e617e515-a899-41e8-9961-df68e49da4d3-ovn-node-metrics-cert" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892059 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="63e81c6a-b723-4c9a-b471-226fefa9c7a7" volumeName="kubernetes.io/secret/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cloud-credential-operator-serving-cert" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892069 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c479beae-6d34-4d49-81c8-39f6a53ff6ca" 
volumeName="kubernetes.io/projected/c479beae-6d34-4d49-81c8-39f6a53ff6ca-kube-api-access-xxm2c" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892077 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0f2278c1-05cf-469c-a13c-762b7a790b38" volumeName="kubernetes.io/projected/0f2278c1-05cf-469c-a13c-762b7a790b38-kube-api-access-rztqj" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892085 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4705c8f3-89d4-42f2-ad14-e591c5380b9e" volumeName="kubernetes.io/projected/4705c8f3-89d4-42f2-ad14-e591c5380b9e-kube-api-access-bssz9" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892092 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d3b2df-3d81-413e-b039-80dc20111eae" volumeName="kubernetes.io/projected/78d3b2df-3d81-413e-b039-80dc20111eae-kube-api-access-mm4kn" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892102 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c479beae-6d34-4d49-81c8-39f6a53ff6ca" volumeName="kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-whereabouts-flatfile-configmap" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892111 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a185e40-e97b-49e9-aea3-5f170401a9e3" volumeName="kubernetes.io/projected/3a185e40-e97b-49e9-aea3-5f170401a9e3-kube-api-access-2v4xb" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892149 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" volumeName="kubernetes.io/empty-dir/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-audit-log" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892158 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ca74427-9df7-44ca-ae91-0162f402644b" volumeName="kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892167 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-ca" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892175 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e47d3935-d96e-4fd2-abfc-99bff7eac5e0" volumeName="kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-images" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892184 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e617e515-a899-41e8-9961-df68e49da4d3" volumeName="kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-script-lib" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892193 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f479b3-45ff-43f9-ae85-083554a54918" volumeName="kubernetes.io/projected/39f479b3-45ff-43f9-ae85-083554a54918-kube-api-access-79fmb" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892220 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="131e2573-ef74-4963-bdcf-901310e7a486" volumeName="kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892231 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5d009a3-5ff2-49f9-9cea-004b99729b77" volumeName="kubernetes.io/secret/a5d009a3-5ff2-49f9-9cea-004b99729b77-cluster-storage-operator-serving-cert" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892239 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90c10f6-36d9-44bd-a4c6-174f12135434" volumeName="kubernetes.io/projected/e90c10f6-36d9-44bd-a4c6-174f12135434-kube-api-access-r2knz" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892247 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09282e98-3140-41c9-863f-67e0e9f942cc" volumeName="kubernetes.io/configmap/09282e98-3140-41c9-863f-67e0e9f942cc-metrics-client-ca" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892256 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58a4e3af-c5e1-4487-92a2-81dfb696695e" volumeName="kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-ca-certs" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892264 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7af28c70-3ee0-438b-8655-95bc57deaa05" volumeName="kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-serving-ca" seLinuxMountContext="" Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892273 29612 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="33f6e54f-a750-4c91-b4f5-049275d33cfb" volumeName="kubernetes.io/projected/33f6e54f-a750-4c91-b4f5-049275d33cfb-kube-api-access-f7qsg" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892303 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57d19c68-b469-4a6f-abe0-3a0eff32639c" volumeName="kubernetes.io/secret/57d19c68-b469-4a6f-abe0-3a0eff32639c-cert" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892313 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f2d1d03-7427-45a0-a917-173bf9df9902" volumeName="kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-kube-rbac-proxy-config" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892323 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a" volumeName="kubernetes.io/secret/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-machine-approver-tls" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892330 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e617e515-a899-41e8-9961-df68e49da4d3" volumeName="kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-config" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892341 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" volumeName="kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892350 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" volumeName="kubernetes.io/empty-dir/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-operand-assets" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892377 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="518e1c6f-7f46-4dda-84a3-49fec9fa1abe" volumeName="kubernetes.io/projected/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-kube-api-access-4khlg" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892391 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="819c8a76-019b-4124-a67f-fb5838cbec5d" volumeName="kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892400 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84e5577e-df8b-46a2-aff7-0bf90b5010d9" volumeName="kubernetes.io/projected/84e5577e-df8b-46a2-aff7-0bf90b5010d9-kube-api-access-k9vzl" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892410 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="162b513f-2f67-4946-816b-48f6232dfca0" volumeName="kubernetes.io/secret/162b513f-2f67-4946-816b-48f6232dfca0-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892419 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ca74427-9df7-44ca-ae91-0162f402644b" volumeName="kubernetes.io/empty-dir/4ca74427-9df7-44ca-ae91-0162f402644b-volume-directive-shadow" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892429 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f2d1d03-7427-45a0-a917-173bf9df9902" volumeName="kubernetes.io/projected/5f2d1d03-7427-45a0-a917-173bf9df9902-kube-api-access-z7cz7" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892437 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="33f6e54f-a750-4c91-b4f5-049275d33cfb" volumeName="kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892468 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b92c4c09-a796-4e00-821a-32fc9c8ca214" volumeName="kubernetes.io/configmap/b92c4c09-a796-4e00-821a-32fc9c8ca214-metrics-client-ca" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892477 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0f2278c1-05cf-469c-a13c-762b7a790b38" volumeName="kubernetes.io/secret/0f2278c1-05cf-469c-a13c-762b7a790b38-cert" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892486 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f479b3-45ff-43f9-ae85-083554a54918" volumeName="kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892495 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a" volumeName="kubernetes.io/projected/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-kube-api-access-v5cc5" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892504 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f38df702-a256-4f83-90bb-0c0e477a2ce6" volumeName="kubernetes.io/configmap/f38df702-a256-4f83-90bb-0c0e477a2ce6-trusted-ca" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892514 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09282e98-3140-41c9-863f-67e0e9f942cc" volumeName="kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-kube-rbac-proxy-config" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892527 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e47d3935-d96e-4fd2-abfc-99bff7eac5e0" volumeName="kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-auth-proxy-config" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892539 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="eb274dd5-e98e-4485-a050-7bec560a0daa" volumeName="kubernetes.io/projected/eb274dd5-e98e-4485-a050-7bec560a0daa-kube-api-access-jsvpz" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892592 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64136215-8539-4349-977f-6d59deb132ea" volumeName="kubernetes.io/projected/64136215-8539-4349-977f-6d59deb132ea-kube-api-access-cgtrh" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892607 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4705c8f3-89d4-42f2-ad14-e591c5380b9e" volumeName="kubernetes.io/empty-dir/4705c8f3-89d4-42f2-ad14-e591c5380b9e-available-featuregates" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892617 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b46c6b4-f701-4a7f-bf74-90afe14ad89a" volumeName="kubernetes.io/projected/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-kube-api-access-s98qv" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892626 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f2d1d03-7427-45a0-a917-173bf9df9902" volumeName="kubernetes.io/configmap/5f2d1d03-7427-45a0-a917-173bf9df9902-metrics-client-ca" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892722 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7af28c70-3ee0-438b-8655-95bc57deaa05" volumeName="kubernetes.io/projected/7af28c70-3ee0-438b-8655-95bc57deaa05-kube-api-access-2m4gg" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892733 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="07804be1-d37c-474e-a179-01ac541d9705" volumeName="kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892742 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58a4e3af-c5e1-4487-92a2-81dfb696695e" volumeName="kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-kube-api-access-gjj2t" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892752 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58a4e3af-c5e1-4487-92a2-81dfb696695e" volumeName="kubernetes.io/secret/58a4e3af-c5e1-4487-92a2-81dfb696695e-catalogserver-certs" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892762 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="819c8a76-019b-4124-a67f-fb5838cbec5d" volumeName="kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892770 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a185e40-e97b-49e9-aea3-5f170401a9e3" volumeName="kubernetes.io/secret/3a185e40-e97b-49e9-aea3-5f170401a9e3-samples-operator-tls" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892778 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e8c4676-05af-4478-a44b-1bdcbefdea0f" volumeName="kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892787 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="99ffcc9a-4b01-439a-93e0-0674bdfd32ec" volumeName="kubernetes.io/projected/99ffcc9a-4b01-439a-93e0-0674bdfd32ec-kube-api-access-jd875" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892795 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24bd6a9b-9934-4cf9-b591-1ab892c8014c" volumeName="kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-catalog-content" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892804 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" volumeName="kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-encryption-config" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892812 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7af28c70-3ee0-438b-8655-95bc57deaa05" volumeName="kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-audit" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892823 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d263ddac-1ede-474f-b3bb-1f1f4f7846a3" volumeName="kubernetes.io/configmap/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-config" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892835 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f479b3-45ff-43f9-ae85-083554a54918" volumeName="kubernetes.io/configmap/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-trusted-ca" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892846 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="162b513f-2f67-4946-816b-48f6232dfca0" volumeName="kubernetes.io/projected/162b513f-2f67-4946-816b-48f6232dfca0-kube-api-access-h95pm" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892860 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" volumeName="kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892868 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" volumeName="kubernetes.io/projected/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-kube-api-access-f7dp9" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892879 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7af28c70-3ee0-438b-8655-95bc57deaa05" volumeName="kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-config" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892887 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09282e98-3140-41c9-863f-67e0e9f942cc" volumeName="kubernetes.io/projected/09282e98-3140-41c9-863f-67e0e9f942cc-kube-api-access-x9hnq" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892897 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6" volumeName="kubernetes.io/projected/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-kube-api-access-pxt2h" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892905 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d017409-a362-4e58-87af-d02b3834fdd4" volumeName="kubernetes.io/secret/6d017409-a362-4e58-87af-d02b3834fdd4-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892947 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="07804be1-d37c-474e-a179-01ac541d9705" volumeName="kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-bound-sa-token" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892955 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0" volumeName="kubernetes.io/secret/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-proxy-tls" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892964 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94fa5656-0561-45ab-9ab1-a6b96b8a47d4" volumeName="kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892973 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5d009a3-5ff2-49f9-9cea-004b99729b77" volumeName="kubernetes.io/projected/a5d009a3-5ff2-49f9-9cea-004b99729b77-kube-api-access-z4khz" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892982 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b69a314d-4884-4f2c-a48e-1cf68c5aaef4" volumeName="kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-apiservice-cert" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892991 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="eb274dd5-e98e-4485-a050-7bec560a0daa" volumeName="kubernetes.io/secret/eb274dd5-e98e-4485-a050-7bec560a0daa-proxy-tls" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.892999 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f38df702-a256-4f83-90bb-0c0e477a2ce6" volumeName="kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-kube-api-access-g4qb9" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893008 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fbde1a69-b56c-467e-a291-9a83c448346f" volumeName="kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-images" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893021 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f2d1d03-7427-45a0-a917-173bf9df9902" volumeName="kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893031 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46108693-3c9e-4804-beb0-d6b8f8df7625" volumeName="kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-metrics-certs" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893042 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e7b04dc-e85d-41b8-959c-87b1b24731a1" volumeName="kubernetes.io/configmap/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-cabundle" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893054 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="63e81c6a-b723-4c9a-b471-226fefa9c7a7" volumeName="kubernetes.io/projected/63e81c6a-b723-4c9a-b471-226fefa9c7a7-kube-api-access-r97ml" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893066 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5cf6404-1640-43d3-8018-03c62c8338c5" volumeName="kubernetes.io/configmap/a5cf6404-1640-43d3-8018-03c62c8338c5-config-volume" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893076 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="162b513f-2f67-4946-816b-48f6232dfca0" volumeName="kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893087 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f3c0234-248a-4d6f-9aca-6582a481a911" volumeName="kubernetes.io/projected/3f3c0234-248a-4d6f-9aca-6582a481a911-kube-api-access-8cvpt" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893098 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="819c8a76-019b-4124-a67f-fb5838cbec5d" volumeName="kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893628 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fbde1a69-b56c-467e-a291-9a83c448346f" volumeName="kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-config" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893673 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="07804be1-d37c-474e-a179-01ac541d9705" volumeName="kubernetes.io/configmap/07804be1-d37c-474e-a179-01ac541d9705-trusted-ca" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893688 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" volumeName="kubernetes.io/projected/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-kube-api-access-hpdhv" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893701 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b4a2e47-e85a-4474-a1b7-3eb85283dfcf" volumeName="kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-service-ca-bundle" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893745 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="07804be1-d37c-474e-a179-01ac541d9705" volumeName="kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-kube-api-access-bwp24" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893761 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="78d3b2df-3d81-413e-b039-80dc20111eae" volumeName="kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893772 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b92c4c09-a796-4e00-821a-32fc9c8ca214" volumeName="kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-tls" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893781 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90c10f6-36d9-44bd-a4c6-174f12135434" volumeName="kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893792 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="07eef031-edcc-415c-91d7-f9f3bd0a45e3" volumeName="kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-tmp" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893824 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d790b42-2182-4b9c-8391-285e938715cc" volumeName="kubernetes.io/projected/4d790b42-2182-4b9c-8391-285e938715cc-kube-api-access-zh5cs" seLinuxMountContext=""
Mar 19 12:09:52.893714 master-0 kubenswrapper[29612]: I0319 12:09:52.893859 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7af28c70-3ee0-438b-8655-95bc57deaa05" volumeName="kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-client" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.893878 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-service-ca" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.893918 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b0133cb5-e425-40af-a051-71b2362a89c3" volumeName="kubernetes.io/projected/b0133cb5-e425-40af-a051-71b2362a89c3-kube-api-access-w4ms4" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.893929 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46108693-3c9e-4804-beb0-d6b8f8df7625" volumeName="kubernetes.io/configmap/46108693-3c9e-4804-beb0-d6b8f8df7625-service-ca-bundle" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.893939 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51e2c1d1-a50f-4a0d-8273-440e699b3907" volumeName="kubernetes.io/secret/51e2c1d1-a50f-4a0d-8273-440e699b3907-webhook-cert" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.893948 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8117f6b0-0b9f-4a1d-9807-bf73c19f388b" volumeName="kubernetes.io/secret/8117f6b0-0b9f-4a1d-9807-bf73c19f388b-tls-certificates" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.893956 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e90c10f6-36d9-44bd-a4c6-174f12135434" volumeName="kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-config" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.893985 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4ca74427-9df7-44ca-ae91-0162f402644b" volumeName="kubernetes.io/projected/4ca74427-9df7-44ca-ae91-0162f402644b-kube-api-access-69pp5" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.893997 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84e5577e-df8b-46a2-aff7-0bf90b5010d9" volumeName="kubernetes.io/secret/84e5577e-df8b-46a2-aff7-0bf90b5010d9-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.894006 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0" volumeName="kubernetes.io/configmap/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.894014 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aab47a39-a18b-4350-b141-4d5de2d8e96f" volumeName="kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-serving-cert" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.894022 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d979be56-f948-4252-ab6b-9bfc48dc4187" volumeName="kubernetes.io/projected/d979be56-f948-4252-ab6b-9bfc48dc4187-kube-api-access-w7bl2" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.894032 29612 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="518e1c6f-7f46-4dda-84a3-49fec9fa1abe" volumeName="kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovnkube-config" seLinuxMountContext=""
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.894040 29612 reconstruct.go:97] "Volume reconstruction finished"
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.894067 29612 reconciler.go:26] "Reconciler: start to sync state"
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.896447 29612 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 19 12:09:52.900005 master-0 kubenswrapper[29612]: I0319 12:09:52.898327 29612 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 19 12:09:52.905316 master-0 kubenswrapper[29612]: I0319 12:09:52.905275 29612 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 19 12:09:52.905524 master-0 kubenswrapper[29612]: I0319 12:09:52.905327 29612 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 19 12:09:52.905524 master-0 kubenswrapper[29612]: I0319 12:09:52.905359 29612 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 19 12:09:52.905524 master-0 kubenswrapper[29612]: E0319 12:09:52.905416 29612 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 19 12:09:52.912951 master-0 kubenswrapper[29612]: I0319 12:09:52.912887 29612 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 12:09:52.921078 master-0 kubenswrapper[29612]: I0319 12:09:52.921000 29612 generic.go:334] "Generic (PLEG): container finished" podID="7af28c70-3ee0-438b-8655-95bc57deaa05" containerID="49a4f50ee477671b0b5a448101a3e5ff8df04706fa3923595f4f02465bdb07e9" exitCode=0
Mar 19 12:09:52.931234 master-0 kubenswrapper[29612]: I0319 12:09:52.931207 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_eb4c5b33-58c8-4b51-b71b-1befc165f2b1/installer/0.log"
Mar 19 12:09:52.931483 master-0 kubenswrapper[29612]: I0319 12:09:52.931462 29612 generic.go:334] "Generic (PLEG): container finished" podID="eb4c5b33-58c8-4b51-b71b-1befc165f2b1" containerID="bff053145b8a60a46c397e36d3fff2a19f60020e8f6b7b60fcc48ee33442df16" exitCode=1
Mar 19 12:09:52.940451 master-0 kubenswrapper[29612]: I0319 12:09:52.940400 29612 generic.go:334] "Generic (PLEG): container finished" podID="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" containerID="63b90693b783d2344a0175569017503f87f560c1b75623dba968fbe7562bf39a" exitCode=0
Mar 19 12:09:52.940451 master-0 kubenswrapper[29612]: I0319 12:09:52.940446 29612 generic.go:334] "Generic (PLEG): container finished" podID="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" containerID="47183b768a7e3ff5da057dffa4e751f7869a36ae402360723f1788dfd4f7f643" exitCode=0
Mar 19 12:09:52.940451 master-0 kubenswrapper[29612]: I0319 12:09:52.940458 29612 generic.go:334] "Generic (PLEG): container finished" podID="50adfdd3-1d4d-47d2-a35b-cd028d66b4c6" containerID="76315438a4a762dabbb5d7e77ae1a8b1b6d345329e45b5b2c2feae26b7df9765" exitCode=0
Mar 19 12:09:52.943658 master-0 kubenswrapper[29612]: I0319 12:09:52.943607 29612 generic.go:334] "Generic (PLEG): container finished" podID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerID="246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d" exitCode=0
Mar 19 12:09:52.958515 master-0 kubenswrapper[29612]: I0319 12:09:52.958472 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-check-endpoints/0.log"
Mar 19 12:09:52.960027 master-0 kubenswrapper[29612]: I0319 12:09:52.959994 29612 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="76d14e466053de21cd4b44b01526664874f27eff3ad2faef8933296d68c35040" exitCode=255
Mar 19 12:09:52.960093 master-0 kubenswrapper[29612]: I0319 12:09:52.960027 29612 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363" exitCode=0
Mar 19 12:09:52.961610 master-0 kubenswrapper[29612]: I0319 12:09:52.961579 29612 generic.go:334] "Generic (PLEG): container finished" podID="eb274dd5-e98e-4485-a050-7bec560a0daa" containerID="ecee444cb43be9a1ddd0e2f81e61eca30a1f71fa25000dc8d0f2c58cd47a25c6" exitCode=0
Mar 19 12:09:52.964089 master-0 kubenswrapper[29612]: I0319 12:09:52.964069 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_8e27b7d086edf5d2cf47b703574641d8/kube-scheduler/0.log"
Mar 19 12:09:52.964332 master-0 kubenswrapper[29612]: I0319 12:09:52.964309 29612 generic.go:334] "Generic (PLEG): container finished" podID="8e27b7d086edf5d2cf47b703574641d8" containerID="f043145314a3afe735f976ccbd7c9f8a4f3dcf66cb61565878dcf3c6a12fe880" exitCode=1
Mar 19 12:09:52.964332 master-0 kubenswrapper[29612]: I0319 12:09:52.964332 29612 generic.go:334] "Generic (PLEG): container finished" podID="8e27b7d086edf5d2cf47b703574641d8" containerID="820218d0107a773549d9d05e3670e6a7a1f29fee796eba888ebf85af0aa9baa7" exitCode=0
Mar 19 12:09:52.966218 master-0 kubenswrapper[29612]: I0319 12:09:52.966184 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-6f69995874-cwncj_e90c10f6-36d9-44bd-a4c6-174f12135434/cluster-baremetal-operator/1.log"
Mar 19 12:09:52.966675 master-0 kubenswrapper[29612]: I0319 12:09:52.966649 29612 generic.go:334] "Generic (PLEG): container finished" podID="e90c10f6-36d9-44bd-a4c6-174f12135434" containerID="3ab6e11a56aeea1e1090df7b641d026b31da61efe18c78570e3645eea987cc4a" exitCode=1
Mar 19 12:09:52.980697 master-0 kubenswrapper[29612]: I0319 12:09:52.980644 29612 generic.go:334] "Generic (PLEG): container finished" podID="c0d8ad30-feb0-453f-b652-36aafc7b24f9" containerID="711f8121453bce45bc824e3d6883cc554f357db4612fdf30ce1a48bb144904ff" exitCode=0
Mar 19 12:09:52.983835 master-0 kubenswrapper[29612]: I0319 12:09:52.983785 29612 generic.go:334] "Generic (PLEG): container finished" podID="72c2ed5a-4b8d-4528-a230-fe608ea55ad3" containerID="f7e4a7cf437941fedc829554051c6d66a149aed6e58ff2aaeede0de923f84868" exitCode=0
Mar 19 12:09:52.994215 master-0 kubenswrapper[29612]: I0319 12:09:52.994090 29612 generic.go:334] "Generic (PLEG): container finished" podID="e617e515-a899-41e8-9961-df68e49da4d3" containerID="ffac3bdf0b92a0122fbb8d6a02eb01304705a3c19a78b9c431bf0840fcd9389f" exitCode=0
Mar 19 12:09:53.005889 master-0 kubenswrapper[29612]: E0319 12:09:53.005819 29612 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 19 12:09:53.020720 master-0 kubenswrapper[29612]: I0319 12:09:53.020668 29612 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="fc11d745bcb55f0325836aa72854476958c768ae4522a0e6c750cb0c84bb6fcd" exitCode=0
Mar 19 12:09:53.020720 master-0 kubenswrapper[29612]: I0319 12:09:53.020707 29612 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="52e44a998be1c2142f3874b5ff4a8f40902b1f922d23a5db895af0f402eb6218" exitCode=0
Mar 19 12:09:53.020720 master-0 kubenswrapper[29612]: I0319 12:09:53.020719 29612 generic.go:334] "Generic (PLEG): container finished" podID="094204df314fe45bd5af12ca1b4622bb" containerID="fc1a5002078bd201b824e0cb01e13c7c268e7dc6df2889e483abb0d9ab95ab7f" exitCode=0
Mar 19 12:09:53.026407 master-0 kubenswrapper[29612]: I0319 12:09:53.026315 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6f97756bc8-9fvr8_d52981e0-0bc4-4dfa-81d5-aeab63fa23e0/control-plane-machine-set-operator/0.log"
Mar 19 12:09:53.026524 master-0 kubenswrapper[29612]: I0319 12:09:53.026417 29612 generic.go:334] "Generic (PLEG): container finished" podID="d52981e0-0bc4-4dfa-81d5-aeab63fa23e0" containerID="064b176871fcbc879687ac7e9904eb1df9dbafb853b4d8ce41a9bbb9767a5a2c" exitCode=1
Mar 19 12:09:53.031245 master-0 kubenswrapper[29612]: I0319 12:09:53.031195 29612 generic.go:334] "Generic (PLEG): container finished" podID="b0133cb5-e425-40af-a051-71b2362a89c3" containerID="a0078d1886890ad6f9b53500e856885a52b81c9832530c9463ebafbbef1dc099" exitCode=0
Mar 19 12:09:53.033206 master-0 kubenswrapper[29612]: I0319 12:09:53.033160 29612 generic.go:334] "Generic (PLEG): container finished" podID="2452a909-6aa1-4750-9399-5945d380427a" containerID="ec93bf01fde32bcc52dde2502ea534c23c521da7002fa0839646b010fe2c3b0c" exitCode=0
Mar 19 12:09:53.039664 master-0 kubenswrapper[29612]: I0319 12:09:53.038393 29612 generic.go:334] "Generic (PLEG): container finished" podID="d263ddac-1ede-474f-b3bb-1f1f4f7846a3" containerID="7fc3d4238bc0fa4082c65174aac64034c89fc65c891756149ed46eacde6bb87a" exitCode=0
Mar 19 12:09:53.064715 master-0 kubenswrapper[29612]: I0319 12:09:53.064663 29612 generic.go:334] "Generic (PLEG): container finished" podID="18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" containerID="882b540d45f96873ea08eb024547ab884cf7ba3eaac3163e6d0d3525a76f6dd5" exitCode=0
Mar 19 12:09:53.069720 master-0 kubenswrapper[29612]: I0319 12:09:53.069609 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_1249822f86f23526277d165c0d5d3c19/kube-rbac-proxy-crio/2.log"
Mar 19 12:09:53.070334 master-0 kubenswrapper[29612]: I0319 12:09:53.070308 29612 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="1596d61ab954fe05c17045a5f0656bd06f879c1529f23b7eb004e3ce610f3ceb" exitCode=1
Mar 19 12:09:53.070557 master-0 kubenswrapper[29612]: I0319 12:09:53.070529 29612 generic.go:334] "Generic (PLEG): container finished" podID="1249822f86f23526277d165c0d5d3c19" containerID="763a20d7b98822675af380cf6397cfba29e067a2eddf07fc09dc4a05e4d6fa6d" exitCode=0
Mar 19 12:09:53.074355 master-0 kubenswrapper[29612]: I0319 12:09:53.074324 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-7b95f86987-ftz5b_94fa5656-0561-45ab-9ab1-a6b96b8a47d4/package-server-manager/0.log"
Mar 19 12:09:53.075096 master-0 kubenswrapper[29612]: I0319 12:09:53.075060 29612 generic.go:334] "Generic (PLEG): container finished" podID="94fa5656-0561-45ab-9ab1-a6b96b8a47d4" containerID="344d526c0d5c832dab03c63929de32c376b56407a1d6deb8f5c0fba230eec603" exitCode=1
Mar 19 12:09:53.086251 master-0 kubenswrapper[29612]: I0319 12:09:53.086200 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_de6765b7-3576-4171-972b-20d62d81f25d/installer/0.log"
Mar 19 12:09:53.086474 master-0 kubenswrapper[29612]: I0319 12:09:53.086257 29612 generic.go:334] "Generic (PLEG): container finished" podID="de6765b7-3576-4171-972b-20d62d81f25d" containerID="f6792bda14390e5172a259445357feef34047b3b4035a2f72b67e0cd845a049a" exitCode=1
Mar 19 12:09:53.093077 master-0 kubenswrapper[29612]: I0319 12:09:53.093006 29612 generic.go:334] "Generic (PLEG): container finished" podID="1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6" containerID="285371d8dc1391b2aefa34f9aa23ab25dae7de5c3d27157be1b42472a35ec621" exitCode=0
Mar 19 12:09:53.101915 master-0 kubenswrapper[29612]: I0319 12:09:53.101866 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-64854d9cff-6dlpw_0c804ea0-3483-4506-b138-3bf30016d8d8/snapshot-controller/3.log"
Mar 19 12:09:53.102063 master-0 kubenswrapper[29612]: I0319 12:09:53.101932 29612 generic.go:334] "Generic (PLEG): container finished" podID="0c804ea0-3483-4506-b138-3bf30016d8d8" containerID="a8932e5c8c4dcad1c7f461d22290ccccb070a0be05cbda27e90810a35028dcec" exitCode=1
Mar 19 12:09:53.105566 master-0 kubenswrapper[29612]: I0319 12:09:53.105540 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-57777556ff-vxsl4_60081e3b-4fe5-42a3-bfae-0b07853c0e36/manager/0.log"
Mar 19 12:09:53.105768 master-0 kubenswrapper[29612]: I0319 12:09:53.105745 29612 generic.go:334] "Generic (PLEG): container finished" podID="60081e3b-4fe5-42a3-bfae-0b07853c0e36" containerID="fbe4d38a848a8f8c8ac70c6aaafe195a243cd98f3045f416707deb69b549d12e" exitCode=1
Mar 19 12:09:53.108565 master-0 kubenswrapper[29612]: I0319 12:09:53.108532 29612 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-b865698dc-czzvw_3f3c0234-248a-4d6f-9aca-6582a481a911/service-ca-operator/3.log" Mar 19 12:09:53.108681 master-0 kubenswrapper[29612]: I0319 12:09:53.108589 29612 generic.go:334] "Generic (PLEG): container finished" podID="3f3c0234-248a-4d6f-9aca-6582a481a911" containerID="9ec4a3ca5402b255a5776345ac4dec708163999648880ceab54090ff9aba763c" exitCode=255 Mar 19 12:09:53.112529 master-0 kubenswrapper[29612]: I0319 12:09:53.112482 29612 generic.go:334] "Generic (PLEG): container finished" podID="518e1c6f-7f46-4dda-84a3-49fec9fa1abe" containerID="f7738588625037912affdf5435640d17413e9cd9036722fe4bd36750bce1dc90" exitCode=0 Mar 19 12:09:53.114309 master-0 kubenswrapper[29612]: I0319 12:09:53.114289 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-8pnq2_e47d3935-d96e-4fd2-abfc-99bff7eac5e0/config-sync-controllers/0.log" Mar 19 12:09:53.114680 master-0 kubenswrapper[29612]: I0319 12:09:53.114659 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7dff898856-8pnq2_e47d3935-d96e-4fd2-abfc-99bff7eac5e0/cluster-cloud-controller-manager/0.log" Mar 19 12:09:53.114751 master-0 kubenswrapper[29612]: I0319 12:09:53.114687 29612 generic.go:334] "Generic (PLEG): container finished" podID="e47d3935-d96e-4fd2-abfc-99bff7eac5e0" containerID="1807478431eec6d4ebe9e175a729aeaa97be5330ccb6e1fb9986d071c23571b4" exitCode=1 Mar 19 12:09:53.114751 master-0 kubenswrapper[29612]: I0319 12:09:53.114699 29612 generic.go:334] "Generic (PLEG): container finished" podID="e47d3935-d96e-4fd2-abfc-99bff7eac5e0" containerID="79d5dba168afeffdccc6ff9b4390e9029f185749a05da6ead57ad3001b8c3183" exitCode=1 Mar 19 12:09:53.135731 master-0 kubenswrapper[29612]: I0319 12:09:53.135045 29612 generic.go:334] "Generic (PLEG): container finished" 
podID="d979be56-f948-4252-ab6b-9bfc48dc4187" containerID="ff2ebf3fa6730575b9447497330c1d66a64543dc66430125663573c61143b763" exitCode=0 Mar 19 12:09:53.135731 master-0 kubenswrapper[29612]: I0319 12:09:53.135083 29612 generic.go:334] "Generic (PLEG): container finished" podID="d979be56-f948-4252-ab6b-9bfc48dc4187" containerID="9340a8d37e83e7908c779e917b656b3529e86ab0084fd52b34de9d95fe7b99e9" exitCode=0 Mar 19 12:09:53.136847 master-0 kubenswrapper[29612]: I0319 12:09:53.136823 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-6864dc98f7-4np9g_58a4e3af-c5e1-4487-92a2-81dfb696695e/manager/0.log" Mar 19 12:09:53.137234 master-0 kubenswrapper[29612]: I0319 12:09:53.137202 29612 generic.go:334] "Generic (PLEG): container finished" podID="58a4e3af-c5e1-4487-92a2-81dfb696695e" containerID="3d22a1705e322b07bf63dc93baa238f057583ee120dd5b2d2d08b50e9d7a1b09" exitCode=1 Mar 19 12:09:53.145771 master-0 kubenswrapper[29612]: I0319 12:09:53.145722 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-5c7d7c6bb8-gnz2z_66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e/oauth-apiserver/0.log" Mar 19 12:09:53.150298 master-0 kubenswrapper[29612]: I0319 12:09:53.150249 29612 generic.go:334] "Generic (PLEG): container finished" podID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerID="f4990111c5770baec2a2ee082f291300cc7420fd0573bb028eff8f0a2e8424eb" exitCode=1 Mar 19 12:09:53.150298 master-0 kubenswrapper[29612]: I0319 12:09:53.150293 29612 generic.go:334] "Generic (PLEG): container finished" podID="66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e" containerID="d8055526d32c6d4c79231e8f49c9d8851d0cddeac9c1e90af414749e5818720e" exitCode=0 Mar 19 12:09:53.174152 master-0 kubenswrapper[29612]: I0319 12:09:53.174086 29612 generic.go:334] "Generic (PLEG): container finished" podID="838a469d-2c27-46db-9d6f-17bfa1c8dd09" containerID="bd4cbdd654e294cf12e1078774a4064deac39d97874d1f397bebdf9625d6fe36" exitCode=0 Mar 
19 12:09:53.193469 master-0 kubenswrapper[29612]: I0319 12:09:53.193409 29612 generic.go:334] "Generic (PLEG): container finished" podID="84e5577e-df8b-46a2-aff7-0bf90b5010d9" containerID="d743fbff3bc780b360a4927f7a272f0fb5047ed94867c7848a01cf51863441ae" exitCode=0 Mar 19 12:09:53.206762 master-0 kubenswrapper[29612]: E0319 12:09:53.206698 29612 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 12:09:53.218081 master-0 kubenswrapper[29612]: I0319 12:09:53.218047 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5c6485487f-b2w5m_81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a/machine-approver-controller/0.log" Mar 19 12:09:53.219887 master-0 kubenswrapper[29612]: I0319 12:09:53.219693 29612 generic.go:334] "Generic (PLEG): container finished" podID="81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a" containerID="7fc5e15dbb622b807625a944b8961634a70f1aecfe4e4c996f59e0fe79999af9" exitCode=255 Mar 19 12:09:53.238278 master-0 kubenswrapper[29612]: I0319 12:09:53.238241 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5885bfd7f4-vt498_162b513f-2f67-4946-816b-48f6232dfca0/authentication-operator/1.log" Mar 19 12:09:53.238478 master-0 kubenswrapper[29612]: I0319 12:09:53.238299 29612 generic.go:334] "Generic (PLEG): container finished" podID="162b513f-2f67-4946-816b-48f6232dfca0" containerID="2ae789390c2dc2661b322d4ce95cf71fe928a1384e00118bd9c25a6d60f35352" exitCode=1 Mar 19 12:09:53.253144 master-0 kubenswrapper[29612]: I0319 12:09:53.253081 29612 generic.go:334] "Generic (PLEG): container finished" podID="4705c8f3-89d4-42f2-ad14-e591c5380b9e" containerID="8cf7c92799c042639046f9139208de62b93f5c3c4be8bc7881686ff116d251d3" exitCode=0 Mar 19 12:09:53.253144 master-0 kubenswrapper[29612]: I0319 12:09:53.253114 29612 generic.go:334] "Generic (PLEG): container finished" 
podID="4705c8f3-89d4-42f2-ad14-e591c5380b9e" containerID="b8135311023b73f1dff2f33a6eb221f0fcb0bdc9e1cd016fb390a5b4192441f6" exitCode=0 Mar 19 12:09:53.256192 master-0 kubenswrapper[29612]: I0319 12:09:53.256164 29612 generic.go:334] "Generic (PLEG): container finished" podID="24bd6a9b-9934-4cf9-b591-1ab892c8014c" containerID="7b4959df4babfe207d0b7474ab31deaecbea2e2b9d22f6c7f79a7c10a529fc14" exitCode=0 Mar 19 12:09:53.256192 master-0 kubenswrapper[29612]: I0319 12:09:53.256185 29612 generic.go:334] "Generic (PLEG): container finished" podID="24bd6a9b-9934-4cf9-b591-1ab892c8014c" containerID="237540f70e2c29f3039bdaf2115302712ba9278f06779f1d068dbc0f51f6e8cd" exitCode=0 Mar 19 12:09:53.259249 master-0 kubenswrapper[29612]: I0319 12:09:53.259211 29612 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="61ceef49c4b3e3c92de2f92ba07df5cc9f3437ed0a986fcdd8cd9a85e9afbe78" exitCode=0 Mar 19 12:09:53.259249 master-0 kubenswrapper[29612]: I0319 12:09:53.259232 29612 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="c0d30e67fd289a28538ef92cb1f3d25d05768acb2306cf21bc6d90958cdbdf09" exitCode=0 Mar 19 12:09:53.259249 master-0 kubenswrapper[29612]: I0319 12:09:53.259240 29612 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="ff56157a2c5ab60201123fe865c3891754204ecbb4e8ba10184f66ac03510978" exitCode=0 Mar 19 12:09:53.259249 master-0 kubenswrapper[29612]: I0319 12:09:53.259249 29612 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="20f4956468238d356cba6c4e8281a37386fede658fd1e93306b89426a80c8130" exitCode=0 Mar 19 12:09:53.259249 master-0 kubenswrapper[29612]: I0319 12:09:53.259255 29612 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" 
containerID="57e2b9e5df1e4dbc4f600dfa7e9acded55e04fe34916a6250fcf8bdcd8821c9c" exitCode=0 Mar 19 12:09:53.259249 master-0 kubenswrapper[29612]: I0319 12:09:53.259262 29612 generic.go:334] "Generic (PLEG): container finished" podID="c479beae-6d34-4d49-81c8-39f6a53ff6ca" containerID="7e880bfa7477300228e9f011e131c05d9e3d415f1d3d007729e1d9e7b578b4cc" exitCode=0 Mar 19 12:09:53.262133 master-0 kubenswrapper[29612]: I0319 12:09:53.262071 29612 generic.go:334] "Generic (PLEG): container finished" podID="bea42722-d7af-4938-9f3a-da57bd056f96" containerID="15fc833de80b2cae87e3ef1229c5aae30a9051318a9e0831bf339fc664c1dd1c" exitCode=0 Mar 19 12:09:53.266709 master-0 kubenswrapper[29612]: I0319 12:09:53.263402 29612 generic.go:334] "Generic (PLEG): container finished" podID="01529a13-e1b1-40a1-9fff-8a20f1cba68c" containerID="825551a6cbd6e2d9a62ce681600e2897018550eae2ad6b2a4c579d2b878da6e9" exitCode=0 Mar 19 12:09:53.269088 master-0 kubenswrapper[29612]: I0319 12:09:53.269046 29612 generic.go:334] "Generic (PLEG): container finished" podID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerID="da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b" exitCode=0 Mar 19 12:09:53.274065 master-0 kubenswrapper[29612]: I0319 12:09:53.274034 29612 generic.go:334] "Generic (PLEG): container finished" podID="4d625e92-793e-4942-a826-d30d21e3ca0c" containerID="023b8459d54078f874c794a42aed9a6fd6fbb4ddfa6be97808a092ec4ffe3187" exitCode=0 Mar 19 12:09:53.274065 master-0 kubenswrapper[29612]: I0319 12:09:53.274058 29612 generic.go:334] "Generic (PLEG): container finished" podID="4d625e92-793e-4942-a826-d30d21e3ca0c" containerID="67dee3634bf8afc5cf33505c6686845e34de0d09fbd040277b4cc0507631dc5b" exitCode=0 Mar 19 12:09:53.276130 master-0 kubenswrapper[29612]: I0319 12:09:53.276079 29612 generic.go:334] "Generic (PLEG): container finished" podID="003e840c-1b07-4624-8225-e4ffdab74213" containerID="e2e764dd4d611abac99b99d586297942d72b2e3f1434058cd7e0a11729f21ba8" exitCode=0 Mar 19 
12:09:53.276130 master-0 kubenswrapper[29612]: I0319 12:09:53.276124 29612 generic.go:334] "Generic (PLEG): container finished" podID="003e840c-1b07-4624-8225-e4ffdab74213" containerID="0d16f994259c86f779a870061760bfaa564ec639686f9c38a22c5ca129c5726f" exitCode=0 Mar 19 12:09:53.279528 master-0 kubenswrapper[29612]: I0319 12:09:53.279496 29612 generic.go:334] "Generic (PLEG): container finished" podID="5f2d1d03-7427-45a0-a917-173bf9df9902" containerID="3b0c232c84654528ea64bdd18d2136082e46bb45cce0608d6a300d9ff0fe9984" exitCode=0 Mar 19 12:09:53.281875 master-0 kubenswrapper[29612]: I0319 12:09:53.281849 29612 generic.go:334] "Generic (PLEG): container finished" podID="367ea4e8-8172-4730-8f26-499ed7c167bb" containerID="ec872b5060e715523dcf7612ce03c05df430a828612dcdbf8f83f43994a2ae84" exitCode=0 Mar 19 12:09:53.286113 master-0 kubenswrapper[29612]: I0319 12:09:53.286051 29612 generic.go:334] "Generic (PLEG): container finished" podID="aab47a39-a18b-4350-b141-4d5de2d8e96f" containerID="3385d26311ce81e00023dfaf9b5057451bdc1320414436c96c2f92229792ca5f" exitCode=0 Mar 19 12:09:53.288977 master-0 kubenswrapper[29612]: I0319 12:09:53.288940 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/4.log" Mar 19 12:09:53.289383 master-0 kubenswrapper[29612]: I0319 12:09:53.289350 29612 generic.go:334] "Generic (PLEG): container finished" podID="f38df702-a256-4f83-90bb-0c0e477a2ce6" containerID="2585e3c8a21fe77ef0a54c935bf49a872dc17099314a87be4e929dac1b0cd111" exitCode=1 Mar 19 12:09:53.291797 master-0 kubenswrapper[29612]: I0319 12:09:53.291772 29612 generic.go:334] "Generic (PLEG): container finished" podID="6d017409-a362-4e58-87af-d02b3834fdd4" containerID="016ff43860287ef9ea9c7834d6813f132931ea72467cbcc206c5b8bf5a2a1e8b" exitCode=0 Mar 19 12:09:53.295711 master-0 kubenswrapper[29612]: I0319 12:09:53.295191 29612 generic.go:334] "Generic (PLEG): container 
finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885" exitCode=0 Mar 19 12:09:53.295711 master-0 kubenswrapper[29612]: I0319 12:09:53.295213 29612 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072" exitCode=0 Mar 19 12:09:53.295711 master-0 kubenswrapper[29612]: I0319 12:09:53.295219 29612 generic.go:334] "Generic (PLEG): container finished" podID="49fac1b46a11e49501805e891baae4a9" containerID="450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4" exitCode=0 Mar 19 12:09:53.300261 master-0 kubenswrapper[29612]: I0319 12:09:53.300232 29612 generic.go:334] "Generic (PLEG): container finished" podID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerID="171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464" exitCode=0 Mar 19 12:09:53.302232 master-0 kubenswrapper[29612]: I0319 12:09:53.302200 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-47j68_51e2c1d1-a50f-4a0d-8273-440e699b3907/approver/1.log" Mar 19 12:09:53.302517 master-0 kubenswrapper[29612]: I0319 12:09:53.302491 29612 generic.go:334] "Generic (PLEG): container finished" podID="51e2c1d1-a50f-4a0d-8273-440e699b3907" containerID="6efb9a3f210b061f188c57eb6303722d321f0f70ab8c3b6975d1007d3830e3ef" exitCode=1 Mar 19 12:09:53.307377 master-0 kubenswrapper[29612]: I0319 12:09:53.307348 29612 generic.go:334] "Generic (PLEG): container finished" podID="ac946049-4b93-4d8c-bfa6-eacf83160ed1" containerID="c8521feb576750f4b4cf92dbf1c496a19af50a44e812267935ec3c28f71bb348" exitCode=0 Mar 19 12:09:53.308749 master-0 kubenswrapper[29612]: I0319 12:09:53.308703 29612 generic.go:334] "Generic (PLEG): container finished" podID="44d258c9-f8b3-4233-847f-bc594a2aa4f9" containerID="64567b7ee5dbb1259b892f9e1f4fd87cb14b22e497b596ff7c977ffde93bc8bd" 
exitCode=0 Mar 19 12:09:53.310552 master-0 kubenswrapper[29612]: I0319 12:09:53.310488 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log" Mar 19 12:09:53.311965 master-0 kubenswrapper[29612]: I0319 12:09:53.311928 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log" Mar 19 12:09:53.311965 master-0 kubenswrapper[29612]: I0319 12:09:53.311960 29612 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" exitCode=255 Mar 19 12:09:53.312090 master-0 kubenswrapper[29612]: I0319 12:09:53.311972 29612 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="4d4b4168db53bd2fbee264c121d01b11e0ff09bed322f776f7a2db8af96a040f" exitCode=1 Mar 19 12:09:53.314224 master-0 kubenswrapper[29612]: I0319 12:09:53.314196 29612 generic.go:334] "Generic (PLEG): container finished" podID="39f479b3-45ff-43f9-ae85-083554a54918" containerID="9d5a5757181aa7140885d0b2bdc37c14b8a849c385bb89cb00edfe5781ce2cdf" exitCode=0 Mar 19 12:09:53.315857 master-0 kubenswrapper[29612]: I0319 12:09:53.315618 29612 generic.go:334] "Generic (PLEG): container finished" podID="1ec99218-1644-461e-abb2-14d0c5369e96" containerID="16da51ad34fc7222c4ea68656e0145a3d1eba549a2e561b21ed00a07283c919f" exitCode=0 Mar 19 12:09:53.607296 master-0 kubenswrapper[29612]: E0319 12:09:53.607229 29612 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 12:09:53.818752 master-0 kubenswrapper[29612]: I0319 12:09:53.818614 29612 apiserver.go:52] "Watching apiserver" Mar 19 12:09:53.841753 master-0 kubenswrapper[29612]: 
I0319 12:09:53.841664 29612 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 12:09:54.407525 master-0 kubenswrapper[29612]: E0319 12:09:54.407463 29612 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 12:09:56.007897 master-0 kubenswrapper[29612]: E0319 12:09:56.007798 29612 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 12:09:59.209009 master-0 kubenswrapper[29612]: E0319 12:09:59.208900 29612 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 12:10:04.209741 master-0 kubenswrapper[29612]: E0319 12:10:04.209675 29612 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 12:10:09.210661 master-0 kubenswrapper[29612]: E0319 12:10:09.210575 29612 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 12:10:14.211107 master-0 kubenswrapper[29612]: E0319 12:10:14.211030 29612 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 12:10:19.211779 master-0 kubenswrapper[29612]: E0319 12:10:19.211708 29612 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 19 12:10:22.110728 master-0 kubenswrapper[29612]: E0319 12:10:22.110662 29612 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46108693_3c9e_4804_beb0_d6b8f8df7625.slice/crio-e3d494447990668745e14695bfe1a5e74ab5c046360e769658a99bdf18dabcb5\": RecentStats: unable to find data in memory cache]" Mar 19 12:10:22.111251 master-0 kubenswrapper[29612]: E0319 12:10:22.110816 29612 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46108693_3c9e_4804_beb0_d6b8f8df7625.slice/crio-e3d494447990668745e14695bfe1a5e74ab5c046360e769658a99bdf18dabcb5\": RecentStats: unable to find data in memory cache]" Mar 19 12:10:22.115171 master-0 kubenswrapper[29612]: E0319 12:10:22.115122 29612 summary_sys_containers.go:89] "Failed to get system container stats" err="failed to get cgroup stats for \"/kubepods.slice\": failed to get container info for \"/kubepods.slice\": unknown container \"/kubepods.slice\"" containerName="/kubepods.slice" Mar 19 12:10:22.115950 master-0 kubenswrapper[29612]: E0319 12:10:22.115915 29612 summary_sys_containers.go:89] "Failed to get system container stats" err="failed to get cgroup stats for \"/kubepods.slice\": failed to get container info for \"/kubepods.slice\": unknown container \"/kubepods.slice\"" containerName="/kubepods.slice" Mar 19 12:10:22.456762 master-0 kubenswrapper[29612]: I0319 12:10:22.454047 29612 manager.go:324] Recovery completed Mar 19 12:10:22.530291 master-0 kubenswrapper[29612]: I0319 12:10:22.530228 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log" Mar 19 12:10:22.532482 master-0 kubenswrapper[29612]: I0319 12:10:22.532444 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/0.log" Mar 19 12:10:22.533072 master-0 kubenswrapper[29612]: I0319 12:10:22.533039 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log" Mar 19 12:10:22.533137 master-0 
kubenswrapper[29612]: I0319 12:10:22.533084 29612 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b" exitCode=1 Mar 19 12:10:22.537370 master-0 kubenswrapper[29612]: I0319 12:10:22.537337 29612 generic.go:334] "Generic (PLEG): container finished" podID="46108693-3c9e-4804-beb0-d6b8f8df7625" containerID="302da25d88feed044717793a912a25df7753b2c33b9c9627185444f708131b30" exitCode=0 Mar 19 12:10:22.568898 master-0 kubenswrapper[29612]: I0319 12:10:22.568847 29612 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 12:10:22.568898 master-0 kubenswrapper[29612]: I0319 12:10:22.568881 29612 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 12:10:22.569569 master-0 kubenswrapper[29612]: I0319 12:10:22.568922 29612 state_mem.go:36] "Initialized new in-memory state store" Mar 19 12:10:22.569569 master-0 kubenswrapper[29612]: I0319 12:10:22.569127 29612 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 19 12:10:22.569569 master-0 kubenswrapper[29612]: I0319 12:10:22.569141 29612 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 19 12:10:22.569569 master-0 kubenswrapper[29612]: I0319 12:10:22.569168 29612 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 19 12:10:22.569569 master-0 kubenswrapper[29612]: I0319 12:10:22.569176 29612 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 19 12:10:22.569569 master-0 kubenswrapper[29612]: I0319 12:10:22.569184 29612 policy_none.go:49] "None policy: Start" Mar 19 12:10:22.592897 master-0 kubenswrapper[29612]: I0319 12:10:22.592597 29612 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 12:10:22.592897 master-0 kubenswrapper[29612]: I0319 12:10:22.592783 29612 state_mem.go:35] "Initializing new in-memory state store" Mar 19 12:10:22.593382 master-0 kubenswrapper[29612]: I0319 12:10:22.593354 
29612 state_mem.go:75] "Updated machine memory state" Mar 19 12:10:22.593382 master-0 kubenswrapper[29612]: I0319 12:10:22.593381 29612 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 19 12:10:22.610847 master-0 kubenswrapper[29612]: I0319 12:10:22.610797 29612 manager.go:334] "Starting Device Plugin manager" Mar 19 12:10:22.611066 master-0 kubenswrapper[29612]: I0319 12:10:22.610861 29612 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 12:10:22.611066 master-0 kubenswrapper[29612]: I0319 12:10:22.610875 29612 server.go:79] "Starting device plugin registration server" Mar 19 12:10:22.611370 master-0 kubenswrapper[29612]: I0319 12:10:22.611336 29612 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 12:10:22.611434 master-0 kubenswrapper[29612]: I0319 12:10:22.611359 29612 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 12:10:22.611553 master-0 kubenswrapper[29612]: I0319 12:10:22.611529 29612 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 12:10:22.611708 master-0 kubenswrapper[29612]: I0319 12:10:22.611625 29612 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 12:10:22.611708 master-0 kubenswrapper[29612]: I0319 12:10:22.611659 29612 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 19 12:10:22.711804 master-0 kubenswrapper[29612]: I0319 12:10:22.711678 29612 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 12:10:22.714429 master-0 kubenswrapper[29612]: I0319 12:10:22.714386 29612 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 19 12:10:22.714429 master-0 kubenswrapper[29612]: I0319 12:10:22.714421 29612 kubelet_node_status.go:724] "Recording event 
message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 19 12:10:22.714429 master-0 kubenswrapper[29612]: I0319 12:10:22.714429 29612 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 19 12:10:22.714677 master-0 kubenswrapper[29612]: I0319 12:10:22.714501 29612 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 19 12:10:22.725766 master-0 kubenswrapper[29612]: I0319 12:10:22.725723 29612 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 19 12:10:22.725962 master-0 kubenswrapper[29612]: I0319 12:10:22.725835 29612 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 19 12:10:24.212382 master-0 kubenswrapper[29612]: I0319 12:10:24.212293 29612 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 12:10:24.212970 master-0 kubenswrapper[29612]: I0319 12:10:24.212830 29612 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x","openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z","openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg","openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g","openshift-marketplace/redhat-marketplace-6zxlv","openshift-kube-controller-manager/installer-4-master-0","openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj","openshift-kube-controller-manager/installer-2-master-0","openshift-service-ca/service-ca-79bc6b8d76-pdjtv","openshift-ingress-operator/ingress-operator-66b84d69b-59ptg","openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5","openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw","openshift-etcd/installer-1-master-0","openshift-etcd/installer-2-master-0","openshift-kube-scheduler/installer-5-master-0","openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8","openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b","openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9","openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498","openshift-ovn-kubernetes/ovnkube-node-hm6q6","openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt","openshift-dns/dns-default-pgl92","openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6","openshift-dns-operator/dns-operator-9c5679d8f-m5dt4","openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj","openshift-machine-config-operator/machine-config-server-x66j8","openshift-monitoring/metrics-server-dc85487cf-gsmlh","openshift-network-operator/iptables-alerter-j6n2j","openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6","openshift-controller-manager/controller-manager-79589db566-gb7tq","openshift-kube-apiserver/installer-3-master-0","openshift-kube-controller-manager/installer-3-master-0","openshift-marketplace/community-operato
rs-tzckk","openshift-multus/multus-additional-cni-plugins-n5wqf","openshift-multus/network-metrics-daemon-nkd77","openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4","openshift-ingress-canary/ingress-canary-tzfl9","openshift-ingress/router-default-7dcf5569b5-47tqf","openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b","openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq","openshift-marketplace/certified-operators-4h84v","openshift-marketplace/marketplace-operator-89ccd998f-4h6xn","openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg","openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw","openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp","openshift-dns/node-resolver-j2kp7","openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g","openshift-insights/insights-operator-68bf6ff9d6-jn2px","openshift-kube-storage-version-migrator/migrator-8487694857-52lv2","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb","openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx","openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj","openshift-marketplace/redhat-operators-6ql8d","openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj","openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp","openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm","openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j","openshift-network-diagnostics/network-check-target-tz4qg","assisted-installer/assisted-installer-controller-r6tsg","openshift-apiserver/apiserver-66dcf44d9d-46w4r","openshift-monitoring/node-exporter-d48mb","openshift-machine-config-operator/machine-config-daemon-tmgtz","openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv","openshift-network-operator/network-operator-7bd846bfc4-5qtx7","openshift-etcd/etcd-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-multus/multus-p5wll","openshift-network-node-identity/network-node-identity-47j68","openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4","openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t","openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv","openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf","openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m","openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht","openshift-kube-apiserver/installer-1-master-0","openshift-kube-scheduler/installer-4-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8","openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj","openshift-cluster-node-tuning-operator/tuned-9w65f","openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"]
Mar 19 12:10:24.213527 master-0 kubenswrapper[29612]: I0319 12:10:24.213440 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6tsg"
Mar 19 12:10:24.231977 master-0 kubenswrapper[29612]: I0319 12:10:24.231538 29612 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="27b79ae3-e8c0-4562-b1dc-164c9cc7ebb6"
Mar 19 12:10:24.231977 master-0 kubenswrapper[29612]: I0319 12:10:24.231726 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 12:10:24.232256 master-0 kubenswrapper[29612]: I0319 12:10:24.232148 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 12:10:24.236880 master-0 kubenswrapper[29612]: I0319 12:10:24.236821 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 12:10:24.237494 master-0 kubenswrapper[29612]: I0319 12:10:24.237474 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.237723 master-0 kubenswrapper[29612]: I0319 12:10:24.237680 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.238565 master-0 kubenswrapper[29612]: I0319 12:10:24.238530 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Mar 19 12:10:24.239108 master-0 kubenswrapper[29612]: I0319 12:10:24.239078 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 12:10:24.245119 master-0 kubenswrapper[29612]: I0319 12:10:24.240524 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 19 12:10:24.245119 master-0 kubenswrapper[29612]: I0319 12:10:24.241785 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 19 12:10:24.245119 master-0 kubenswrapper[29612]: I0319 12:10:24.242818 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 12:10:24.245119 master-0 kubenswrapper[29612]: I0319 12:10:24.242870 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.245119 master-0 kubenswrapper[29612]: I0319 12:10:24.243250 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 12:10:24.245119 master-0 kubenswrapper[29612]: I0319 12:10:24.243419 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.245119 master-0 kubenswrapper[29612]: I0319 12:10:24.243587 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 12:10:24.245119 master-0 kubenswrapper[29612]: I0319 12:10:24.245123 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.245657 master-0 kubenswrapper[29612]: I0319 12:10:24.245474 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 12:10:24.245783 master-0 kubenswrapper[29612]: I0319 12:10:24.245749 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.246069 master-0 kubenswrapper[29612]: I0319 12:10:24.246034 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 12:10:24.246169 master-0 kubenswrapper[29612]: I0319 12:10:24.246142 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 12:10:24.246215 master-0 kubenswrapper[29612]: I0319 12:10:24.246063 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 12:10:24.246330 master-0 kubenswrapper[29612]: I0319 12:10:24.246310 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 19 12:10:24.246379 master-0 kubenswrapper[29612]: I0319 12:10:24.246338 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 12:10:24.246379 master-0 kubenswrapper[29612]: I0319 12:10:24.246364 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.246477 master-0 kubenswrapper[29612]: I0319 12:10:24.246460 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 12:10:24.246525 master-0 kubenswrapper[29612]: I0319 12:10:24.246490 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.246660 master-0 kubenswrapper[29612]: I0319 12:10:24.246627 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 12:10:24.246721 master-0 kubenswrapper[29612]: I0319 12:10:24.246679 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 12:10:24.246761 master-0 kubenswrapper[29612]: I0319 12:10:24.246729 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 12:10:24.246761 master-0 kubenswrapper[29612]: I0319 12:10:24.246749 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 12:10:24.246822 master-0 kubenswrapper[29612]: I0319 12:10:24.246796 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.246895 master-0 kubenswrapper[29612]: I0319 12:10:24.246873 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.246961 master-0 kubenswrapper[29612]: I0319 12:10:24.246919 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 12:10:24.246961 master-0 kubenswrapper[29612]: I0319 12:10:24.246928 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.247046 master-0 kubenswrapper[29612]: I0319 12:10:24.246970 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 12:10:24.247046 master-0 kubenswrapper[29612]: I0319 12:10:24.246970 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 12:10:24.247046 master-0 kubenswrapper[29612]: I0319 12:10:24.247021 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 12:10:24.247169 master-0 kubenswrapper[29612]: I0319 12:10:24.247069 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 12:10:24.247169 master-0 kubenswrapper[29612]: I0319 12:10:24.247130 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 12:10:24.247169 master-0 kubenswrapper[29612]: I0319 12:10:24.247147 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 12:10:24.247266 master-0 kubenswrapper[29612]: I0319 12:10:24.247176 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.247266 master-0 kubenswrapper[29612]: I0319 12:10:24.247227 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 12:10:24.247322 master-0 kubenswrapper[29612]: I0319 12:10:24.247274 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 12:10:24.247351 master-0 kubenswrapper[29612]: I0319 12:10:24.247324 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.247351 master-0 kubenswrapper[29612]: I0319 12:10:24.247343 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.247415 master-0 kubenswrapper[29612]: I0319 12:10:24.247379 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 12:10:24.247538 master-0 kubenswrapper[29612]: I0319 12:10:24.247517 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 12:10:24.247583 master-0 kubenswrapper[29612]: I0319 12:10:24.247574 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 12:10:24.247721 master-0 kubenswrapper[29612]: I0319 12:10:24.247692 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 12:10:24.247788 master-0 kubenswrapper[29612]: I0319 12:10:24.247738 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.247832 master-0 kubenswrapper[29612]: I0319 12:10:24.246496 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 19 12:10:24.247918 master-0 kubenswrapper[29612]: I0319 12:10:24.246876 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.247977 master-0 kubenswrapper[29612]: I0319 12:10:24.247916 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 19 12:10:24.248028 master-0 kubenswrapper[29612]: I0319 12:10:24.248008 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248029 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248138 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248299 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248427 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248431 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248543 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.247917 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248660 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248762 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248819 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248867 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.248979 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.249183 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.249301 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.249403 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.249516 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.249676 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.249782 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.249896 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.249995 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250104 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250211 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250347 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250447 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250544 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250567 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250702 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250739 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250747 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250935 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.250972 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.251743 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.252839 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.253384 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.253678 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.253747 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.253950 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 12:10:24.256789 master-0 kubenswrapper[29612]: I0319 12:10:24.254848 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 12:10:24.258044 master-0 kubenswrapper[29612]: I0319 12:10:24.257668 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 12:10:24.261762 master-0 kubenswrapper[29612]: I0319 12:10:24.260181 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0"
Mar 19 12:10:24.261762 master-0 kubenswrapper[29612]: I0319 12:10:24.261164 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 19 12:10:24.264265 master-0 kubenswrapper[29612]: I0319 12:10:24.264241 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 12:10:24.276762 master-0 kubenswrapper[29612]: I0319 12:10:24.273885 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 12:10:24.276762 master-0 kubenswrapper[29612]: E0319 12:10:24.274788 29612 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 12:10:24.276762 master-0 kubenswrapper[29612]: E0319 12:10:24.274851 29612 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"openshift-kube-scheduler-master-0\" already exists" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:24.276762 master-0 kubenswrapper[29612]: E0319 12:10:24.275982 29612 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-master-0\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:24.277048 master-0 kubenswrapper[29612]: E0319 12:10:24.276821 29612 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:24.282551 master-0 kubenswrapper[29612]: I0319 12:10:24.282330 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 12:10:24.285728 master-0 kubenswrapper[29612]: I0319 12:10:24.285674 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 12:10:24.287712 master-0 kubenswrapper[29612]: I0319 12:10:24.287677 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-utilities\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 12:10:24.287805 master-0 kubenswrapper[29612]: I0319 12:10:24.287719 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-catalog-content\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 12:10:24.287805 master-0 kubenswrapper[29612]: I0319 12:10:24.287744 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7bl2\" (UniqueName: \"kubernetes.io/projected/d979be56-f948-4252-ab6b-9bfc48dc4187-kube-api-access-w7bl2\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 12:10:24.288038 master-0 kubenswrapper[29612]: I0319 12:10:24.288013 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-utilities\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 12:10:24.288202 master-0 kubenswrapper[29612]: I0319 12:10:24.288151 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d979be56-f948-4252-ab6b-9bfc48dc4187-catalog-content\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 12:10:24.321133 master-0 kubenswrapper[29612]: I0319 12:10:24.309911 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 19 12:10:24.321133 master-0 kubenswrapper[29612]: I0319 12:10:24.310119 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 12:10:24.321133 master-0 kubenswrapper[29612]: I0319 12:10:24.315348 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 12:10:24.321133 master-0 kubenswrapper[29612]: I0319 12:10:24.319888 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 12:10:24.324852 master-0 kubenswrapper[29612]: I0319 12:10:24.324602 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 12:10:24.325069 master-0 kubenswrapper[29612]: I0319 12:10:24.324979 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" event={"ID":"7af28c70-3ee0-438b-8655-95bc57deaa05","Type":"ContainerStarted","Data":"d60d501384cd053df7d86d6b632c3daf437c4d9f2ef724d350e9c0d1c254c4aa"}
Mar 19 12:10:24.325957 master-0 kubenswrapper[29612]: I0319 12:10:24.325181 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 19 12:10:24.325957 master-0 kubenswrapper[29612]: I0319 12:10:24.325446 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 19 12:10:24.326222 master-0 kubenswrapper[29612]: I0319 12:10:24.326129 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0"
Mar 19 12:10:24.326521 master-0 kubenswrapper[29612]: I0319 12:10:24.326403 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0"
Mar 19 12:10:24.334873 master-0 kubenswrapper[29612]: I0319 12:10:24.334832 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49fac1b46a11e49501805e891baae4a9" path="/var/lib/kubelet/pods/49fac1b46a11e49501805e891baae4a9/volumes"
Mar 19 12:10:24.335398 master-0 kubenswrapper[29612]: I0319 12:10:24.335384 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 19 12:10:24.336208 master-0 kubenswrapper[29612]: I0319 12:10:24.336167 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 12:10:24.355763 master-0 kubenswrapper[29612]: I0319 12:10:24.355722 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 12:10:24.374993 master-0 kubenswrapper[29612]: I0319 12:10:24.374936 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 12:10:24.377944 master-0 kubenswrapper[29612]: I0319 12:10:24.377917 29612 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 19 12:10:24.388488 master-0 kubenswrapper[29612]: I0319 12:10:24.388418 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4ms4\" (UniqueName: \"kubernetes.io/projected/b0133cb5-e425-40af-a051-71b2362a89c3-kube-api-access-w4ms4\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7"
Mar 19 12:10:24.388488 master-0 kubenswrapper[29612]: I0319 12:10:24.388475 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgtrh\" (UniqueName: \"kubernetes.io/projected/64136215-8539-4349-977f-6d59deb132ea-kube-api-access-cgtrh\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j"
Mar 19 12:10:24.388488 master-0 kubenswrapper[29612]: I0319 12:10:24.388498 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac946049-4b93-4d8c-bfa6-eacf83160ed1-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"
Mar 19 12:10:24.388819 master-0 kubenswrapper[29612]: I0319 12:10:24.388518 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac946049-4b93-4d8c-bfa6-eacf83160ed1-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"
Mar 19 12:10:24.388819 master-0 kubenswrapper[29612]: I0319 12:10:24.388676 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-auth-proxy-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m"
Mar 19 12:10:24.388819 master-0 kubenswrapper[29612]: I0319 12:10:24.388761 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-proxy-tls\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz"
Mar 19 12:10:24.388819 master-0 kubenswrapper[29612]: I0319 12:10:24.388788 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-serving-cert\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.389001 master-0 kubenswrapper[29612]: I0319 12:10:24.388983 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07804be1-d37c-474e-a179-01ac541d9705-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389013 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-trusted-ca-bundle\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389071 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389096 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389116 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-node-pullsecrets\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389134 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-config\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389150 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389170 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-images\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389008 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac946049-4b93-4d8c-bfa6-eacf83160ed1-serving-cert\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389190 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389207 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-bin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389234 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d017409-a362-4e58-87af-d02b3834fdd4-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv"
Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389250 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-tls\") pod
\"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389274 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389133 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-serving-cert\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389352 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-ovnkube-identity-cm\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389337 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07804be1-d37c-474e-a179-01ac541d9705-trusted-ca\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389411 29612 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6e8c4676-05af-4478-a44b-1bdcbefdea0f-srv-cert\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389413 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-os-release\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389443 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-bound-sa-token\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389474 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-operand-assets\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 12:10:24.389590 master-0 kubenswrapper[29612]: I0319 12:10:24.389539 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-config\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " 
pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.389613 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d009a3-5ff2-49f9-9cea-004b99729b77-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.389687 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-operand-assets\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.389737 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwmkf\" (UniqueName: \"kubernetes.io/projected/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-kube-api-access-nwmkf\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.389738 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d017409-a362-4e58-87af-d02b3834fdd4-serving-cert\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 12:10:24.390493 master-0 
kubenswrapper[29612]: I0319 12:10:24.389872 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7qsg\" (UniqueName: \"kubernetes.io/projected/33f6e54f-a750-4c91-b4f5-049275d33cfb-kube-api-access-f7qsg\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.389919 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac946049-4b93-4d8c-bfa6-eacf83160ed1-config\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.389961 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390000 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390020 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-images\") pod 
\"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390038 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390055 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-encryption-config\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390105 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390129 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390158 29612 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f2278c1-05cf-469c-a13c-762b7a790b38-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390183 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-audit-log\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390204 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-utilities\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390228 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysconfig\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390254 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64sr2\" (UniqueName: \"kubernetes.io/projected/819c8a76-019b-4124-a67f-fb5838cbec5d-kube-api-access-64sr2\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " 
pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390329 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8117f6b0-0b9f-4a1d-9807-bf73c19f388b-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-52ww6\" (UID: \"8117f6b0-0b9f-4a1d-9807-bf73c19f388b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390347 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-tmp\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390367 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390390 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390412 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/46108693-3c9e-4804-beb0-d6b8f8df7625-service-ca-bundle\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390436 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-certs\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390461 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5148a50-4332-4ab4-af63-449ef7f16333-hosts-file\") pod \"node-resolver-j2kp7\" (UID: \"f5148a50-4332-4ab4-af63-449ef7f16333\") " pod="openshift-dns/node-resolver-j2kp7" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390482 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 19 12:10:24.390493 master-0 kubenswrapper[29612]: I0319 12:10:24.390504 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgr4k\" (UniqueName: \"kubernetes.io/projected/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-kube-api-access-mgr4k\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390526 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-stats-auth\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390575 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-bin\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390601 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e617e515-a899-41e8-9961-df68e49da4d3-ovn-node-metrics-cert\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390656 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-snapshots\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390755 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-host\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390788 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390813 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-mcd-auth-proxy-config\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390840 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-serving-cert\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390885 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cni-binary-copy\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390913 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-cabundle\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " 
pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390938 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-sys\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.390981 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391005 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frf6\" (UniqueName: \"kubernetes.io/projected/07eef031-edcc-415c-91d7-f9f3bd0a45e3-kube-api-access-6frf6\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391033 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-utilities\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391045 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-sys\") pod \"node-exporter-d48mb\" (UID: 
\"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391121 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-snapshots\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391182 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391212 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-script-lib\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391238 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391263 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391291 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f3c0234-248a-4d6f-9aca-6582a481a911-serving-cert\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391319 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4khz\" (UniqueName: \"kubernetes.io/projected/a5d009a3-5ff2-49f9-9cea-004b99729b77-kube-api-access-z4khz\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391348 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-serving-cert\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391374 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391401 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-config\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391406 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-tmp\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391427 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64136215-8539-4349-977f-6d59deb132ea-iptables-alerter-script\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391452 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5cf6404-1640-43d3-8018-03c62c8338c5-metrics-tls\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391612 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-serving-cert\") pod 
\"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391727 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-audit-log\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391772 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f3c0234-248a-4d6f-9aca-6582a481a911-serving-cert\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391782 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-srv-cert\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391850 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd74j\" (UniqueName: \"kubernetes.io/projected/24bd6a9b-9934-4cf9-b591-1ab892c8014c-kube-api-access-jd74j\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391881 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-socket-dir-parent\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391911 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-serving-ca\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391922 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cni-binary-copy\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391947 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391964 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac946049-4b93-4d8c-bfa6-eacf83160ed1-config\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.391976 
29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5cs\" (UniqueName: \"kubernetes.io/projected/4d790b42-2182-4b9c-8391-285e938715cc-kube-api-access-zh5cs\") pod \"migrator-8487694857-52lv2\" (UID: \"4d790b42-2182-4b9c-8391-285e938715cc\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392064 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t85jj\" (UniqueName: \"kubernetes.io/projected/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-kube-api-access-t85jj\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392094 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-serving-cert\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392121 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/58a4e3af-c5e1-4487-92a2-81dfb696695e-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392201 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h95pm\" (UniqueName: 
\"kubernetes.io/projected/162b513f-2f67-4946-816b-48f6232dfca0-kube-api-access-h95pm\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392207 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-config\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392232 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rhj\" (UniqueName: \"kubernetes.io/projected/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-kube-api-access-m6rhj\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392267 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-images\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392279 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cert\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " 
pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392295 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392363 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-tmpfs\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392391 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392413 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-dir\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392440 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-tls\") 
pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392464 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69pp5\" (UniqueName: \"kubernetes.io/projected/4ca74427-9df7-44ca-ae91-0162f402644b-kube-api-access-69pp5\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392489 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392514 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxjkt\" (UniqueName: \"kubernetes.io/projected/0c804ea0-3483-4506-b138-3bf30016d8d8-kube-api-access-pxjkt\") pod \"csi-snapshot-controller-64854d9cff-6dlpw\" (UID: \"0c804ea0-3483-4506-b138-3bf30016d8d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392609 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 
12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392657 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-serving-ca\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392688 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392705 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-tmpfs\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392715 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392741 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod 
\"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392773 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98qv\" (UniqueName: \"kubernetes.io/projected/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-kube-api-access-s98qv\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392803 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392829 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m4gg\" (UniqueName: \"kubernetes.io/projected/7af28c70-3ee0-438b-8655-95bc57deaa05-kube-api-access-2m4gg\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392857 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-config\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392886 
29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392920 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcc25a4a-521a-4139-a10c-d5b8bba6e359-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: \"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.392994 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/07804be1-d37c-474e-a179-01ac541d9705-image-registry-operator-tls\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393078 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393094 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393108 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-modprobe-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393136 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/58a4e3af-c5e1-4487-92a2-81dfb696695e-cache\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393163 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v4xb\" (UniqueName: \"kubernetes.io/projected/3a185e40-e97b-49e9-aea3-5f170401a9e3-kube-api-access-2v4xb\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393190 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0133cb5-e425-40af-a051-71b2362a89c3-metrics-tls\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 12:10:24.393781 master-0 
kubenswrapper[29612]: I0319 12:10:24.393391 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393418 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvp77\" (UniqueName: \"kubernetes.io/projected/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-kube-api-access-rvp77\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393431 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-serving-cert\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393517 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5z4r\" (UniqueName: \"kubernetes.io/projected/a49a8a5e-23fb-46fb-b184-e09da4817028-kube-api-access-k5z4r\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393545 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-log-socket\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393570 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393600 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393670 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393712 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0133cb5-e425-40af-a051-71b2362a89c3-metrics-tls\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " 
pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393757 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f99wt\" (UniqueName: \"kubernetes.io/projected/6d017409-a362-4e58-87af-d02b3834fdd4-kube-api-access-f99wt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393794 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/58a4e3af-c5e1-4487-92a2-81dfb696695e-cache\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393821 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4ca74427-9df7-44ca-ae91-0162f402644b-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393849 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-config\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 12:10:24.393781 master-0 kubenswrapper[29612]: I0319 12:10:24.393851 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jstrq\" (UniqueName: \"kubernetes.io/projected/f93ee7db-f0ed-470f-a458-86c54e6d064a-kube-api-access-jstrq\") pod \"network-check-source-b4bf74f6-5jjzp\" (UID: \"f93ee7db-f0ed-470f-a458-86c54e6d064a\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.393910 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdmj8\" (UniqueName: \"kubernetes.io/projected/6e8c4676-05af-4478-a44b-1bdcbefdea0f-kube-api-access-pdmj8\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.393974 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4ca74427-9df7-44ca-ae91-0162f402644b-volume-directive-shadow\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.393984 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e5577e-df8b-46a2-aff7-0bf90b5010d9-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394015 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-d\") pod \"tuned-9w65f\" (UID: 
\"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394051 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394066 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-policies\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394094 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-catalog-content\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394123 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4qb9\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-kube-api-access-g4qb9\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394446 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/84e5577e-df8b-46a2-aff7-0bf90b5010d9-serving-cert\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394485 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-config\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394505 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394531 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r97ml\" (UniqueName: \"kubernetes.io/projected/63e81c6a-b723-4c9a-b471-226fefa9c7a7-kube-api-access-r97ml\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394560 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-utilities\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv" Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 
12:10:24.394602 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwp24\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-kube-api-access-bwp24\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394626 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgsm4\" (UniqueName: \"kubernetes.io/projected/46108693-3c9e-4804-beb0-d6b8f8df7625-kube-api-access-wgsm4\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394663 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-catalog-content\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394669 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-conf\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394765 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-utilities\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394778 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-run\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394838 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394887 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-config\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394896 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-systemd-units\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394935 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51e2c1d1-a50f-4a0d-8273-440e699b3907-webhook-cert\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.394960 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d3b2df-3d81-413e-b039-80dc20111eae-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395030 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-hostroot\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395076 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395111 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h4cz\" (UniqueName: \"kubernetes.io/projected/4e7b04dc-e85d-41b8-959c-87b1b24731a1-kube-api-access-9h4cz\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395132 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5cc5\" (UniqueName: \"kubernetes.io/projected/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-kube-api-access-v5cc5\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395236 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/78d3b2df-3d81-413e-b039-80dc20111eae-trusted-ca\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395270 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-ovn\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395297 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb274dd5-e98e-4485-a050-7bec560a0daa-proxy-tls\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395300 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6e54f-a750-4c91-b4f5-049275d33cfb-metrics-tls\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395326 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f65be54c-d851-4daa-93a0-8a3947da5e26-service-ca\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395351 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-key\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395385 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-config\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395405 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-images\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395428 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ntj9\" (UniqueName: \"kubernetes.io/projected/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-kube-api-access-8ntj9\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395447 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395465 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvpz\" (UniqueName: \"kubernetes.io/projected/eb274dd5-e98e-4485-a050-7bec560a0daa-kube-api-access-jsvpz\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395481 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65be54c-d851-4daa-93a0-8a3947da5e26-serving-cert\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395655 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-key\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395678 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e90c10f6-36d9-44bd-a4c6-174f12135434-images\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395683 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395743 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-os-release\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395767 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-etc-kubernetes\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395791 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395808 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395821 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395849 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bssz9\" (UniqueName: \"kubernetes.io/projected/4705c8f3-89d4-42f2-ad14-e591c5380b9e-kube-api-access-bssz9\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395875 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-audit\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.395903 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dp9\" (UniqueName: \"kubernetes.io/projected/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-kube-api-access-f7dp9\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396037 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-kubelet\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396050 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-service-ca\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396068 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-node-log\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396091 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396114 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-multus-certs\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396114 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-audit\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396136 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-trusted-ca-bundle\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396160 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396210 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d017409-a362-4e58-87af-d02b3834fdd4-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396245 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9vzl\" (UniqueName: \"kubernetes.io/projected/84e5577e-df8b-46a2-aff7-0bf90b5010d9-kube-api-access-k9vzl\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396266 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctghm\" (UniqueName: \"kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-kube-api-access-ctghm\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396292 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-env-overrides\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396318 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fmb\" (UniqueName: \"kubernetes.io/projected/39f479b3-45ff-43f9-ae85-083554a54918-kube-api-access-79fmb\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396342 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-textfile\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396365 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h6qg\" (UniqueName: \"kubernetes.io/projected/b92c4c09-a796-4e00-821a-32fc9c8ca214-kube-api-access-6h6qg\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396373 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f38df702-a256-4f83-90bb-0c0e477a2ce6-metrics-tls\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396389 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f2d1d03-7427-45a0-a917-173bf9df9902-metrics-client-ca\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396580 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-textfile\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396660 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-trusted-ca-bundle\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396790 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-etc-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396814 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/162b513f-2f67-4946-816b-48f6232dfca0-serving-cert\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396832 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5cf6404-1640-43d3-8018-03c62c8338c5-config-volume\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396849 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396854 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d017409-a362-4e58-87af-d02b3834fdd4-config\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396868 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.396906 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bptbc\" (UniqueName: \"kubernetes.io/projected/e617e515-a899-41e8-9961-df68e49da4d3-kube-api-access-bptbc\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397169 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vhrb\" (UniqueName: \"kubernetes.io/projected/bcc25a4a-521a-4139-a10c-d5b8bba6e359-kube-api-access-2vhrb\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: \"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397203 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-client\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397220 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/162b513f-2f67-4946-816b-48f6232dfca0-serving-cert\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397227 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-machine-approver-tls\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397254 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-kubernetes\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397292 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397410 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-catalog-content\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397438 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8765q\" (UniqueName: \"kubernetes.io/projected/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-kube-api-access-8765q\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397449 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-trusted-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397464 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4kn\" (UniqueName: \"kubernetes.io/projected/78d3b2df-3d81-413e-b039-80dc20111eae-kube-api-access-mm4kn\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397487 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397509 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-config\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397519 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d625e92-793e-4942-a826-d30d21e3ca0c-catalog-content\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397529 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gczvq\" (UniqueName: \"kubernetes.io/projected/4d625e92-793e-4942-a826-d30d21e3ca0c-kube-api-access-gczvq\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397560 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a185e40-e97b-49e9-aea3-5f170401a9e3-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397585 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397610 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8qjv\" (UniqueName: \"kubernetes.io/projected/57d19c68-b469-4a6f-abe0-3a0eff32639c-kube-api-access-v8qjv\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397667 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397691 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397747 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-lib-modules\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397777 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cvpt\" (UniqueName: \"kubernetes.io/projected/3f3c0234-248a-4d6f-9aca-6582a481a911-kube-api-access-8cvpt\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw"
Mar 19 12:10:24.397664 master-0 kubenswrapper[29612]: I0319 12:10:24.397799 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.397823 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.397841 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.397859 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b0133cb5-e425-40af-a051-71b2362a89c3-host-etc-kube\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.397883 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sprd\" (UniqueName: \"kubernetes.io/projected/131e2573-ef74-4963-bdcf-901310e7a486-kube-api-access-4sprd\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.397877 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-trusted-ca\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.397904 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-node-bootstrap-token\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.397933 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.397957 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.397976 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398063 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64136215-8539-4349-977f-6d59deb132ea-host-slash\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398093 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j"
Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398115 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-env-overrides\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.402517
master-0 kubenswrapper[29612]: I0319 12:10:24.398133 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398219 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398231 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd875\" (UniqueName: \"kubernetes.io/projected/99ffcc9a-4b01-439a-93e0-0674bdfd32ec-kube-api-access-jd875\") pod \"csi-snapshot-controller-operator-5f5d689c6b-8nptp\" (UID: \"99ffcc9a-4b01-439a-93e0-0674bdfd32ec\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398265 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2knz\" (UniqueName: \"kubernetes.io/projected/e90c10f6-36d9-44bd-a4c6-174f12135434-kube-api-access-r2knz\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398304 29612 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-env-overrides\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398327 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398381 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398403 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38df702-a256-4f83-90bb-0c0e477a2ce6-trusted-ca\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398425 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwcn\" (UniqueName: \"kubernetes.io/projected/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-kube-api-access-rbwcn\") pod 
\"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398470 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398530 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398676 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398708 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " 
pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398736 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65be54c-d851-4daa-93a0-8a3947da5e26-kube-api-access\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398774 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b48q2\" (UniqueName: \"kubernetes.io/projected/a5cf6404-1640-43d3-8018-03c62c8338c5-kube-api-access-b48q2\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398799 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38df702-a256-4f83-90bb-0c0e477a2ce6-trusted-ca\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398799 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c0234-248a-4d6f-9aca-6582a481a911-config\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398848 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-proxy-tls\") pod 
\"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398875 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-var-lib-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398924 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398948 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.398979 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/131e2573-ef74-4963-bdcf-901310e7a486-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 
12:10:24.398999 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2278c1-05cf-469c-a13c-762b7a790b38-cert\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399020 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjj2t\" (UniqueName: \"kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-kube-api-access-gjj2t\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399040 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-image-import-ca\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399041 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f3c0234-248a-4d6f-9aca-6582a481a911-config\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399079 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/367ea4e8-8172-4730-8f26-499ed7c167bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: 
\"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399121 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-daemon-config\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399151 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bea42722-d7af-4938-9f3a-da57bd056f96-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399171 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-image-import-ca\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399185 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khlg\" (UniqueName: \"kubernetes.io/projected/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-kube-api-access-4khlg\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399213 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/131e2573-ef74-4963-bdcf-901310e7a486-telemetry-config\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399213 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-rootfs\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399262 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-utilities\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399285 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cnibin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399310 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-metrics-certs\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399317 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-daemon-config\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399330 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-netns\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399368 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399399 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9g5b\" (UniqueName: \"kubernetes.io/projected/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-kube-api-access-x9g5b\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399419 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: 
\"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399438 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/367ea4e8-8172-4730-8f26-499ed7c167bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399444 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24bd6a9b-9934-4cf9-b591-1ab892c8014c-utilities\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399617 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/367ea4e8-8172-4730-8f26-499ed7c167bb-serving-cert\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399701 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 
12:10:24.399742 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399778 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399829 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-system-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399864 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5f65\" (UniqueName: \"kubernetes.io/projected/fbde1a69-b56c-467e-a291-9a83c448346f-kube-api-access-j5f65\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399894 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea42722-d7af-4938-9f3a-da57bd056f96-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: 
\"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399922 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e5577e-df8b-46a2-aff7-0bf90b5010d9-config\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.399975 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-client\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400016 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-wtmp\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400026 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f479b3-45ff-43f9-ae85-083554a54918-marketplace-operator-metrics\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: \"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400052 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400084 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/367ea4e8-8172-4730-8f26-499ed7c167bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400111 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400121 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84e5577e-df8b-46a2-aff7-0bf90b5010d9-config\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400141 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9zjb\" (UniqueName: \"kubernetes.io/projected/f5148a50-4332-4ab4-af63-449ef7f16333-kube-api-access-p9zjb\") pod \"node-resolver-j2kp7\" (UID: \"f5148a50-4332-4ab4-af63-449ef7f16333\") " 
pod="openshift-dns/node-resolver-j2kp7" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400149 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bea42722-d7af-4938-9f3a-da57bd056f96-serving-cert\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400173 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400200 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rztqj\" (UniqueName: \"kubernetes.io/projected/0f2278c1-05cf-469c-a13c-762b7a790b38-kube-api-access-rztqj\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400215 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/131e2573-ef74-4963-bdcf-901310e7a486-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400227 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea42722-d7af-4938-9f3a-da57bd056f96-config\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400259 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxt2h\" (UniqueName: \"kubernetes.io/projected/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-kube-api-access-pxt2h\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400281 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/367ea4e8-8172-4730-8f26-499ed7c167bb-config\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400285 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8mds\" (UniqueName: \"kubernetes.io/projected/51e2c1d1-a50f-4a0d-8273-440e699b3907-kube-api-access-j8mds\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400329 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: 
\"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400360 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-multus\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400394 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400431 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-tuned\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400464 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmbbm\" (UniqueName: \"kubernetes.io/projected/003e840c-1b07-4624-8225-e4ffdab74213-kube-api-access-hmbbm\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400494 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-client\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400497 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-tuned\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400518 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-slash\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400538 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bea42722-d7af-4938-9f3a-da57bd056f96-config\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400547 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400578 
29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400612 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cnibin\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400619 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: \"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400651 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400678 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-kubelet\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400703 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-encryption-config\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400755 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aab47a39-a18b-4350-b141-4d5de2d8e96f-etcd-client\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400754 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-default-certificate\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400791 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/78d3b2df-3d81-413e-b039-80dc20111eae-apiservice-cert\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400799 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400836 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-catalog-content\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400795 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-package-server-manager-serving-cert\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400865 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-system-cni-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400901 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-k8s-cni-cncf-io\") pod 
\"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400912 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/003e840c-1b07-4624-8225-e4ffdab74213-catalog-content\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400923 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-systemd\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400945 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4705c8f3-89d4-42f2-ad14-e591c5380b9e-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400965 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.400984 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401004 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-audit-dir\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401108 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b92c4c09-a796-4e00-821a-32fc9c8ca214-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401117 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4705c8f3-89d4-42f2-ad14-e591c5380b9e-available-featuregates\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401128 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7cz7\" (UniqueName: \"kubernetes.io/projected/5f2d1d03-7427-45a0-a917-173bf9df9902-kube-api-access-z7cz7\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 
12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401213 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705c8f3-89d4-42f2-ad14-e591c5380b9e-serving-cert\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401217 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/162b513f-2f67-4946-816b-48f6232dfca0-service-ca-bundle\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401235 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401258 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57d19c68-b469-4a6f-abe0-3a0eff32639c-cert\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401276 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09282e98-3140-41c9-863f-67e0e9f942cc-metrics-client-ca\") pod 
\"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401295 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-netd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401314 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401335 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxm2c\" (UniqueName: \"kubernetes.io/projected/c479beae-6d34-4d49-81c8-39f6a53ff6ca-kube-api-access-xxm2c\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401356 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kf8c\" (UniqueName: \"kubernetes.io/projected/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-kube-api-access-9kf8c\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401372 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401392 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401410 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705c8f3-89d4-42f2-ad14-e591c5380b9e-serving-cert\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401476 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpdhv\" (UniqueName: \"kubernetes.io/projected/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-kube-api-access-hpdhv\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401578 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/e90c10f6-36d9-44bd-a4c6-174f12135434-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " 
pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401583 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-systemd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.402517 master-0 kubenswrapper[29612]: I0319 12:10:24.401663 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/60081e3b-4fe5-42a3-bfae-0b07853c0e36-cache\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401697 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401728 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401756 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-conf-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401782 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-webhook-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401789 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/60081e3b-4fe5-42a3-bfae-0b07853c0e36-cache\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401808 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9hnq\" (UniqueName: \"kubernetes.io/projected/09282e98-3140-41c9-863f-67e0e9f942cc-kube-api-access-x9hnq\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401833 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-apiservice-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " 
pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401855 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-root\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401877 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-netns\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401899 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401925 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401947 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-var-lib-kubelet\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401972 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.401998 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-config\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.402033 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.402060 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fbde1a69-b56c-467e-a291-9a83c448346f-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 12:10:24.407809 master-0 
kubenswrapper[29612]: I0319 12:10:24.402096 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn94k\" (UniqueName: \"kubernetes.io/projected/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-kube-api-access-rn94k\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.402137 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scp9p\" (UniqueName: \"kubernetes.io/projected/aab47a39-a18b-4350-b141-4d5de2d8e96f-kube-api-access-scp9p\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.402405 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab47a39-a18b-4350-b141-4d5de2d8e96f-config\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" Mar 19 12:10:24.407809 master-0 kubenswrapper[29612]: I0319 12:10:24.402538 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c479beae-6d34-4d49-81c8-39f6a53ff6ca-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.415579 master-0 kubenswrapper[29612]: I0319 12:10:24.415527 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 12:10:24.423067 master-0 kubenswrapper[29612]: I0319 12:10:24.423019 29612 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-serving-ca\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.434721 master-0 kubenswrapper[29612]: I0319 12:10:24.434668 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 12:10:24.438285 master-0 kubenswrapper[29612]: I0319 12:10:24.438190 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-etcd-client\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.455026 master-0 kubenswrapper[29612]: I0319 12:10:24.454973 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 12:10:24.462508 master-0 kubenswrapper[29612]: I0319 12:10:24.461989 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7af28c70-3ee0-438b-8655-95bc57deaa05-encryption-config\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.474752 master-0 kubenswrapper[29612]: I0319 12:10:24.474704 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 12:10:24.481975 master-0 kubenswrapper[29612]: I0319 12:10:24.481911 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4e7b04dc-e85d-41b8-959c-87b1b24731a1-signing-cabundle\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv"
Mar 19 12:10:24.495707 master-0 kubenswrapper[29612]: I0319 12:10:24.495608 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 12:10:24.497771 master-0 kubenswrapper[29612]: I0319 12:10:24.497610 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-env-overrides\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68"
Mar 19 12:10:24.503044 master-0 kubenswrapper[29612]: I0319 12:10:24.502996 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-systemd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.503141 master-0 kubenswrapper[29612]: I0319 12:10:24.503094 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-systemd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.503281 master-0 kubenswrapper[29612]: I0319 12:10:24.503216 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:24.503281 master-0 kubenswrapper[29612]: I0319 12:10:24.503274 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-conf-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.503358 master-0 kubenswrapper[29612]: I0319 12:10:24.503318 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-netns\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.503391 master-0 kubenswrapper[29612]: I0319 12:10:24.503368 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-netns\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.503435 master-0 kubenswrapper[29612]: I0319 12:10:24.503414 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-conf-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.503470 master-0 kubenswrapper[29612]: I0319 12:10:24.503434 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:24.503503 master-0 kubenswrapper[29612]: I0319 12:10:24.503453 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-data-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:24.503503 master-0 kubenswrapper[29612]: I0319 12:10:24.503485 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-root\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 12:10:24.503503 master-0 kubenswrapper[29612]: I0319 12:10:24.503471 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-root\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 12:10:24.503503 master-0 kubenswrapper[29612]: I0319 12:10:24.503513 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:24.503661 master-0 kubenswrapper[29612]: I0319 12:10:24.503540 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-var-lib-kubelet\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.504043 master-0 kubenswrapper[29612]: I0319 12:10:24.503843 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-node-pullsecrets\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.504043 master-0 kubenswrapper[29612]: I0319 12:10:24.503912 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2"
Mar 19 12:10:24.504043 master-0 kubenswrapper[29612]: I0319 12:10:24.503942 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 12:10:24.504043 master-0 kubenswrapper[29612]: I0319 12:10:24.503972 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-node-pullsecrets\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:24.504043 master-0 kubenswrapper[29612]: I0319 12:10:24.503991 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-os-release\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 12:10:24.504043 master-0 kubenswrapper[29612]: I0319 12:10:24.504040 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2"
Mar 19 12:10:24.504277 master-0 kubenswrapper[29612]: I0319 12:10:24.504047 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 19 12:10:24.504277 master-0 kubenswrapper[29612]: I0319 12:10:24.504084 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.504277 master-0 kubenswrapper[29612]: I0319 12:10:24.504119 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-bin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.504277 master-0 kubenswrapper[29612]: I0319 12:10:24.504138 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.504277 master-0 kubenswrapper[29612]: I0319 12:10:24.504085 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-os-release\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 12:10:24.504277 master-0 kubenswrapper[29612]: I0319 12:10:24.504191 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:24.504277 master-0 kubenswrapper[29612]: I0319 12:10:24.504200 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-var-lib-kubelet\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.504277 master-0 kubenswrapper[29612]: I0319 12:10:24.504244 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:24.504277 master-0 kubenswrapper[29612]: I0319 12:10:24.504274 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-bin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504284 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504321 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504355 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysconfig\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504402 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504399 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504444 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504459 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-docker\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504469 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysconfig\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504474 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5148a50-4332-4ab4-af63-449ef7f16333-hosts-file\") pod \"node-resolver-j2kp7\" (UID: \"f5148a50-4332-4ab4-af63-449ef7f16333\") " pod="openshift-dns/node-resolver-j2kp7"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504501 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504504 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f5148a50-4332-4ab4-af63-449ef7f16333-hosts-file\") pod \"node-resolver-j2kp7\" (UID: \"f5148a50-4332-4ab4-af63-449ef7f16333\") " pod="openshift-dns/node-resolver-j2kp7"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504543 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-bin\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.504571 master-0 kubenswrapper[29612]: I0319 12:10:24.504564 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-bin\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504595 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-resource-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504583 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-host\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504657 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-host\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504662 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-sys\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504694 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-sys\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504739 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-sys\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504766 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504815 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-socket-dir-parent\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504841 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-sys\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504864 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504910 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:24.504980 master-0 kubenswrapper[29612]: I0319 12:10:24.504913 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505004 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505030 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-dir\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505032 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-multus-socket-dir-parent\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505077 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-dir\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505078 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-log-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505093 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505157 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505196 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-ssl-certs\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505224 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e27b7d086edf5d2cf47b703574641d8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"8e27b7d086edf5d2cf47b703574641d8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505261 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-modprobe-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505282 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-log-socket\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.505333 master-0 kubenswrapper[29612]: I0319 12:10:24.505300 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505360 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505370 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-log-socket\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505438 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505466 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-modprobe-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505485 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-d\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505490 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505511 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-systemd-units\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505544 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-conf\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505562 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-run\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505606 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-run\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505622 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505627 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505678 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-ovn\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505734 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-sysctl-conf\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505746 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505764 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-hostroot\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505781 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-ovn\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.505778 master-0 kubenswrapper[29612]: I0319 12:10:24.505678 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-systemd-units\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.505841 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.505858 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-hostroot\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.505890 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.505930 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-os-release\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.505983 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.506011 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-etc-kubernetes\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.506053 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-kubelet\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.506079 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-etc-kubernetes\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.506099 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-os-release\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.506109 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-node-log\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.506122 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-kubelet\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.506150 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-node-log\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.506160 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-containers\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.506252 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-multus-certs\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:24.506335 master-0 kubenswrapper[29612]: I0319 12:10:24.506287 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-etc-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506414 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName:
\"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-kubernetes\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506481 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506500 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506525 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-lib-modules\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506558 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506580 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506606 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b0133cb5-e425-40af-a051-71b2362a89c3-host-etc-kube\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506694 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506770 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506784 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-usr-local-bin\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 19 12:10:24.506808 master-0 kubenswrapper[29612]: I0319 12:10:24.506791 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.506829 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-multus-certs\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.506845 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-cert-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.506870 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-kubernetes\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.506882 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b0133cb5-e425-40af-a051-71b2362a89c3-host-etc-kube\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.506908 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-etc-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.506921 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64136215-8539-4349-977f-6d59deb132ea-host-slash\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.506952 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/64136215-8539-4349-977f-6d59deb132ea-host-slash\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.506956 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/094204df314fe45bd5af12ca1b4622bb-static-pod-dir\") pod \"etcd-master-0\" (UID: \"094204df314fe45bd5af12ca1b4622bb\") " pod="openshift-etcd/etcd-master-0" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.506968 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.507007 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" 
(UniqueName: \"kubernetes.io/host-path/f65be54c-d851-4daa-93a0-8a3947da5e26-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.507079 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-lib-modules\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.507128 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-var-lib-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.507232 master-0 kubenswrapper[29612]: I0319 12:10:24.507186 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-var-lib-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.507605 master-0 kubenswrapper[29612]: I0319 12:10:24.507290 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-netns\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.507605 master-0 kubenswrapper[29612]: I0319 12:10:24.507321 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 12:10:24.507605 master-0 kubenswrapper[29612]: I0319 12:10:24.507521 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-rootfs\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 12:10:24.507605 master-0 kubenswrapper[29612]: I0319 12:10:24.507550 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cnibin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.507605 master-0 kubenswrapper[29612]: I0319 12:10:24.507578 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.507605 master-0 kubenswrapper[29612]: I0319 12:10:24.507598 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-system-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.507825 master-0 kubenswrapper[29612]: I0319 12:10:24.507609 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/60081e3b-4fe5-42a3-bfae-0b07853c0e36-etc-docker\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 12:10:24.507825 master-0 kubenswrapper[29612]: I0319 12:10:24.507667 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-cnibin\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.507825 master-0 kubenswrapper[29612]: I0319 12:10:24.507717 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-wtmp\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:24.507825 master-0 kubenswrapper[29612]: I0319 12:10:24.507730 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-rootfs\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 12:10:24.507825 master-0 kubenswrapper[29612]: I0319 12:10:24.507493 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-run-netns\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.507825 master-0 kubenswrapper[29612]: I0319 
12:10:24.507790 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-system-cni-dir\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.507825 master-0 kubenswrapper[29612]: I0319 12:10:24.507795 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-run-openvswitch\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.507825 master-0 kubenswrapper[29612]: I0319 12:10:24.507794 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-wtmp\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:24.507825 master-0 kubenswrapper[29612]: I0319 12:10:24.507831 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 12:10:24.508096 master-0 kubenswrapper[29612]: I0319 12:10:24.507880 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1249822f86f23526277d165c0d5d3c19-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"1249822f86f23526277d165c0d5d3c19\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 19 12:10:24.508096 master-0 kubenswrapper[29612]: I0319 12:10:24.507956 29612 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.508096 master-0 kubenswrapper[29612]: I0319 12:10:24.507985 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-multus\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.508096 master-0 kubenswrapper[29612]: I0319 12:10:24.508033 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.508096 master-0 kubenswrapper[29612]: I0319 12:10:24.508044 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-slash\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.508096 master-0 kubenswrapper[29612]: I0319 12:10:24.508062 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-cni-multus\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.508096 master-0 kubenswrapper[29612]: I0319 
12:10:24.508069 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cnibin\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.508096 master-0 kubenswrapper[29612]: I0319 12:10:24.508091 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-slash\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.508096 master-0 kubenswrapper[29612]: I0319 12:10:24.508100 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-system-cni-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508115 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-cnibin\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508119 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-k8s-cni-cncf-io\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.511155 master-0 
kubenswrapper[29612]: I0319 12:10:24.508149 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-run-k8s-cni-cncf-io\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508172 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c479beae-6d34-4d49-81c8-39f6a53ff6ca-system-cni-dir\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508212 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-kubelet\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508226 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-host-var-lib-kubelet\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508280 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-systemd\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 
12:10:24.508303 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-audit-dir\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508342 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-netd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508363 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508370 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07eef031-edcc-415c-91d7-f9f3bd0a45e3-etc-systemd\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508386 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 
12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508397 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e617e515-a899-41e8-9961-df68e49da4d3-host-cni-netd\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508428 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7af28c70-3ee0-438b-8655-95bc57deaa05-audit-dir\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508491 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:10:24.511155 master-0 kubenswrapper[29612]: I0319 12:10:24.508519 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/58a4e3af-c5e1-4487-92a2-81dfb696695e-etc-containers\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:24.515980 master-0 kubenswrapper[29612]: I0319 12:10:24.515929 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 12:10:24.522217 master-0 kubenswrapper[29612]: I0319 12:10:24.522163 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" 
(UniqueName: \"kubernetes.io/configmap/51e2c1d1-a50f-4a0d-8273-440e699b3907-ovnkube-identity-cm\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 12:10:24.536249 master-0 kubenswrapper[29612]: I0319 12:10:24.535519 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 12:10:24.547441 master-0 kubenswrapper[29612]: I0319 12:10:24.547363 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51e2c1d1-a50f-4a0d-8273-440e699b3907-webhook-cert\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 12:10:24.557739 master-0 kubenswrapper[29612]: I0319 12:10:24.555141 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 12:10:24.575897 master-0 kubenswrapper[29612]: I0319 12:10:24.575829 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 12:10:24.576682 master-0 kubenswrapper[29612]: I0319 12:10:24.576441 29612 scope.go:117] "RemoveContainer" containerID="79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b" Mar 19 12:10:24.578792 master-0 kubenswrapper[29612]: I0319 12:10:24.578654 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-metrics-certs\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77" Mar 19 12:10:24.593915 master-0 kubenswrapper[29612]: I0319 12:10:24.591346 29612 scope.go:117] "RemoveContainer" 
containerID="76d14e466053de21cd4b44b01526664874f27eff3ad2faef8933296d68c35040" Mar 19 12:10:24.603570 master-0 kubenswrapper[29612]: I0319 12:10:24.603164 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 12:10:24.614615 master-0 kubenswrapper[29612]: I0319 12:10:24.614574 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 12:10:24.636366 master-0 kubenswrapper[29612]: I0319 12:10:24.636174 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 12:10:24.654833 master-0 kubenswrapper[29612]: I0319 12:10:24.654789 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 12:10:24.661339 master-0 kubenswrapper[29612]: I0319 12:10:24.661305 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e617e515-a899-41e8-9961-df68e49da4d3-ovn-node-metrics-cert\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.676521 master-0 kubenswrapper[29612]: I0319 12:10:24.676464 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 12:10:24.683500 master-0 kubenswrapper[29612]: I0319 12:10:24.683249 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovnkube-config\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 12:10:24.683500 master-0 kubenswrapper[29612]: I0319 12:10:24.683378 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-config\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.695211 master-0 kubenswrapper[29612]: I0319 12:10:24.695164 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 12:10:24.715289 master-0 kubenswrapper[29612]: I0319 12:10:24.715183 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 12:10:24.723039 master-0 kubenswrapper[29612]: I0319 12:10:24.722987 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/64136215-8539-4349-977f-6d59deb132ea-iptables-alerter-script\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j" Mar 19 12:10:24.740715 master-0 kubenswrapper[29612]: I0319 12:10:24.740663 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 12:10:24.748761 master-0 kubenswrapper[29612]: I0319 12:10:24.748718 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-env-overrides\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.749199 master-0 kubenswrapper[29612]: I0319 12:10:24.749156 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-env-overrides\") pod 
\"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 12:10:24.754982 master-0 kubenswrapper[29612]: I0319 12:10:24.754943 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 12:10:24.782399 master-0 kubenswrapper[29612]: I0319 12:10:24.780753 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 12:10:24.788255 master-0 kubenswrapper[29612]: I0319 12:10:24.788208 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 12:10:24.796348 master-0 kubenswrapper[29612]: I0319 12:10:24.796304 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 12:10:24.802784 master-0 kubenswrapper[29612]: I0319 12:10:24.802748 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e617e515-a899-41e8-9961-df68e49da4d3-ovnkube-script-lib\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:24.814572 master-0 kubenswrapper[29612]: I0319 12:10:24.814542 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 12:10:24.819953 master-0 kubenswrapper[29612]: I0319 12:10:24.819909 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-trusted-ca-bundle\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.834555 master-0 kubenswrapper[29612]: I0319 12:10:24.834512 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 12:10:24.841382 master-0 kubenswrapper[29612]: I0319 12:10:24.841332 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-client\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.854754 master-0 kubenswrapper[29612]: I0319 12:10:24.854693 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 12:10:24.863136 master-0 kubenswrapper[29612]: I0319 12:10:24.863103 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-serving-cert\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.883035 master-0 kubenswrapper[29612]: I0319 12:10:24.883001 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 12:10:24.892331 master-0 kubenswrapper[29612]: I0319 12:10:24.892296 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-encryption-config\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " 
pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.896270 master-0 kubenswrapper[29612]: I0319 12:10:24.896202 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 12:10:24.916923 master-0 kubenswrapper[29612]: I0319 12:10:24.916825 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 12:10:24.935266 master-0 kubenswrapper[29612]: I0319 12:10:24.935210 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 12:10:24.945577 master-0 kubenswrapper[29612]: I0319 12:10:24.945495 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-audit-policies\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.956330 master-0 kubenswrapper[29612]: I0319 12:10:24.956272 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 12:10:24.964307 master-0 kubenswrapper[29612]: I0319 12:10:24.964250 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-etcd-serving-ca\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:24.975803 master-0 kubenswrapper[29612]: I0319 12:10:24.975679 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 19 12:10:24.996187 master-0 kubenswrapper[29612]: I0319 12:10:24.996130 29612 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 19 12:10:25.015584 master-0 kubenswrapper[29612]: I0319 12:10:25.015530 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 19 12:10:25.023820 master-0 kubenswrapper[29612]: I0319 12:10:25.023772 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/58a4e3af-c5e1-4487-92a2-81dfb696695e-catalogserver-certs\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:25.043145 master-0 kubenswrapper[29612]: I0319 12:10:25.043076 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 19 12:10:25.055439 master-0 kubenswrapper[29612]: I0319 12:10:25.055387 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 12:10:25.066159 master-0 kubenswrapper[29612]: I0319 12:10:25.066099 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-ca-certs\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:25.079022 master-0 kubenswrapper[29612]: I0319 12:10:25.078978 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 19 12:10:25.095328 master-0 kubenswrapper[29612]: I0319 12:10:25.094711 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 19 12:10:25.099710 master-0 kubenswrapper[29612]: I0319 
12:10:25.099660 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-ca-certs\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 12:10:25.115984 master-0 kubenswrapper[29612]: I0319 12:10:25.115609 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 12:10:25.660045 master-0 kubenswrapper[29612]: E0319 12:10:25.659970 29612 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.660619 master-0 kubenswrapper[29612]: E0319 12:10:25.660196 29612 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.660619 master-0 kubenswrapper[29612]: E0319 12:10:25.660250 29612 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.660619 master-0 kubenswrapper[29612]: E0319 12:10:25.660286 29612 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.660619 master-0 kubenswrapper[29612]: E0319 12:10:25.660362 29612 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.660619 master-0 kubenswrapper[29612]: E0319 12:10:25.660403 29612 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition 
Mar 19 12:10:25.661057 master-0 kubenswrapper[29612]: E0319 12:10:25.660984 29612 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.661427 master-0 kubenswrapper[29612]: E0319 12:10:25.661402 29612 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.661583 master-0 kubenswrapper[29612]: E0319 12:10:25.661551 29612 secret.go:189] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.661649 master-0 kubenswrapper[29612]: E0319 12:10:25.661608 29612 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.662149 master-0 kubenswrapper[29612]: E0319 12:10:25.662123 29612 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.663123 master-0 kubenswrapper[29612]: I0319 12:10:25.663099 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 12:10:25.663603 master-0 kubenswrapper[29612]: E0319 12:10:25.663583 29612 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.663959 master-0 kubenswrapper[29612]: E0319 12:10:25.663934 29612 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.664058 master-0 kubenswrapper[29612]: E0319 12:10:25.664039 29612 
configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.664158 master-0 kubenswrapper[29612]: E0319 12:10:25.664132 29612 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.664267 master-0 kubenswrapper[29612]: E0319 12:10:25.664248 29612 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.664429 master-0 kubenswrapper[29612]: E0319 12:10:25.664408 29612 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.664732 master-0 kubenswrapper[29612]: E0319 12:10:25.664704 29612 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.664799 master-0 kubenswrapper[29612]: E0319 12:10:25.664768 29612 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.664844 master-0 kubenswrapper[29612]: E0319 12:10:25.664827 29612 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.664940 master-0 kubenswrapper[29612]: E0319 12:10:25.664918 29612 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.665015 master-0 kubenswrapper[29612]: E0319 12:10:25.664974 29612 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to 
sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.665179 master-0 kubenswrapper[29612]: I0319 12:10:25.665155 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 12:10:25.665247 master-0 kubenswrapper[29612]: E0319 12:10:25.665209 29612 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.665295 master-0 kubenswrapper[29612]: E0319 12:10:25.665268 29612 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.665342 master-0 kubenswrapper[29612]: E0319 12:10:25.665315 29612 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.665402 master-0 kubenswrapper[29612]: E0319 12:10:25.665370 29612 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.665450 master-0 kubenswrapper[29612]: E0319 12:10:25.665427 29612 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.665587 master-0 kubenswrapper[29612]: I0319 12:10:25.665565 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 12:10:25.665760 master-0 kubenswrapper[29612]: E0319 12:10:25.665593 29612 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.665760 master-0 kubenswrapper[29612]: E0319 12:10:25.665696 29612 configmap.go:193] Couldn't get configMap 
openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.665855 master-0 kubenswrapper[29612]: E0319 12:10:25.665782 29612 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.665855 master-0 kubenswrapper[29612]: I0319 12:10:25.665850 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 12:10:25.665938 master-0 kubenswrapper[29612]: E0319 12:10:25.665886 29612 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.666459 master-0 kubenswrapper[29612]: E0319 12:10:25.660301 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f65be54c-d851-4daa-93a0-8a3947da5e26-serving-cert podName:f65be54c-d851-4daa-93a0-8a3947da5e26 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.160279633 +0000 UTC m=+33.474448654 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f65be54c-d851-4daa-93a0-8a3947da5e26-serving-cert") pod "cluster-version-operator-7d58488df-4fpvq" (UID: "f65be54c-d851-4daa-93a0-8a3947da5e26") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.666459 master-0 kubenswrapper[29612]: E0319 12:10:25.664138 29612 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.666459 master-0 kubenswrapper[29612]: E0319 12:10:25.660335 29612 secret.go:189] Couldn't get secret openshift-cluster-storage-operator/cluster-storage-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.666459 master-0 kubenswrapper[29612]: E0319 12:10:25.660319 29612 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.666624 master-0 kubenswrapper[29612]: E0319 12:10:25.666471 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5d009a3-5ff2-49f9-9cea-004b99729b77-cluster-storage-operator-serving-cert podName:a5d009a3-5ff2-49f9-9cea-004b99729b77 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.166434169 +0000 UTC m=+33.480603200 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-storage-operator-serving-cert" (UniqueName: "kubernetes.io/secret/a5d009a3-5ff2-49f9-9cea-004b99729b77-cluster-storage-operator-serving-cert") pod "cluster-storage-operator-7d87854d6-qzxkm" (UID: "a5d009a3-5ff2-49f9-9cea-004b99729b77") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.666624 master-0 kubenswrapper[29612]: E0319 12:10:25.666507 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-tls podName:09282e98-3140-41c9-863f-67e0e9f942cc nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.16649472 +0000 UTC m=+33.480663741 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-tls") pod "openshift-state-metrics-5dc6c74576-npqhj" (UID: "09282e98-3140-41c9-863f-67e0e9f942cc") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.666872 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb274dd5-e98e-4485-a050-7bec560a0daa-proxy-tls podName:eb274dd5-e98e-4485-a050-7bec560a0daa nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.166859431 +0000 UTC m=+33.481028452 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/eb274dd5-e98e-4485-a050-7bec560a0daa-proxy-tls") pod "machine-config-operator-84d549f6d5-ndc5t" (UID: "eb274dd5-e98e-4485-a050-7bec560a0daa") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.666898 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle podName:4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.166889722 +0000 UTC m=+33.481058743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle") pod "metrics-server-dc85487cf-gsmlh" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c") : failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.666927 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b92c4c09-a796-4e00-821a-32fc9c8ca214-metrics-client-ca podName:b92c4c09-a796-4e00-821a-32fc9c8ca214 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.166910562 +0000 UTC m=+33.481079583 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/b92c4c09-a796-4e00-821a-32fc9c8ca214-metrics-client-ca") pod "prometheus-operator-6c8df6d4b-tbjxv" (UID: "b92c4c09-a796-4e00-821a-32fc9c8ca214") : failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.666943 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-serving-cert podName:8b4a2e47-e85a-4474-a1b7-3eb85283dfcf nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.166935123 +0000 UTC m=+33.481104144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-serving-cert") pod "insights-operator-68bf6ff9d6-jn2px" (UID: "8b4a2e47-e85a-4474-a1b7-3eb85283dfcf") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.666963 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcc25a4a-521a-4139-a10c-d5b8bba6e359-webhook-certs podName:bcc25a4a-521a-4139-a10c-d5b8bba6e359 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.166955524 +0000 UTC m=+33.481124545 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bcc25a4a-521a-4139-a10c-d5b8bba6e359-webhook-certs") pod "multus-admission-controller-58c9f8fc64-vc6n8" (UID: "bcc25a4a-521a-4139-a10c-d5b8bba6e359") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.666981 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles podName:4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.166974114 +0000 UTC m=+33.481143135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles") pod "metrics-server-dc85487cf-gsmlh" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c") : failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.666996 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-control-plane-machine-set-operator-tls podName:d52981e0-0bc4-4dfa-81d5-aeab63fa23e0 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.166988514 +0000 UTC m=+33.481157535 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-6f97756bc8-9fvr8" (UID: "d52981e0-0bc4-4dfa-81d5-aeab63fa23e0") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667021 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cloud-credential-operator-serving-cert podName:63e81c6a-b723-4c9a-b471-226fefa9c7a7 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167003865 +0000 UTC m=+33.481172886 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-744f9dbf77-9r2n5" (UID: "63e81c6a-b723-4c9a-b471-226fefa9c7a7") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667038 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-images podName:eb274dd5-e98e-4485-a050-7bec560a0daa nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167030736 +0000 UTC m=+33.481199757 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-images") pod "machine-config-operator-84d549f6d5-ndc5t" (UID: "eb274dd5-e98e-4485-a050-7bec560a0daa") : failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667053 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-node-bootstrap-token podName:a49a8a5e-23fb-46fb-b184-e09da4817028 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167046506 +0000 UTC m=+33.481215527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-node-bootstrap-token") pod "machine-config-server-x66j8" (UID: "a49a8a5e-23fb-46fb-b184-e09da4817028") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667075 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-mcc-auth-proxy-config podName:1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167066707 +0000 UTC m=+33.481235728 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "mcc-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-mcc-auth-proxy-config") pod "machine-config-controller-b4f87c5b9-5sh6j" (UID: "1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6") : failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667089 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-auth-proxy-config podName:81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167082317 +0000 UTC m=+33.481251338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-auth-proxy-config") pod "machine-approver-5c6485487f-b2w5m" (UID: "81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a") : failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667112 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-config podName:81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167097158 +0000 UTC m=+33.481266179 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-config") pod "machine-approver-5c6485487f-b2w5m" (UID: "81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a") : failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667130 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert podName:819c8a76-019b-4124-a67f-fb5838cbec5d nodeName:}" failed. 
No retries permitted until 2026-03-19 12:10:26.167119918 +0000 UTC m=+33.481288939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert") pod "controller-manager-79589db566-gb7tq" (UID: "819c8a76-019b-4124-a67f-fb5838cbec5d") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667149 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config podName:819c8a76-019b-4124-a67f-fb5838cbec5d nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167139269 +0000 UTC m=+33.481308290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config") pod "controller-manager-79589db566-gb7tq" (UID: "819c8a76-019b-4124-a67f-fb5838cbec5d") : failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667179 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-auth-proxy-config podName:e47d3935-d96e-4fd2-abfc-99bff7eac5e0 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.16717111 +0000 UTC m=+33.481340131 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7dff898856-8pnq2" (UID: "e47d3935-d96e-4fd2-abfc-99bff7eac5e0") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667203 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls podName:4ca74427-9df7-44ca-ae91-0162f402644b nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.16719004 +0000 UTC m=+33.481359061 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls") pod "kube-state-metrics-7bbc969446-dbm2x" (UID: "4ca74427-9df7-44ca-ae91-0162f402644b") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667217 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-images podName:fbde1a69-b56c-467e-a291-9a83c448346f nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167211051 +0000 UTC m=+33.481380072 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-images") pod "machine-api-operator-6fbb6cf6f9-c99xj" (UID: "fbde1a69-b56c-467e-a291-9a83c448346f") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667232 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs podName:4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167224071 +0000 UTC m=+33.481393092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs") pod "metrics-server-dc85487cf-gsmlh" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667247 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cco-trusted-ca podName:63e81c6a-b723-4c9a-b471-226fefa9c7a7 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167238182 +0000 UTC m=+33.481407203 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cco-trusted-ca") pod "cloud-credential-operator-744f9dbf77-9r2n5" (UID: "63e81c6a-b723-4c9a-b471-226fefa9c7a7") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667260 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8117f6b0-0b9f-4a1d-9807-bf73c19f388b-tls-certificates podName:8117f6b0-0b9f-4a1d-9807-bf73c19f388b nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167252202 +0000 UTC m=+33.481421223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/8117f6b0-0b9f-4a1d-9807-bf73c19f388b-tls-certificates") pod "prometheus-operator-admission-webhook-69c6b55594-52ww6" (UID: "8117f6b0-0b9f-4a1d-9807-bf73c19f388b") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667274 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0f2278c1-05cf-469c-a13c-762b7a790b38-auth-proxy-config podName:0f2278c1-05cf-469c-a13c-762b7a790b38 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167266022 +0000 UTC m=+33.481435033 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/0f2278c1-05cf-469c-a13c-762b7a790b38-auth-proxy-config") pod "cluster-autoscaler-operator-866dc4744-l2whx" (UID: "0f2278c1-05cf-469c-a13c-762b7a790b38") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667286 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5cf6404-1640-43d3-8018-03c62c8338c5-metrics-tls podName:a5cf6404-1640-43d3-8018-03c62c8338c5 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167280273 +0000 UTC m=+33.481449294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a5cf6404-1640-43d3-8018-03c62c8338c5-metrics-tls") pod "dns-default-pgl92" (UID: "a5cf6404-1640-43d3-8018-03c62c8338c5") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667302 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-service-ca-bundle podName:8b4a2e47-e85a-4474-a1b7-3eb85283dfcf nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167293023 +0000 UTC m=+33.481462044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-service-ca-bundle") pod "insights-operator-68bf6ff9d6-jn2px" (UID: "8b4a2e47-e85a-4474-a1b7-3eb85283dfcf") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667324 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-trusted-ca-bundle podName:8b4a2e47-e85a-4474-a1b7-3eb85283dfcf nodeName:}" failed. 
No retries permitted until 2026-03-19 12:10:26.167310564 +0000 UTC m=+33.481479585 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-trusted-ca-bundle") pod "insights-operator-68bf6ff9d6-jn2px" (UID: "8b4a2e47-e85a-4474-a1b7-3eb85283dfcf") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667339 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config podName:4ca74427-9df7-44ca-ae91-0162f402644b nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167332044 +0000 UTC m=+33.481501065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-7bbc969446-dbm2x" (UID: "4ca74427-9df7-44ca-ae91-0162f402644b") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667354 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles podName:819c8a76-019b-4124-a67f-fb5838cbec5d nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167344945 +0000 UTC m=+33.481513966 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles") pod "controller-manager-79589db566-gb7tq" (UID: "819c8a76-019b-4124-a67f-fb5838cbec5d") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667369 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert podName:12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167360185 +0000 UTC m=+33.481529206 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert") pod "route-controller-manager-6bcb8645f8-5z6xt" (UID: "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667384 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap podName:4ca74427-9df7-44ca-ae91-0162f402644b nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167376546 +0000 UTC m=+33.481545567 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-7bbc969446-dbm2x" (UID: "4ca74427-9df7-44ca-ae91-0162f402644b") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667405 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-metrics-certs podName:46108693-3c9e-4804-beb0-d6b8f8df7625 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167391346 +0000 UTC m=+33.481560367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-metrics-certs") pod "router-default-7dcf5569b5-47tqf" (UID: "46108693-3c9e-4804-beb0-d6b8f8df7625") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667420 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09282e98-3140-41c9-863f-67e0e9f942cc-metrics-client-ca podName:09282e98-3140-41c9-863f-67e0e9f942cc nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167411997 +0000 UTC m=+33.481581018 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/09282e98-3140-41c9-863f-67e0e9f942cc-metrics-client-ca") pod "openshift-state-metrics-5dc6c74576-npqhj" (UID: "09282e98-3140-41c9-863f-67e0e9f942cc") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.660343 29612 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667467 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls podName:5f2d1d03-7427-45a0-a917-173bf9df9902 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167460518 +0000 UTC m=+33.481629539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls") pod "node-exporter-d48mb" (UID: "5f2d1d03-7427-45a0-a917-173bf9df9902") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: I0319 12:10:25.667510 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: I0319 12:10:25.667723 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667849 29612 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667908 29612 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-cloud-controller-manager-operator-tls podName:e47d3935-d96e-4fd2-abfc-99bff7eac5e0 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.16788458 +0000 UTC m=+33.482053601 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7dff898856-8pnq2" (UID: "e47d3935-d96e-4fd2-abfc-99bff7eac5e0") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667925 29612 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667935 29612 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667971 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a185e40-e97b-49e9-aea3-5f170401a9e3-samples-operator-tls podName:3a185e40-e97b-49e9-aea3-5f170401a9e3 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167961382 +0000 UTC m=+33.482130393 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a185e40-e97b-49e9-aea3-5f170401a9e3-samples-operator-tls") pod "cluster-samples-operator-85f7577d78-rbshp" (UID: "3a185e40-e97b-49e9-aea3-5f170401a9e3") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667992 29612 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.667996 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-kube-rbac-proxy-config podName:b92c4c09-a796-4e00-821a-32fc9c8ca214 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.167981213 +0000 UTC m=+33.482150234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-6c8df6d4b-tbjxv" (UID: "b92c4c09-a796-4e00-821a-32fc9c8ca214") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.668039 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config podName:12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.168030044 +0000 UTC m=+33.482199065 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config") pod "route-controller-manager-6bcb8645f8-5z6xt" (UID: "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.668804 29612 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.668859 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-auth-proxy-config podName:eb274dd5-e98e-4485-a050-7bec560a0daa nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.168849258 +0000 UTC m=+33.483018269 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-auth-proxy-config") pod "machine-config-operator-84d549f6d5-ndc5t" (UID: "eb274dd5-e98e-4485-a050-7bec560a0daa") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.668794 master-0 kubenswrapper[29612]: E0319 12:10:25.668865 29612 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: I0319 12:10:25.668885 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-stats-auth\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.668896 29612 configmap.go:193] 
Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.668948 29612 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.668963 29612 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.668989 29612 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669016 29612 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669048 29612 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669084 29612 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669116 29612 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669208 29612 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for 
the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669018 29612 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.668910 29612 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.668913 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-kube-rbac-proxy-config podName:5f2d1d03-7427-45a0-a917-173bf9df9902 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.168903399 +0000 UTC m=+33.483072420 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-kube-rbac-proxy-config") pod "node-exporter-d48mb" (UID: "5f2d1d03-7427-45a0-a917-173bf9df9902") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669331 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-images podName:e47d3935-d96e-4fd2-abfc-99bff7eac5e0 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.169317231 +0000 UTC m=+33.483486252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-images") pod "cluster-cloud-controller-manager-operator-7dff898856-8pnq2" (UID: "e47d3935-d96e-4fd2-abfc-99bff7eac5e0") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669350 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca podName:12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.169339802 +0000 UTC m=+33.483508823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca") pod "route-controller-manager-6bcb8645f8-5z6xt" (UID: "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669378 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a5cf6404-1640-43d3-8018-03c62c8338c5-config-volume podName:a5cf6404-1640-43d3-8018-03c62c8338c5 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.169359692 +0000 UTC m=+33.483528713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/a5cf6404-1640-43d3-8018-03c62c8338c5-config-volume") pod "dns-default-pgl92" (UID: "a5cf6404-1640-43d3-8018-03c62c8338c5") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669396 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-tls podName:b92c4c09-a796-4e00-821a-32fc9c8ca214 nodeName:}" failed. 
No retries permitted until 2026-03-19 12:10:26.169387033 +0000 UTC m=+33.483556054 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-tls") pod "prometheus-operator-6c8df6d4b-tbjxv" (UID: "b92c4c09-a796-4e00-821a-32fc9c8ca214") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669421 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-proxy-tls podName:1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.169409684 +0000 UTC m=+33.483578705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-proxy-tls") pod "machine-config-controller-b4f87c5b9-5sh6j" (UID: "1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669449 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca podName:819c8a76-019b-4124-a67f-fb5838cbec5d nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.169441334 +0000 UTC m=+33.483610365 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca") pod "controller-manager-79589db566-gb7tq" (UID: "819c8a76-019b-4124-a67f-fb5838cbec5d") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669470 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/46108693-3c9e-4804-beb0-d6b8f8df7625-service-ca-bundle podName:46108693-3c9e-4804-beb0-d6b8f8df7625 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.169459695 +0000 UTC m=+33.483628716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/46108693-3c9e-4804-beb0-d6b8f8df7625-service-ca-bundle") pod "router-default-7dcf5569b5-47tqf" (UID: "46108693-3c9e-4804-beb0-d6b8f8df7625") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669499 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57d19c68-b469-4a6f-abe0-3a0eff32639c-cert podName:57d19c68-b469-4a6f-abe0-3a0eff32639c nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.169478535 +0000 UTC m=+33.483647556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57d19c68-b469-4a6f-abe0-3a0eff32639c-cert") pod "ingress-canary-tzfl9" (UID: "57d19c68-b469-4a6f-abe0-3a0eff32639c") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669518 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-mcd-auth-proxy-config podName:8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0 nodeName:}" failed. 
No retries permitted until 2026-03-19 12:10:26.169508736 +0000 UTC m=+33.483677757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-mcd-auth-proxy-config") pod "machine-config-daemon-tmgtz" (UID: "8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669532 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-metrics-client-ca podName:4ca74427-9df7-44ca-ae91-0162f402644b nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.169526107 +0000 UTC m=+33.483695128 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-metrics-client-ca") pod "kube-state-metrics-7bbc969446-dbm2x" (UID: "4ca74427-9df7-44ca-ae91-0162f402644b") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669552 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-config podName:fbde1a69-b56c-467e-a291-9a83c448346f nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.169543787 +0000 UTC m=+33.483712808 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-config") pod "machine-api-operator-6fbb6cf6f9-c99xj" (UID: "fbde1a69-b56c-467e-a291-9a83c448346f") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.668941 29612 configmap.go:193] Couldn't get configMap openshift-cluster-version/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.670821 master-0 kubenswrapper[29612]: E0319 12:10:25.669601 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f65be54c-d851-4daa-93a0-8a3947da5e26-service-ca podName:f65be54c-d851-4daa-93a0-8a3947da5e26 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.169593419 +0000 UTC m=+33.483762440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/f65be54c-d851-4daa-93a0-8a3947da5e26-service-ca") pod "cluster-version-operator-7d58488df-4fpvq" (UID: "f65be54c-d851-4daa-93a0-8a3947da5e26") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.672763 29612 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.672844 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-webhook-cert podName:b69a314d-4884-4f2c-a48e-1cf68c5aaef4 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.172818581 +0000 UTC m=+33.486987612 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-webhook-cert") pod "packageserver-86c8b5dddf-j95tg" (UID: "b69a314d-4884-4f2c-a48e-1cf68c5aaef4") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.673230 29612 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.673292 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-apiservice-cert podName:b69a314d-4884-4f2c-a48e-1cf68c5aaef4 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.173280544 +0000 UTC m=+33.487449565 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-apiservice-cert") pod "packageserver-86c8b5dddf-j95tg" (UID: "b69a314d-4884-4f2c-a48e-1cf68c5aaef4") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.673333 29612 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.673373 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbde1a69-b56c-467e-a291-9a83c448346f-machine-api-operator-tls podName:fbde1a69-b56c-467e-a291-9a83c448346f nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.173365066 +0000 UTC m=+33.487534087 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/fbde1a69-b56c-467e-a291-9a83c448346f-machine-api-operator-tls") pod "machine-api-operator-6fbb6cf6f9-c99xj" (UID: "fbde1a69-b56c-467e-a291-9a83c448346f") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: I0319 12:10:25.673553 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: I0319 12:10:25.673629 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-d7x5g" Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.674199 29612 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.674365 29612 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: I0319 12:10:25.674401 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-default-certificate\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.674368 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-proxy-tls podName:8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.174326624 +0000 UTC m=+33.488495645 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-proxy-tls") pod "machine-config-daemon-tmgtz" (UID: "8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.674464 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-kube-rbac-proxy-config podName:09282e98-3140-41c9-863f-67e0e9f942cc nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.174450367 +0000 UTC m=+33.488619388 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-5dc6c74576-npqhj" (UID: "09282e98-3140-41c9-863f-67e0e9f942cc") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.674488 29612 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.674491 29612 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.674496 29612 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.674551 29612 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting 
for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.674662 29612 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-atris4eemsst: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.675043 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle podName:4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.175027564 +0000 UTC m=+33.489196585 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle") pod "metrics-server-dc85487cf-gsmlh" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.675107 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2278c1-05cf-469c-a13c-762b7a790b38-cert podName:0f2278c1-05cf-469c-a13c-762b7a790b38 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.175082075 +0000 UTC m=+33.489251096 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f2278c1-05cf-469c-a13c-762b7a790b38-cert") pod "cluster-autoscaler-operator-866dc4744-l2whx" (UID: "0f2278c1-05cf-469c-a13c-762b7a790b38") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.675186 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f2d1d03-7427-45a0-a917-173bf9df9902-metrics-client-ca podName:5f2d1d03-7427-45a0-a917-173bf9df9902 nodeName:}" failed. 
No retries permitted until 2026-03-19 12:10:26.175174848 +0000 UTC m=+33.489343859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/5f2d1d03-7427-45a0-a917-173bf9df9902-metrics-client-ca") pod "node-exporter-d48mb" (UID: "5f2d1d03-7427-45a0-a917-173bf9df9902") : failed to sync configmap cache: timed out waiting for the condition Mar 19 12:10:25.675660 master-0 kubenswrapper[29612]: E0319 12:10:25.675242 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls podName:4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.17523259 +0000 UTC m=+33.489401611 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls") pod "metrics-server-dc85487cf-gsmlh" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.676693 master-0 kubenswrapper[29612]: E0319 12:10:25.675951 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-certs podName:a49a8a5e-23fb-46fb-b184-e09da4817028 nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.17593837 +0000 UTC m=+33.490107391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-certs") pod "machine-config-server-x66j8" (UID: "a49a8a5e-23fb-46fb-b184-e09da4817028") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.676693 master-0 kubenswrapper[29612]: E0319 12:10:25.676016 29612 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.676693 master-0 kubenswrapper[29612]: I0319 12:10:25.676120 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 12:10:25.676693 master-0 kubenswrapper[29612]: E0319 12:10:25.676285 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-machine-approver-tls podName:81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a nodeName:}" failed. No retries permitted until 2026-03-19 12:10:26.176251079 +0000 UTC m=+33.490420090 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-machine-approver-tls") pod "machine-approver-5c6485487f-b2w5m" (UID: "81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a") : failed to sync secret cache: timed out waiting for the condition Mar 19 12:10:25.677620 master-0 kubenswrapper[29612]: I0319 12:10:25.677140 29612 request.go:700] Waited for 1.399296321s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-version/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 19 12:10:25.678929 master-0 kubenswrapper[29612]: I0319 12:10:25.678699 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 12:10:25.680525 master-0 kubenswrapper[29612]: I0319 12:10:25.679179 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 12:10:25.680525 master-0 kubenswrapper[29612]: I0319 12:10:25.679531 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4lzgv" Mar 19 12:10:25.680706 master-0 kubenswrapper[29612]: I0319 12:10:25.680684 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 12:10:25.681397 master-0 kubenswrapper[29612]: I0319 12:10:25.681095 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 12:10:25.681397 master-0 kubenswrapper[29612]: I0319 12:10:25.681124 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 12:10:25.681397 master-0 kubenswrapper[29612]: I0319 12:10:25.681232 29612 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 12:10:25.681397 master-0 kubenswrapper[29612]: I0319 12:10:25.681393 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 12:10:25.681723 master-0 kubenswrapper[29612]: I0319 12:10:25.681685 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 12:10:25.684521 master-0 kubenswrapper[29612]: I0319 12:10:25.681822 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-62ztt" Mar 19 12:10:25.684521 master-0 kubenswrapper[29612]: I0319 12:10:25.681954 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 12:10:25.684521 master-0 kubenswrapper[29612]: I0319 12:10:25.682365 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 12:10:25.684521 master-0 kubenswrapper[29612]: I0319 12:10:25.683077 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 19 12:10:25.684521 master-0 kubenswrapper[29612]: I0319 12:10:25.684130 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 12:10:25.684521 master-0 kubenswrapper[29612]: I0319 12:10:25.684184 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 12:10:25.684521 master-0 kubenswrapper[29612]: I0319 12:10:25.684382 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 12:10:25.690740 master-0 kubenswrapper[29612]: I0319 12:10:25.687763 29612 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 12:10:25.690740 master-0 kubenswrapper[29612]: I0319 12:10:25.687959 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 12:10:25.690740 master-0 kubenswrapper[29612]: I0319 12:10:25.690685 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 12:10:25.695067 master-0 kubenswrapper[29612]: I0319 12:10:25.695008 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-bgrvv" Mar 19 12:10:25.715264 master-0 kubenswrapper[29612]: I0319 12:10:25.715196 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-xsv5c" Mar 19 12:10:25.755989 master-0 kubenswrapper[29612]: I0319 12:10:25.755896 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-7v2tf" Mar 19 12:10:25.776932 master-0 kubenswrapper[29612]: I0319 12:10:25.776818 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 12:10:25.797006 master-0 kubenswrapper[29612]: I0319 12:10:25.796908 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 19 12:10:25.816070 master-0 kubenswrapper[29612]: I0319 12:10:25.815956 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-k64d2" Mar 19 12:10:25.836387 master-0 kubenswrapper[29612]: I0319 12:10:25.836301 29612 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-tsgvs" Mar 19 12:10:25.856214 master-0 kubenswrapper[29612]: I0319 12:10:25.856151 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 19 12:10:25.876015 master-0 kubenswrapper[29612]: I0319 12:10:25.875948 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 12:10:25.895750 master-0 kubenswrapper[29612]: I0319 12:10:25.895682 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-xm92n" Mar 19 12:10:25.922329 master-0 kubenswrapper[29612]: I0319 12:10:25.922211 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 19 12:10:25.935385 master-0 kubenswrapper[29612]: I0319 12:10:25.935336 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 19 12:10:25.954557 master-0 kubenswrapper[29612]: I0319 12:10:25.954509 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 19 12:10:25.976305 master-0 kubenswrapper[29612]: I0319 12:10:25.976243 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-gjszs" Mar 19 12:10:25.999852 master-0 kubenswrapper[29612]: I0319 12:10:25.999800 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 19 12:10:26.016258 master-0 kubenswrapper[29612]: I0319 12:10:26.016138 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 12:10:26.035228 master-0 
kubenswrapper[29612]: I0319 12:10:26.035172 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 12:10:26.055237 master-0 kubenswrapper[29612]: I0319 12:10:26.055186 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 12:10:26.076875 master-0 kubenswrapper[29612]: I0319 12:10:26.076833 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-qzhdr" Mar 19 12:10:26.095823 master-0 kubenswrapper[29612]: I0319 12:10:26.095763 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 12:10:26.116050 master-0 kubenswrapper[29612]: I0319 12:10:26.115995 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 12:10:26.135141 master-0 kubenswrapper[29612]: I0319 12:10:26.135071 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 12:10:26.155325 master-0 kubenswrapper[29612]: I0319 12:10:26.155252 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 12:10:26.175199 master-0 kubenswrapper[29612]: I0319 12:10:26.175036 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 12:10:26.178564 master-0 kubenswrapper[29612]: I0319 12:10:26.178517 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " 
pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:26.178677 master-0 kubenswrapper[29612]: I0319 12:10:26.178595 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:26.178770 master-0 kubenswrapper[29612]: I0319 12:10:26.178752 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:26.178823 master-0 kubenswrapper[29612]: I0319 12:10:26.178784 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:26.178946 master-0 kubenswrapper[29612]: I0319 12:10:26.178903 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:26.179121 master-0 kubenswrapper[29612]: I0319 12:10:26.179075 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b92c4c09-a796-4e00-821a-32fc9c8ca214-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" Mar 19 12:10:26.179199 master-0 kubenswrapper[29612]: I0319 12:10:26.179174 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57d19c68-b469-4a6f-abe0-3a0eff32639c-cert\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9" Mar 19 12:10:26.179252 master-0 kubenswrapper[29612]: I0319 12:10:26.179210 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09282e98-3140-41c9-863f-67e0e9f942cc-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 12:10:26.179295 master-0 kubenswrapper[29612]: I0319 12:10:26.179246 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:26.179295 master-0 kubenswrapper[29612]: I0319 12:10:26.179216 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:26.179506 master-0 kubenswrapper[29612]: I0319 12:10:26.179462 29612 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:26.179556 master-0 kubenswrapper[29612]: I0319 12:10:26.179507 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-webhook-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 12:10:26.179556 master-0 kubenswrapper[29612]: I0319 12:10:26.179548 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:26.179663 master-0 kubenswrapper[29612]: I0319 12:10:26.179574 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-apiservice-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 12:10:26.179663 master-0 kubenswrapper[29612]: I0319 12:10:26.179605 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert\") pod 
\"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:10:26.179852 master-0 kubenswrapper[29612]: I0319 12:10:26.179718 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fbde1a69-b56c-467e-a291-9a83c448346f-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 12:10:26.179852 master-0 kubenswrapper[29612]: I0319 12:10:26.179751 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-proxy-tls\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 12:10:26.179852 master-0 kubenswrapper[29612]: I0319 12:10:26.179789 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-auth-proxy-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 12:10:26.179852 master-0 kubenswrapper[29612]: I0319 12:10:26.179819 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-images\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 12:10:26.179852 master-0 kubenswrapper[29612]: I0319 12:10:26.179848 29612 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 12:10:26.180082 master-0 kubenswrapper[29612]: I0319 12:10:26.179859 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:10:26.180082 master-0 kubenswrapper[29612]: I0319 12:10:26.179871 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:26.180082 master-0 kubenswrapper[29612]: I0319 12:10:26.179941 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d009a3-5ff2-49f9-9cea-004b99729b77-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" Mar 19 12:10:26.180082 master-0 kubenswrapper[29612]: I0319 12:10:26.179980 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/fbde1a69-b56c-467e-a291-9a83c448346f-machine-api-operator-tls\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 12:10:26.180312 master-0 kubenswrapper[29612]: I0319 12:10:26.180286 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5d009a3-5ff2-49f9-9cea-004b99729b77-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" Mar 19 12:10:26.180381 master-0 kubenswrapper[29612]: I0319 12:10:26.180321 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 12:10:26.180425 master-0 kubenswrapper[29612]: I0319 12:10:26.180344 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-images\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 12:10:26.180566 master-0 kubenswrapper[29612]: I0319 12:10:26.180499 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f2278c1-05cf-469c-a13c-762b7a790b38-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: 
\"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 12:10:26.180627 master-0 kubenswrapper[29612]: I0319 12:10:26.180589 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 12:10:26.180701 master-0 kubenswrapper[29612]: I0319 12:10:26.180672 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46108693-3c9e-4804-beb0-d6b8f8df7625-service-ca-bundle\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:26.180777 master-0 kubenswrapper[29612]: I0319 12:10:26.180754 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8117f6b0-0b9f-4a1d-9807-bf73c19f388b-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-52ww6\" (UID: \"8117f6b0-0b9f-4a1d-9807-bf73c19f388b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" Mar 19 12:10:26.180839 master-0 kubenswrapper[29612]: I0319 12:10:26.180819 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-certs\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8" Mar 19 12:10:26.180888 master-0 kubenswrapper[29612]: I0319 12:10:26.180866 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-mcd-auth-proxy-config\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 12:10:26.180937 master-0 kubenswrapper[29612]: I0319 12:10:26.180906 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:26.180937 master-0 kubenswrapper[29612]: I0319 12:10:26.180928 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:26.181025 master-0 kubenswrapper[29612]: I0319 12:10:26.180955 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:26.181025 master-0 kubenswrapper[29612]: I0319 12:10:26.181006 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:26.181098 master-0 kubenswrapper[29612]: I0319 12:10:26.181062 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5cf6404-1640-43d3-8018-03c62c8338c5-metrics-tls\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 12:10:26.181144 master-0 kubenswrapper[29612]: I0319 12:10:26.181110 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-serving-cert\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:26.181186 master-0 kubenswrapper[29612]: I0319 12:10:26.181163 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-images\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 12:10:26.181229 master-0 kubenswrapper[29612]: I0319 12:10:26.181192 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:10:26.181229 master-0 kubenswrapper[29612]: 
I0319 12:10:26.181220 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" Mar 19 12:10:26.181305 master-0 kubenswrapper[29612]: I0319 12:10:26.181253 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:26.181305 master-0 kubenswrapper[29612]: I0319 12:10:26.181288 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:26.181373 master-0 kubenswrapper[29612]: I0319 12:10:26.181310 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 12:10:26.181373 master-0 kubenswrapper[29612]: I0319 12:10:26.181331 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/bcc25a4a-521a-4139-a10c-d5b8bba6e359-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: \"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 12:10:26.181373 master-0 kubenswrapper[29612]: I0319 12:10:26.181370 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 12:10:26.181481 master-0 kubenswrapper[29612]: I0319 12:10:26.181458 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb274dd5-e98e-4485-a050-7bec560a0daa-proxy-tls\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 12:10:26.181481 master-0 kubenswrapper[29612]: I0319 12:10:26.181479 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f65be54c-d851-4daa-93a0-8a3947da5e26-service-ca\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" Mar 19 12:10:26.181557 master-0 kubenswrapper[29612]: I0319 12:10:26.181519 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65be54c-d851-4daa-93a0-8a3947da5e26-serving-cert\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " 
pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" Mar 19 12:10:26.181557 master-0 kubenswrapper[29612]: I0319 12:10:26.181542 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-config\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 12:10:26.181673 master-0 kubenswrapper[29612]: I0319 12:10:26.181581 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:26.181774 master-0 kubenswrapper[29612]: I0319 12:10:26.181750 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5cf6404-1640-43d3-8018-03c62c8338c5-config-volume\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 12:10:26.181840 master-0 kubenswrapper[29612]: I0319 12:10:26.181786 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f2d1d03-7427-45a0-a917-173bf9df9902-metrics-client-ca\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:26.181886 master-0 kubenswrapper[29612]: I0319 12:10:26.181852 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-config\") pod 
\"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 12:10:26.181925 master-0 kubenswrapper[29612]: I0319 12:10:26.181900 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-machine-approver-tls\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 12:10:26.181971 master-0 kubenswrapper[29612]: I0319 12:10:26.181946 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a185e40-e97b-49e9-aea3-5f170401a9e3-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" Mar 19 12:10:26.182019 master-0 kubenswrapper[29612]: I0319 12:10:26.181987 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:26.182063 master-0 kubenswrapper[29612]: I0319 12:10:26.182041 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" Mar 19 12:10:26.182112 
master-0 kubenswrapper[29612]: I0319 12:10:26.182090 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-node-bootstrap-token\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8" Mar 19 12:10:26.182178 master-0 kubenswrapper[29612]: I0319 12:10:26.182146 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 12:10:26.182273 master-0 kubenswrapper[29612]: I0319 12:10:26.182238 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" Mar 19 12:10:26.182321 master-0 kubenswrapper[29612]: I0319 12:10:26.182289 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:10:26.182321 master-0 kubenswrapper[29612]: I0319 12:10:26.182316 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 12:10:26.182418 master-0 kubenswrapper[29612]: I0319 12:10:26.182335 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:26.182418 master-0 kubenswrapper[29612]: I0319 12:10:26.182388 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 12:10:26.182498 master-0 kubenswrapper[29612]: I0319 12:10:26.182472 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 12:10:26.182538 master-0 kubenswrapper[29612]: I0319 12:10:26.182514 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2278c1-05cf-469c-a13c-762b7a790b38-cert\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: 
\"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 12:10:26.182697 master-0 kubenswrapper[29612]: I0319 12:10:26.182660 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-metrics-certs\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:26.182774 master-0 kubenswrapper[29612]: I0319 12:10:26.182728 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" Mar 19 12:10:26.182957 master-0 kubenswrapper[29612]: I0319 12:10:26.182906 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:26.183061 master-0 kubenswrapper[29612]: I0319 12:10:26.183036 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f2278c1-05cf-469c-a13c-762b7a790b38-auth-proxy-config\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 12:10:26.183243 master-0 kubenswrapper[29612]: 
I0319 12:10:26.183222 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65be54c-d851-4daa-93a0-8a3947da5e26-serving-cert\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" Mar 19 12:10:26.183347 master-0 kubenswrapper[29612]: I0319 12:10:26.183308 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f65be54c-d851-4daa-93a0-8a3947da5e26-service-ca\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" Mar 19 12:10:26.183411 master-0 kubenswrapper[29612]: I0319 12:10:26.183365 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbde1a69-b56c-467e-a291-9a83c448346f-config\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 12:10:26.184030 master-0 kubenswrapper[29612]: I0319 12:10:26.183987 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a185e40-e97b-49e9-aea3-5f170401a9e3-samples-operator-tls\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" Mar 19 12:10:26.184153 master-0 kubenswrapper[29612]: I0319 12:10:26.184114 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5cf6404-1640-43d3-8018-03c62c8338c5-config-volume\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") 
" pod="openshift-dns/dns-default-pgl92" Mar 19 12:10:26.184202 master-0 kubenswrapper[29612]: I0319 12:10:26.184161 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-mcc-auth-proxy-config\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 12:10:26.184249 master-0 kubenswrapper[29612]: I0319 12:10:26.184187 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 12:10:26.184249 master-0 kubenswrapper[29612]: I0319 12:10:26.184206 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb274dd5-e98e-4485-a050-7bec560a0daa-proxy-tls\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 12:10:26.184335 master-0 kubenswrapper[29612]: I0319 12:10:26.184293 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:10:26.184378 master-0 kubenswrapper[29612]: I0319 12:10:26.184355 29612 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 12:10:26.184497 master-0 kubenswrapper[29612]: I0319 12:10:26.184460 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" Mar 19 12:10:26.184548 master-0 kubenswrapper[29612]: I0319 12:10:26.184509 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8117f6b0-0b9f-4a1d-9807-bf73c19f388b-tls-certificates\") pod \"prometheus-operator-admission-webhook-69c6b55594-52ww6\" (UID: \"8117f6b0-0b9f-4a1d-9807-bf73c19f388b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" Mar 19 12:10:26.184763 master-0 kubenswrapper[29612]: I0319 12:10:26.184719 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46108693-3c9e-4804-beb0-d6b8f8df7625-service-ca-bundle\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:26.184824 master-0 kubenswrapper[29612]: I0319 12:10:26.184734 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-auth-proxy-config\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: 
\"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 12:10:26.184824 master-0 kubenswrapper[29612]: I0319 12:10:26.184803 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" Mar 19 12:10:26.185071 master-0 kubenswrapper[29612]: I0319 12:10:26.185027 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2278c1-05cf-469c-a13c-762b7a790b38-cert\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 12:10:26.185156 master-0 kubenswrapper[29612]: I0319 12:10:26.185123 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-mcd-auth-proxy-config\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" Mar 19 12:10:26.185156 master-0 kubenswrapper[29612]: I0319 12:10:26.185132 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5cf6404-1640-43d3-8018-03c62c8338c5-metrics-tls\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 12:10:26.185362 master-0 kubenswrapper[29612]: I0319 12:10:26.185335 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/46108693-3c9e-4804-beb0-d6b8f8df7625-metrics-certs\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:26.185486 master-0 kubenswrapper[29612]: I0319 12:10:26.185459 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcc25a4a-521a-4139-a10c-d5b8bba6e359-webhook-certs\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: \"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 12:10:26.185651 master-0 kubenswrapper[29612]: I0319 12:10:26.185491 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63e81c6a-b723-4c9a-b471-226fefa9c7a7-cco-trusted-ca\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 12:10:26.195896 master-0 kubenswrapper[29612]: I0319 12:10:26.195838 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 12:10:26.206315 master-0 kubenswrapper[29612]: I0319 12:10:26.206233 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eb274dd5-e98e-4485-a050-7bec560a0daa-images\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: \"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 12:10:26.217234 master-0 kubenswrapper[29612]: I0319 12:10:26.217184 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-2pvrw" Mar 19 12:10:26.235215 master-0 
kubenswrapper[29612]: I0319 12:10:26.235158 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 12:10:26.255534 master-0 kubenswrapper[29612]: I0319 12:10:26.255467 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 12:10:26.274029 master-0 kubenswrapper[29612]: I0319 12:10:26.273982 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9ww6r" Mar 19 12:10:26.294935 master-0 kubenswrapper[29612]: I0319 12:10:26.294773 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 19 12:10:26.305005 master-0 kubenswrapper[29612]: I0319 12:10:26.304946 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-service-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:26.314457 master-0 kubenswrapper[29612]: I0319 12:10:26.314398 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 19 12:10:26.314781 master-0 kubenswrapper[29612]: I0319 12:10:26.314705 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-serving-cert\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:26.334682 master-0 kubenswrapper[29612]: I0319 12:10:26.334558 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-v6j8n" Mar 19 
12:10:26.355933 master-0 kubenswrapper[29612]: I0319 12:10:26.355856 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 19 12:10:26.374742 master-0 kubenswrapper[29612]: I0319 12:10:26.374248 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-zgw2m"
Mar 19 12:10:26.395479 master-0 kubenswrapper[29612]: I0319 12:10:26.395413 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 12:10:26.401000 master-0 kubenswrapper[29612]: I0319 12:10:26.400946 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-webhook-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg"
Mar 19 12:10:26.401242 master-0 kubenswrapper[29612]: I0319 12:10:26.401193 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-apiservice-cert\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg"
Mar 19 12:10:26.415954 master-0 kubenswrapper[29612]: I0319 12:10:26.415891 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-pshw6"
Mar 19 12:10:26.435719 master-0 kubenswrapper[29612]: I0319 12:10:26.435545 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 12:10:26.440786 master-0 kubenswrapper[29612]: I0319 12:10:26.440743 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-proxy-tls\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz"
Mar 19 12:10:26.463647 master-0 kubenswrapper[29612]: I0319 12:10:26.463575 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 19 12:10:26.470664 master-0 kubenswrapper[29612]: I0319 12:10:26.470566 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-trusted-ca-bundle\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: \"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px"
Mar 19 12:10:26.495382 master-0 kubenswrapper[29612]: I0319 12:10:26.495300 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-h5fcf"
Mar 19 12:10:26.515857 master-0 kubenswrapper[29612]: I0319 12:10:26.515800 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 12:10:26.523875 master-0 kubenswrapper[29612]: I0319 12:10:26.523782 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-auth-proxy-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m"
Mar 19 12:10:26.535148 master-0 kubenswrapper[29612]: I0319 12:10:26.535083 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 19 12:10:26.556490 master-0 kubenswrapper[29612]: I0319 12:10:26.556426 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 12:10:26.564516 master-0 kubenswrapper[29612]: I0319 12:10:26.564427 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-machine-approver-tls\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m"
Mar 19 12:10:26.576015 master-0 kubenswrapper[29612]: I0319 12:10:26.575923 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 12:10:26.595815 master-0 kubenswrapper[29612]: I0319 12:10:26.595722 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 12:10:26.604219 master-0 kubenswrapper[29612]: I0319 12:10:26.604141 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-config\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m"
Mar 19 12:10:26.617057 master-0 kubenswrapper[29612]: I0319 12:10:26.616960 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-8zr85"
Mar 19 12:10:26.636211 master-0 kubenswrapper[29612]: I0319 12:10:26.636136 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 19 12:10:26.643303 master-0 kubenswrapper[29612]: I0319 12:10:26.643235 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2"
Mar 19 12:10:26.655987 master-0 kubenswrapper[29612]: I0319 12:10:26.655924 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 12:10:26.665309 master-0 kubenswrapper[29612]: I0319 12:10:26.665193 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2"
Mar 19 12:10:26.673122 master-0 kubenswrapper[29612]: I0319 12:10:26.673031 29612 request.go:700] Waited for 2.358637703s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Mar 19 12:10:26.675052 master-0 kubenswrapper[29612]: I0319 12:10:26.675006 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 12:10:26.695931 master-0 kubenswrapper[29612]: I0319 12:10:26.695788 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 19 12:10:26.701857 master-0 kubenswrapper[29612]: I0319 12:10:26.701742 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-images\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2"
Mar 19 12:10:26.705297 master-0 kubenswrapper[29612]: I0319 12:10:26.705237 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-check-endpoints/0.log"
Mar 19 12:10:26.711159 master-0 kubenswrapper[29612]: I0319 12:10:26.711090 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log"
Mar 19 12:10:26.715086 master-0 kubenswrapper[29612]: I0319 12:10:26.715053 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/0.log"
Mar 19 12:10:26.715348 master-0 kubenswrapper[29612]: I0319 12:10:26.715126 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 12:10:26.717292 master-0 kubenswrapper[29612]: I0319 12:10:26.716121 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log"
Mar 19 12:10:26.734962 master-0 kubenswrapper[29612]: I0319 12:10:26.734898 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:10:26.756071 master-0 kubenswrapper[29612]: I0319 12:10:26.755968 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 12:10:26.776091 master-0 kubenswrapper[29612]: I0319 12:10:26.776010 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 12:10:26.786036 master-0 kubenswrapper[29612]: I0319 12:10:26.785995 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-proxy-tls\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j"
Mar 19 12:10:26.795744 master-0 kubenswrapper[29612]: I0319 12:10:26.795691 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-kl9xx"
Mar 19 12:10:26.815310 master-0 kubenswrapper[29612]: I0319 12:10:26.815244 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 12:10:26.820344 master-0 kubenswrapper[29612]: I0319 12:10:26.820300 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57d19c68-b469-4a6f-abe0-3a0eff32639c-cert\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9"
Mar 19 12:10:26.834826 master-0 kubenswrapper[29612]: I0319 12:10:26.834778 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 12:10:26.855561 master-0 kubenswrapper[29612]: I0319 12:10:26.855506 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 19 12:10:26.860328 master-0 kubenswrapper[29612]: I0319 12:10:26.860280 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09282e98-3140-41c9-863f-67e0e9f942cc-metrics-client-ca\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj"
Mar 19 12:10:26.860488 master-0 kubenswrapper[29612]: I0319 12:10:26.860424 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b92c4c09-a796-4e00-821a-32fc9c8ca214-metrics-client-ca\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 12:10:26.860580 master-0 kubenswrapper[29612]: I0319 12:10:26.860424 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-metrics-client-ca\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x"
Mar 19 12:10:26.864590 master-0 kubenswrapper[29612]: I0319 12:10:26.864541 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f2d1d03-7427-45a0-a917-173bf9df9902-metrics-client-ca\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 12:10:26.877587 master-0 kubenswrapper[29612]: I0319 12:10:26.877519 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-56gmj"
Mar 19 12:10:26.895348 master-0 kubenswrapper[29612]: I0319 12:10:26.895283 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 19 12:10:26.905827 master-0 kubenswrapper[29612]: I0319 12:10:26.905779 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 12:10:26.915072 master-0 kubenswrapper[29612]: I0319 12:10:26.914882 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 12:10:26.915766 master-0 kubenswrapper[29612]: I0319 12:10:26.915731 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-certs\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 12:10:26.935974 master-0 kubenswrapper[29612]: I0319 12:10:26.935899 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-f27d9"
Mar 19 12:10:26.955603 master-0 kubenswrapper[29612]: I0319 12:10:26.955492 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 12:10:26.966583 master-0 kubenswrapper[29612]: I0319 12:10:26.966524 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a49a8a5e-23fb-46fb-b184-e09da4817028-node-bootstrap-token\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8"
Mar 19 12:10:26.975463 master-0 kubenswrapper[29612]: I0319 12:10:26.975396 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-nghrc"
Mar 19 12:10:26.995881 master-0 kubenswrapper[29612]: I0319 12:10:26.995508 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-2dl2n"
Mar 19 12:10:27.016033 master-0 kubenswrapper[29612]: I0319 12:10:27.015734 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 12:10:27.026360 master-0 kubenswrapper[29612]: I0319 12:10:27.026058 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b92c4c09-a796-4e00-821a-32fc9c8ca214-prometheus-operator-tls\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv"
Mar 19 12:10:27.041491 master-0 kubenswrapper[29612]: I0319 12:10:27.038003 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 19 12:10:27.042388 master-0 kubenswrapper[29612]: I0319 12:10:27.042332 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-tls\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 12:10:27.055680 master-0 kubenswrapper[29612]: I0319 12:10:27.055595 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 19 12:10:27.064613 master-0 kubenswrapper[29612]: I0319 12:10:27.064560 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f2d1d03-7427-45a0-a917-173bf9df9902-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb"
Mar 19 12:10:27.074755 master-0 kubenswrapper[29612]: I0319 12:10:27.074716 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 19 12:10:27.080788 master-0 kubenswrapper[29612]: I0319 12:10:27.080733 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj"
Mar 19 12:10:27.095947 master-0 kubenswrapper[29612]: I0319 12:10:27.095885 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 19 12:10:27.105557 master-0 kubenswrapper[29612]: I0319 12:10:27.105505 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09282e98-3140-41c9-863f-67e0e9f942cc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj"
Mar 19 12:10:27.116599 master-0 kubenswrapper[29612]: I0319 12:10:27.116565 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-nkt4p"
Mar 19 12:10:27.135685 master-0 kubenswrapper[29612]: I0319 12:10:27.135616 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 19 12:10:27.146312 master-0 kubenswrapper[29612]: I0319 12:10:27.146261 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:10:27.155573 master-0 kubenswrapper[29612]: I0319 12:10:27.155284 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 19 12:10:27.161581 master-0 kubenswrapper[29612]: I0319 12:10:27.161512 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:10:27.176903 master-0 kubenswrapper[29612]: I0319 12:10:27.176858 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 19 12:10:27.180485 master-0 kubenswrapper[29612]: E0319 12:10:27.180464 29612 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:27.180694 master-0 kubenswrapper[29612]: E0319 12:10:27.180680 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config podName:4ca74427-9df7-44ca-ae91-0162f402644b nodeName:}" failed. No retries permitted until 2026-03-19 12:10:28.180652487 +0000 UTC m=+35.494821508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-7bbc969446-dbm2x" (UID: "4ca74427-9df7-44ca-ae91-0162f402644b") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:27.184053 master-0 kubenswrapper[29612]: E0319 12:10:27.183980 29612 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:27.184149 master-0 kubenswrapper[29612]: E0319 12:10:27.184118 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle podName:4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c nodeName:}" failed. No retries permitted until 2026-03-19 12:10:28.184089985 +0000 UTC m=+35.498259046 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle") pod "metrics-server-dc85487cf-gsmlh" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:27.184202 master-0 kubenswrapper[29612]: E0319 12:10:27.184175 29612 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-atris4eemsst: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:27.184266 master-0 kubenswrapper[29612]: E0319 12:10:27.184241 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle podName:4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c nodeName:}" failed. No retries permitted until 2026-03-19 12:10:28.184217319 +0000 UTC m=+35.498386380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle") pod "metrics-server-dc85487cf-gsmlh" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:27.184315 master-0 kubenswrapper[29612]: I0319 12:10:27.183987 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:10:27.185292 master-0 kubenswrapper[29612]: E0319 12:10:27.185248 29612 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:27.185363 master-0 kubenswrapper[29612]: E0319 12:10:27.185349 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls podName:4ca74427-9df7-44ca-ae91-0162f402644b nodeName:}" failed. No retries permitted until 2026-03-19 12:10:28.185331171 +0000 UTC m=+35.499500222 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls") pod "kube-state-metrics-7bbc969446-dbm2x" (UID: "4ca74427-9df7-44ca-ae91-0162f402644b") : failed to sync secret cache: timed out waiting for the condition
Mar 19 12:10:27.185413 master-0 kubenswrapper[29612]: E0319 12:10:27.185355 29612 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:27.185460 master-0 kubenswrapper[29612]: E0319 12:10:27.185433 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap podName:4ca74427-9df7-44ca-ae91-0162f402644b nodeName:}" failed. No retries permitted until 2026-03-19 12:10:28.185409973 +0000 UTC m=+35.499579034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-7bbc969446-dbm2x" (UID: "4ca74427-9df7-44ca-ae91-0162f402644b") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 12:10:27.196443 master-0 kubenswrapper[29612]: I0319 12:10:27.196373 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-atris4eemsst"
Mar 19 12:10:27.217118 master-0 kubenswrapper[29612]: I0319 12:10:27.216937 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-g78hg"
Mar 19 12:10:27.249477 master-0 kubenswrapper[29612]: I0319 12:10:27.249382 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 19 12:10:27.255096 master-0 kubenswrapper[29612]: I0319 12:10:27.255045 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-wn2jn"
Mar 19 12:10:27.274340 master-0 kubenswrapper[29612]: I0319 12:10:27.274273 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 19 12:10:27.295761 master-0 kubenswrapper[29612]: I0319 12:10:27.295707 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 19 12:10:27.315196 master-0 kubenswrapper[29612]: I0319 12:10:27.315106 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 12:10:27.599873 master-0 kubenswrapper[29612]: I0319 12:10:27.599756 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7bl2\" (UniqueName: \"kubernetes.io/projected/d979be56-f948-4252-ab6b-9bfc48dc4187-kube-api-access-w7bl2\") pod \"redhat-operators-6ql8d\" (UID: \"d979be56-f948-4252-ab6b-9bfc48dc4187\") " pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 12:10:27.673927 master-0 kubenswrapper[29612]: I0319 12:10:27.673867 29612 request.go:700] Waited for 3.280668441s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token
Mar 19 12:10:27.686728 master-0 kubenswrapper[29612]: I0319 12:10:27.686673 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwmkf\" (UniqueName: \"kubernetes.io/projected/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-kube-api-access-nwmkf\") pod \"route-controller-manager-6bcb8645f8-5z6xt\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") " pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 12:10:27.687391 master-0 kubenswrapper[29612]: I0319 12:10:27.687356 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4ms4\" (UniqueName: \"kubernetes.io/projected/b0133cb5-e425-40af-a051-71b2362a89c3-kube-api-access-w4ms4\") pod \"network-operator-7bd846bfc4-5qtx7\" (UID: \"b0133cb5-e425-40af-a051-71b2362a89c3\") " pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7"
Mar 19 12:10:27.688132 master-0 kubenswrapper[29612]: I0319 12:10:27.688093 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgtrh\" (UniqueName: \"kubernetes.io/projected/64136215-8539-4349-977f-6d59deb132ea-kube-api-access-cgtrh\") pod \"iptables-alerter-j6n2j\" (UID: \"64136215-8539-4349-977f-6d59deb132ea\") " pod="openshift-network-operator/iptables-alerter-j6n2j"
Mar 19 12:10:27.694189 master-0 kubenswrapper[29612]: I0319 12:10:27.694147 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgr4k\" (UniqueName: \"kubernetes.io/projected/3e1b71a7-c9fd-4f03-8d0c-f092dac80989-kube-api-access-mgr4k\") pod \"network-metrics-daemon-nkd77\" (UID: \"3e1b71a7-c9fd-4f03-8d0c-f092dac80989\") " pod="openshift-multus/network-metrics-daemon-nkd77"
Mar 19 12:10:27.698332 master-0 kubenswrapper[29612]: E0319 12:10:27.698279 29612 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.373s"
Mar 19 12:10:27.698438 master-0 kubenswrapper[29612]: I0319 12:10:27.698346 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:27.698438 master-0 kubenswrapper[29612]: I0319 12:10:27.698367 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:27.698438 master-0 kubenswrapper[29612]: I0319 12:10:27.698379 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" event={"ID":"7af28c70-3ee0-438b-8655-95bc57deaa05","Type":"ContainerStarted","Data":"5e0c0c445c254e95b1792fac91e1a62fba7b29c5a016186d5923ee9ea180aac4"}
Mar 19 12:10:27.698438 master-0 kubenswrapper[29612]: I0319 12:10:27.698411 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:10:27.698568 master-0 kubenswrapper[29612]: I0319 12:10:27.698465 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:27.698568 master-0 kubenswrapper[29612]: I0319 12:10:27.698481 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:27.698568 master-0 kubenswrapper[29612]: I0319 12:10:27.698493 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 12:10:27.698568 master-0 kubenswrapper[29612]: I0319 12:10:27.698510 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" event={"ID":"7af28c70-3ee0-438b-8655-95bc57deaa05","Type":"ContainerDied","Data":"49a4f50ee477671b0b5a448101a3e5ff8df04706fa3923595f4f02465bdb07e9"}
Mar 19 12:10:27.698708 master-0 kubenswrapper[29612]: I0319 12:10:27.698585 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:27.705750 master-0 kubenswrapper[29612]: I0319 12:10:27.705711 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac946049-4b93-4d8c-bfa6-eacf83160ed1-kube-api-access\") pod \"kube-controller-manager-operator-ff989d6cc-8xkv5\" (UID: \"ac946049-4b93-4d8c-bfa6-eacf83160ed1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5"
Mar 19 12:10:27.706194 master-0 kubenswrapper[29612]: I0319 12:10:27.706156 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64sr2\" (UniqueName: \"kubernetes.io/projected/819c8a76-019b-4124-a67f-fb5838cbec5d-kube-api-access-64sr2\") pod \"controller-manager-79589db566-gb7tq\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") " pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 12:10:27.707969 master-0 kubenswrapper[29612]: I0319 12:10:27.707916 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7qsg\" (UniqueName: \"kubernetes.io/projected/33f6e54f-a750-4c91-b4f5-049275d33cfb-kube-api-access-f7qsg\") pod \"dns-operator-9c5679d8f-m5dt4\" (UID: \"33f6e54f-a750-4c91-b4f5-049275d33cfb\") " pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4"
Mar 19 12:10:27.709400 master-0 kubenswrapper[29612]: I0319 12:10:27.709347 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 19 12:10:27.713829 master-0 kubenswrapper[29612]: I0319 12:10:27.713788 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-bound-sa-token\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg"
Mar 19 12:10:27.719426 master-0 kubenswrapper[29612]: I0319 12:10:27.719379 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4khz\" (UniqueName: \"kubernetes.io/projected/a5d009a3-5ff2-49f9-9cea-004b99729b77-kube-api-access-z4khz\") pod \"cluster-storage-operator-7d87854d6-qzxkm\" (UID: \"a5d009a3-5ff2-49f9-9cea-004b99729b77\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm"
Mar 19 12:10:27.723407 master-0 kubenswrapper[29612]: I0319 12:10:27.723364 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frf6\" (UniqueName: \"kubernetes.io/projected/07eef031-edcc-415c-91d7-f9f3bd0a45e3-kube-api-access-6frf6\") pod \"tuned-9w65f\" (UID: \"07eef031-edcc-415c-91d7-f9f3bd0a45e3\") " pod="openshift-cluster-node-tuning-operator/tuned-9w65f"
Mar 19 12:10:27.727851 master-0 kubenswrapper[29612]: I0319 12:10:27.727805 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd74j\" (UniqueName: \"kubernetes.io/projected/24bd6a9b-9934-4cf9-b591-1ab892c8014c-kube-api-access-jd74j\") pod \"certified-operators-4h84v\" (UID: \"24bd6a9b-9934-4cf9-b591-1ab892c8014c\") " pod="openshift-marketplace/certified-operators-4h84v"
Mar 19 12:10:27.729945 master-0 kubenswrapper[29612]: I0319 12:10:27.729905 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h95pm\" (UniqueName: \"kubernetes.io/projected/162b513f-2f67-4946-816b-48f6232dfca0-kube-api-access-h95pm\") pod \"authentication-operator-5885bfd7f4-vt498\" (UID: \"162b513f-2f67-4946-816b-48f6232dfca0\") " pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498"
Mar 19 12:10:27.730849 master-0 kubenswrapper[29612]: I0319 12:10:27.730814 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxjkt\" (UniqueName: \"kubernetes.io/projected/0c804ea0-3483-4506-b138-3bf30016d8d8-kube-api-access-pxjkt\") pod \"csi-snapshot-controller-64854d9cff-6dlpw\" (UID: \"0c804ea0-3483-4506-b138-3bf30016d8d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw"
Mar 19 12:10:27.738310 master-0 kubenswrapper[29612]: I0319 12:10:27.738261 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5cs\" (UniqueName: \"kubernetes.io/projected/4d790b42-2182-4b9c-8391-285e938715cc-kube-api-access-zh5cs\") pod \"migrator-8487694857-52lv2\" (UID: \"4d790b42-2182-4b9c-8391-285e938715cc\") " pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2"
Mar 19 12:10:27.738757 master-0 kubenswrapper[29612]: I0319 12:10:27.738554 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69pp5\" (UniqueName: \"kubernetes.io/projected/4ca74427-9df7-44ca-ae91-0162f402644b-kube-api-access-69pp5\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x"
Mar 19 12:10:27.742949 master-0 kubenswrapper[29612]: I0319 12:10:27.742909 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t85jj\" (UniqueName: \"kubernetes.io/projected/e47d3935-d96e-4fd2-abfc-99bff7eac5e0-kube-api-access-t85jj\") pod \"cluster-cloud-controller-manager-operator-7dff898856-8pnq2\" (UID: \"e47d3935-d96e-4fd2-abfc-99bff7eac5e0\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2"
Mar 19 12:10:27.743207 master-0 kubenswrapper[29612]: I0319 12:10:27.743175 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6rhj\" (UniqueName: \"kubernetes.io/projected/94fa5656-0561-45ab-9ab1-a6b96b8a47d4-kube-api-access-m6rhj\") pod \"package-server-manager-7b95f86987-ftz5b\" (UID: \"94fa5656-0561-45ab-9ab1-a6b96b8a47d4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 12:10:27.860615 master-0 kubenswrapper[29612]: I0319 12:10:27.859869 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98qv\" (UniqueName: \"kubernetes.io/projected/5b46c6b4-f701-4a7f-bf74-90afe14ad89a-kube-api-access-s98qv\") pod \"olm-operator-5c9796789-jsp5b\" (UID: \"5b46c6b4-f701-4a7f-bf74-90afe14ad89a\") " pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 12:10:27.917386 master-0 kubenswrapper[29612]: I0319 12:10:27.916675 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v4xb\" (UniqueName: \"kubernetes.io/projected/3a185e40-e97b-49e9-aea3-5f170401a9e3-kube-api-access-2v4xb\") pod \"cluster-samples-operator-85f7577d78-rbshp\" (UID: \"3a185e40-e97b-49e9-aea3-5f170401a9e3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp"
Mar 19 12:10:27.949675 master-0 kubenswrapper[29612]: I0319 12:10:27.946877 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f99wt\" (UniqueName: \"kubernetes.io/projected/6d017409-a362-4e58-87af-d02b3834fdd4-kube-api-access-f99wt\") pod \"kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv\" (UID: \"6d017409-a362-4e58-87af-d02b3834fdd4\") "
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" Mar 19 12:10:27.954990 master-0 kubenswrapper[29612]: I0319 12:10:27.954954 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m4gg\" (UniqueName: \"kubernetes.io/projected/7af28c70-3ee0-438b-8655-95bc57deaa05-kube-api-access-2m4gg\") pod \"apiserver-66dcf44d9d-46w4r\" (UID: \"7af28c70-3ee0-438b-8655-95bc57deaa05\") " pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:27.955337 master-0 kubenswrapper[29612]: I0319 12:10:27.955290 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4qb9\" (UniqueName: \"kubernetes.io/projected/f38df702-a256-4f83-90bb-0c0e477a2ce6-kube-api-access-g4qb9\") pod \"ingress-operator-66b84d69b-59ptg\" (UID: \"f38df702-a256-4f83-90bb-0c0e477a2ce6\") " pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" Mar 19 12:10:27.955952 master-0 kubenswrapper[29612]: I0319 12:10:27.955908 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5z4r\" (UniqueName: \"kubernetes.io/projected/a49a8a5e-23fb-46fb-b184-e09da4817028-kube-api-access-k5z4r\") pod \"machine-config-server-x66j8\" (UID: \"a49a8a5e-23fb-46fb-b184-e09da4817028\") " pod="openshift-machine-config-operator/machine-config-server-x66j8" Mar 19 12:10:27.959660 master-0 kubenswrapper[29612]: I0319 12:10:27.956285 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdmj8\" (UniqueName: \"kubernetes.io/projected/6e8c4676-05af-4478-a44b-1bdcbefdea0f-kube-api-access-pdmj8\") pod \"catalog-operator-68f85b4d6c-qldxf\" (UID: \"6e8c4676-05af-4478-a44b-1bdcbefdea0f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 12:10:27.959660 master-0 kubenswrapper[29612]: I0319 12:10:27.956776 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wgsm4\" (UniqueName: \"kubernetes.io/projected/46108693-3c9e-4804-beb0-d6b8f8df7625-kube-api-access-wgsm4\") pod \"router-default-7dcf5569b5-47tqf\" (UID: \"46108693-3c9e-4804-beb0-d6b8f8df7625\") " pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:27.959660 master-0 kubenswrapper[29612]: I0319 12:10:27.957135 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r97ml\" (UniqueName: \"kubernetes.io/projected/63e81c6a-b723-4c9a-b471-226fefa9c7a7-kube-api-access-r97ml\") pod \"cloud-credential-operator-744f9dbf77-9r2n5\" (UID: \"63e81c6a-b723-4c9a-b471-226fefa9c7a7\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" Mar 19 12:10:27.960136 master-0 kubenswrapper[29612]: I0319 12:10:27.960112 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstrq\" (UniqueName: \"kubernetes.io/projected/f93ee7db-f0ed-470f-a458-86c54e6d064a-kube-api-access-jstrq\") pod \"network-check-source-b4bf74f6-5jjzp\" (UID: \"f93ee7db-f0ed-470f-a458-86c54e6d064a\") " pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" Mar 19 12:10:27.962558 master-0 kubenswrapper[29612]: I0319 12:10:27.962512 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvp77\" (UniqueName: \"kubernetes.io/projected/d52981e0-0bc4-4dfa-81d5-aeab63fa23e0-kube-api-access-rvp77\") pod \"control-plane-machine-set-operator-6f97756bc8-9fvr8\" (UID: \"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0\") " pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" Mar 19 12:10:28.119858 master-0 kubenswrapper[29612]: I0319 12:10:28.119752 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7dp9\" (UniqueName: \"kubernetes.io/projected/50adfdd3-1d4d-47d2-a35b-cd028d66b4c6-kube-api-access-f7dp9\") pod \"cluster-olm-operator-67dcd4998-54pv6\" (UID: 
\"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" Mar 19 12:10:28.123223 master-0 kubenswrapper[29612]: I0319 12:10:28.123176 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssz9\" (UniqueName: \"kubernetes.io/projected/4705c8f3-89d4-42f2-ad14-e591c5380b9e-kube-api-access-bssz9\") pod \"openshift-config-operator-95bf4f4d-9bqjr\" (UID: \"4705c8f3-89d4-42f2-ad14-e591c5380b9e\") " pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 12:10:28.124365 master-0 kubenswrapper[29612]: I0319 12:10:28.124321 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ntj9\" (UniqueName: \"kubernetes.io/projected/66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e-kube-api-access-8ntj9\") pod \"apiserver-5c7d7c6bb8-gnz2z\" (UID: \"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e\") " pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:28.125810 master-0 kubenswrapper[29612]: I0319 12:10:28.125785 29612 scope.go:117] "RemoveContainer" containerID="2585e3c8a21fe77ef0a54c935bf49a872dc17099314a87be4e929dac1b0cd111" Mar 19 12:10:28.127455 master-0 kubenswrapper[29612]: I0319 12:10:28.127425 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctghm\" (UniqueName: \"kubernetes.io/projected/60081e3b-4fe5-42a3-bfae-0b07853c0e36-kube-api-access-ctghm\") pod \"operator-controller-controller-manager-57777556ff-vxsl4\" (UID: \"60081e3b-4fe5-42a3-bfae-0b07853c0e36\") " pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" Mar 19 12:10:28.128493 master-0 kubenswrapper[29612]: I0319 12:10:28.128456 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fmb\" (UniqueName: \"kubernetes.io/projected/39f479b3-45ff-43f9-ae85-083554a54918-kube-api-access-79fmb\") pod \"marketplace-operator-89ccd998f-4h6xn\" (UID: 
\"39f479b3-45ff-43f9-ae85-083554a54918\") " pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 12:10:28.136725 master-0 kubenswrapper[29612]: I0319 12:10:28.136676 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwp24\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-kube-api-access-bwp24\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 12:10:28.141384 master-0 kubenswrapper[29612]: I0319 12:10:28.141354 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h4cz\" (UniqueName: \"kubernetes.io/projected/4e7b04dc-e85d-41b8-959c-87b1b24731a1-kube-api-access-9h4cz\") pod \"service-ca-79bc6b8d76-pdjtv\" (UID: \"4e7b04dc-e85d-41b8-959c-87b1b24731a1\") " pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" Mar 19 12:10:28.148669 master-0 kubenswrapper[29612]: I0319 12:10:28.146832 29612 scope.go:117] "RemoveContainer" containerID="302da25d88feed044717793a912a25df7753b2c33b9c9627185444f708131b30" Mar 19 12:10:28.154670 master-0 kubenswrapper[29612]: I0319 12:10:28.153147 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9vzl\" (UniqueName: \"kubernetes.io/projected/84e5577e-df8b-46a2-aff7-0bf90b5010d9-kube-api-access-k9vzl\") pod \"openshift-apiserver-operator-d65958b8-ptjr9\" (UID: \"84e5577e-df8b-46a2-aff7-0bf90b5010d9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" Mar 19 12:10:28.154670 master-0 kubenswrapper[29612]: I0319 12:10:28.153760 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvpz\" (UniqueName: \"kubernetes.io/projected/eb274dd5-e98e-4485-a050-7bec560a0daa-kube-api-access-jsvpz\") pod \"machine-config-operator-84d549f6d5-ndc5t\" (UID: 
\"eb274dd5-e98e-4485-a050-7bec560a0daa\") " pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" Mar 19 12:10:28.161662 master-0 kubenswrapper[29612]: I0319 12:10:28.157690 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5cc5\" (UniqueName: \"kubernetes.io/projected/81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a-kube-api-access-v5cc5\") pod \"machine-approver-5c6485487f-b2w5m\" (UID: \"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a\") " pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" Mar 19 12:10:28.161662 master-0 kubenswrapper[29612]: I0319 12:10:28.157887 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h6qg\" (UniqueName: \"kubernetes.io/projected/b92c4c09-a796-4e00-821a-32fc9c8ca214-kube-api-access-6h6qg\") pod \"prometheus-operator-6c8df6d4b-tbjxv\" (UID: \"b92c4c09-a796-4e00-821a-32fc9c8ca214\") " pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" Mar 19 12:10:28.162719 master-0 kubenswrapper[29612]: I0319 12:10:28.162337 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bptbc\" (UniqueName: \"kubernetes.io/projected/e617e515-a899-41e8-9961-df68e49da4d3-kube-api-access-bptbc\") pod \"ovnkube-node-hm6q6\" (UID: \"e617e515-a899-41e8-9961-df68e49da4d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:28.193498 master-0 kubenswrapper[29612]: I0319 12:10:28.190314 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qkq\" (UniqueName: \"kubernetes.io/projected/65b629a3-ea10-4517-b762-b18b22fab2a9-kube-api-access-q9qkq\") pod \"network-check-target-tz4qg\" (UID: \"65b629a3-ea10-4517-b762-b18b22fab2a9\") " pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 12:10:28.210565 master-0 kubenswrapper[29612]: I0319 12:10:28.210539 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2vhrb\" (UniqueName: \"kubernetes.io/projected/bcc25a4a-521a-4139-a10c-d5b8bba6e359-kube-api-access-2vhrb\") pod \"multus-admission-controller-58c9f8fc64-vc6n8\" (UID: \"bcc25a4a-521a-4139-a10c-d5b8bba6e359\") " pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" Mar 19 12:10:28.226382 master-0 kubenswrapper[29612]: I0319 12:10:28.226333 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8765q\" (UniqueName: \"kubernetes.io/projected/b69a314d-4884-4f2c-a48e-1cf68c5aaef4-kube-api-access-8765q\") pod \"packageserver-86c8b5dddf-j95tg\" (UID: \"b69a314d-4884-4f2c-a48e-1cf68c5aaef4\") " pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 12:10:28.246626 master-0 kubenswrapper[29612]: I0319 12:10:28.246581 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4kn\" (UniqueName: \"kubernetes.io/projected/78d3b2df-3d81-413e-b039-80dc20111eae-kube-api-access-mm4kn\") pod \"cluster-node-tuning-operator-598fbc5f8f-vcxdb\" (UID: \"78d3b2df-3d81-413e-b039-80dc20111eae\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" Mar 19 12:10:28.248865 master-0 kubenswrapper[29612]: I0319 12:10:28.248697 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:28.248865 master-0 kubenswrapper[29612]: I0319 12:10:28.248844 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config\") pod 
\"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:28.249162 master-0 kubenswrapper[29612]: I0319 12:10:28.249137 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:28.249318 master-0 kubenswrapper[29612]: I0319 12:10:28.249298 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:28.249719 master-0 kubenswrapper[29612]: I0319 12:10:28.249701 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:28.249881 master-0 kubenswrapper[29612]: I0319 12:10:28.249405 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 
12:10:28.249989 master-0 kubenswrapper[29612]: I0319 12:10:28.249390 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-tls\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:28.250098 master-0 kubenswrapper[29612]: I0319 12:10:28.249625 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:28.250183 master-0 kubenswrapper[29612]: I0319 12:10:28.249405 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ca74427-9df7-44ca-ae91-0162f402644b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7bbc969446-dbm2x\" (UID: \"4ca74427-9df7-44ca-ae91-0162f402644b\") " pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" Mar 19 12:10:28.250597 master-0 kubenswrapper[29612]: I0319 12:10:28.250578 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:28.260121 master-0 kubenswrapper[29612]: I0319 12:10:28.260040 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gczvq\" (UniqueName: 
\"kubernetes.io/projected/4d625e92-793e-4942-a826-d30d21e3ca0c-kube-api-access-gczvq\") pod \"redhat-marketplace-6zxlv\" (UID: \"4d625e92-793e-4942-a826-d30d21e3ca0c\") " pod="openshift-marketplace/redhat-marketplace-6zxlv" Mar 19 12:10:28.330393 master-0 kubenswrapper[29612]: I0319 12:10:28.330337 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8qjv\" (UniqueName: \"kubernetes.io/projected/57d19c68-b469-4a6f-abe0-3a0eff32639c-kube-api-access-v8qjv\") pod \"ingress-canary-tzfl9\" (UID: \"57d19c68-b469-4a6f-abe0-3a0eff32639c\") " pod="openshift-ingress-canary/ingress-canary-tzfl9" Mar 19 12:10:28.351153 master-0 kubenswrapper[29612]: I0319 12:10:28.350361 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cvpt\" (UniqueName: \"kubernetes.io/projected/3f3c0234-248a-4d6f-9aca-6582a481a911-kube-api-access-8cvpt\") pod \"service-ca-operator-b865698dc-czzvw\" (UID: \"3f3c0234-248a-4d6f-9aca-6582a481a911\") " pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" Mar 19 12:10:28.356605 master-0 kubenswrapper[29612]: I0319 12:10:28.356566 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2knz\" (UniqueName: \"kubernetes.io/projected/e90c10f6-36d9-44bd-a4c6-174f12135434-kube-api-access-r2knz\") pod \"cluster-baremetal-operator-6f69995874-cwncj\" (UID: \"e90c10f6-36d9-44bd-a4c6-174f12135434\") " pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" Mar 19 12:10:28.367046 master-0 kubenswrapper[29612]: I0319 12:10:28.367005 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd875\" (UniqueName: \"kubernetes.io/projected/99ffcc9a-4b01-439a-93e0-0674bdfd32ec-kube-api-access-jd875\") pod \"csi-snapshot-controller-operator-5f5d689c6b-8nptp\" (UID: \"99ffcc9a-4b01-439a-93e0-0674bdfd32ec\") " 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp" Mar 19 12:10:28.368461 master-0 kubenswrapper[29612]: I0319 12:10:28.368423 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sprd\" (UniqueName: \"kubernetes.io/projected/131e2573-ef74-4963-bdcf-901310e7a486-kube-api-access-4sprd\") pod \"cluster-monitoring-operator-58845fbb57-j5rwg\" (UID: \"131e2573-ef74-4963-bdcf-901310e7a486\") " pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" Mar 19 12:10:28.380546 master-0 kubenswrapper[29612]: I0319 12:10:28.380427 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwcn\" (UniqueName: \"kubernetes.io/projected/d263ddac-1ede-474f-b3bb-1f1f4f7846a3-kube-api-access-rbwcn\") pod \"openshift-controller-manager-operator-8c94f4649-q46ht\" (UID: \"d263ddac-1ede-474f-b3bb-1f1f4f7846a3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" Mar 19 12:10:28.400996 master-0 kubenswrapper[29612]: I0319 12:10:28.400949 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65be54c-d851-4daa-93a0-8a3947da5e26-kube-api-access\") pod \"cluster-version-operator-7d58488df-4fpvq\" (UID: \"f65be54c-d851-4daa-93a0-8a3947da5e26\") " pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" Mar 19 12:10:28.408962 master-0 kubenswrapper[29612]: I0319 12:10:28.408917 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b48q2\" (UniqueName: \"kubernetes.io/projected/a5cf6404-1640-43d3-8018-03c62c8338c5-kube-api-access-b48q2\") pod \"dns-default-pgl92\" (UID: \"a5cf6404-1640-43d3-8018-03c62c8338c5\") " pod="openshift-dns/dns-default-pgl92" Mar 19 12:10:28.444191 master-0 kubenswrapper[29612]: I0319 12:10:28.444144 29612 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-gjj2t\" (UniqueName: \"kubernetes.io/projected/58a4e3af-c5e1-4487-92a2-81dfb696695e-kube-api-access-gjj2t\") pod \"catalogd-controller-manager-6864dc98f7-4np9g\" (UID: \"58a4e3af-c5e1-4487-92a2-81dfb696695e\") " pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:28.457929 master-0 kubenswrapper[29612]: I0319 12:10:28.457887 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/367ea4e8-8172-4730-8f26-499ed7c167bb-kube-api-access\") pod \"kube-apiserver-operator-8b68b9d9b-4tqrh\" (UID: \"367ea4e8-8172-4730-8f26-499ed7c167bb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" Mar 19 12:10:28.485233 master-0 kubenswrapper[29612]: I0319 12:10:28.485180 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bea42722-d7af-4938-9f3a-da57bd056f96-kube-api-access\") pod \"openshift-kube-scheduler-operator-dddff6458-hgwgj\" (UID: \"bea42722-d7af-4938-9f3a-da57bd056f96\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" Mar 19 12:10:28.512194 master-0 kubenswrapper[29612]: I0319 12:10:28.512139 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khlg\" (UniqueName: \"kubernetes.io/projected/518e1c6f-7f46-4dda-84a3-49fec9fa1abe-kube-api-access-4khlg\") pod \"ovnkube-control-plane-57f769d897-lhhbj\" (UID: \"518e1c6f-7f46-4dda-84a3-49fec9fa1abe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" Mar 19 12:10:28.513789 master-0 kubenswrapper[29612]: I0319 12:10:28.513763 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9g5b\" (UniqueName: \"kubernetes.io/projected/8b4a2e47-e85a-4474-a1b7-3eb85283dfcf-kube-api-access-x9g5b\") pod \"insights-operator-68bf6ff9d6-jn2px\" (UID: 
\"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf\") " pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" Mar 19 12:10:28.534450 master-0 kubenswrapper[29612]: I0319 12:10:28.534385 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/07804be1-d37c-474e-a179-01ac541d9705-bound-sa-token\") pod \"cluster-image-registry-operator-5549dc66cb-5twg4\" (UID: \"07804be1-d37c-474e-a179-01ac541d9705\") " pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" Mar 19 12:10:28.558835 master-0 kubenswrapper[29612]: I0319 12:10:28.558769 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5f65\" (UniqueName: \"kubernetes.io/projected/fbde1a69-b56c-467e-a291-9a83c448346f-kube-api-access-j5f65\") pod \"machine-api-operator-6fbb6cf6f9-c99xj\" (UID: \"fbde1a69-b56c-467e-a291-9a83c448346f\") " pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" Mar 19 12:10:28.579014 master-0 kubenswrapper[29612]: I0319 12:10:28.578937 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9zjb\" (UniqueName: \"kubernetes.io/projected/f5148a50-4332-4ab4-af63-449ef7f16333-kube-api-access-p9zjb\") pod \"node-resolver-j2kp7\" (UID: \"f5148a50-4332-4ab4-af63-449ef7f16333\") " pod="openshift-dns/node-resolver-j2kp7" Mar 19 12:10:28.599882 master-0 kubenswrapper[29612]: I0319 12:10:28.599814 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rztqj\" (UniqueName: \"kubernetes.io/projected/0f2278c1-05cf-469c-a13c-762b7a790b38-kube-api-access-rztqj\") pod \"cluster-autoscaler-operator-866dc4744-l2whx\" (UID: \"0f2278c1-05cf-469c-a13c-762b7a790b38\") " pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" Mar 19 12:10:28.648895 master-0 kubenswrapper[29612]: I0319 12:10:28.648837 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pxt2h\" (UniqueName: \"kubernetes.io/projected/1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6-kube-api-access-pxt2h\") pod \"machine-config-controller-b4f87c5b9-5sh6j\" (UID: \"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6\") " pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" Mar 19 12:10:28.673389 master-0 kubenswrapper[29612]: I0319 12:10:28.673306 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8mds\" (UniqueName: \"kubernetes.io/projected/51e2c1d1-a50f-4a0d-8273-440e699b3907-kube-api-access-j8mds\") pod \"network-node-identity-47j68\" (UID: \"51e2c1d1-a50f-4a0d-8273-440e699b3907\") " pod="openshift-network-node-identity/network-node-identity-47j68" Mar 19 12:10:28.693904 master-0 kubenswrapper[29612]: I0319 12:10:28.693854 29612 request.go:700] Waited for 4.292234752s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-multus/serviceaccounts/default/token Mar 19 12:10:28.706409 master-0 kubenswrapper[29612]: I0319 12:10:28.706360 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmbbm\" (UniqueName: \"kubernetes.io/projected/003e840c-1b07-4624-8225-e4ffdab74213-kube-api-access-hmbbm\") pod \"community-operators-tzckk\" (UID: \"003e840c-1b07-4624-8225-e4ffdab74213\") " pod="openshift-marketplace/community-operators-tzckk" Mar 19 12:10:28.708273 master-0 kubenswrapper[29612]: I0319 12:10:28.708243 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7cz7\" (UniqueName: \"kubernetes.io/projected/5f2d1d03-7427-45a0-a917-173bf9df9902-kube-api-access-z7cz7\") pod \"node-exporter-d48mb\" (UID: \"5f2d1d03-7427-45a0-a917-173bf9df9902\") " pod="openshift-monitoring/node-exporter-d48mb" Mar 19 12:10:28.713132 master-0 kubenswrapper[29612]: I0319 12:10:28.713102 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-xxm2c\" (UniqueName: \"kubernetes.io/projected/c479beae-6d34-4d49-81c8-39f6a53ff6ca-kube-api-access-xxm2c\") pod \"multus-additional-cni-plugins-n5wqf\" (UID: \"c479beae-6d34-4d49-81c8-39f6a53ff6ca\") " pod="openshift-multus/multus-additional-cni-plugins-n5wqf"
Mar 19 12:10:28.736010 master-0 kubenswrapper[29612]: I0319 12:10:28.735962 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-66b84d69b-59ptg_f38df702-a256-4f83-90bb-0c0e477a2ce6/ingress-operator/4.log"
Mar 19 12:10:28.740907 master-0 kubenswrapper[29612]: I0319 12:10:28.740798 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpdhv\" (UniqueName: \"kubernetes.io/projected/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-kube-api-access-hpdhv\") pod \"metrics-server-dc85487cf-gsmlh\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") " pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:10:28.746721 master-0 kubenswrapper[29612]: I0319 12:10:28.746449 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kf8c\" (UniqueName: \"kubernetes.io/projected/0fed1dcc-6fc0-41ef-8236-e095dca4ddcc-kube-api-access-9kf8c\") pod \"multus-p5wll\" (UID: \"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc\") " pod="openshift-multus/multus-p5wll"
Mar 19 12:10:28.768014 master-0 kubenswrapper[29612]: I0319 12:10:28.767971 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scp9p\" (UniqueName: \"kubernetes.io/projected/aab47a39-a18b-4350-b141-4d5de2d8e96f-kube-api-access-scp9p\") pod \"etcd-operator-8544cbcf9c-bj75g\" (UID: \"aab47a39-a18b-4350-b141-4d5de2d8e96f\") " pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g"
Mar 19 12:10:28.770125 master-0 kubenswrapper[29612]: I0319 12:10:28.770090 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9hnq\" (UniqueName: \"kubernetes.io/projected/09282e98-3140-41c9-863f-67e0e9f942cc-kube-api-access-x9hnq\") pod \"openshift-state-metrics-5dc6c74576-npqhj\" (UID: \"09282e98-3140-41c9-863f-67e0e9f942cc\") " pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj"
Mar 19 12:10:28.800388 master-0 kubenswrapper[29612]: I0319 12:10:28.800345 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn94k\" (UniqueName: \"kubernetes.io/projected/8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0-kube-api-access-rn94k\") pod \"machine-config-daemon-tmgtz\" (UID: \"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0\") " pod="openshift-machine-config-operator/machine-config-daemon-tmgtz"
Mar 19 12:10:28.866132 master-0 kubenswrapper[29612]: E0319 12:10:28.866033 29612 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.167s"
Mar 19 12:10:28.866132 master-0 kubenswrapper[29612]: I0319 12:10:28.866093 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 19 12:10:28.866132 master-0 kubenswrapper[29612]: I0319 12:10:28.866110 29612 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="27b79ae3-e8c0-4562-b1dc-164c9cc7ebb6"
Mar 19 12:10:28.866132 master-0 kubenswrapper[29612]: I0319 12:10:28.866136 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 19 12:10:28.866132 master-0 kubenswrapper[29612]: I0319 12:10:28.866148 29612 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="27b79ae3-e8c0-4562-b1dc-164c9cc7ebb6"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866161 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" event={"ID":"7af28c70-3ee0-438b-8655-95bc57deaa05","Type":"ContainerStarted","Data":"76d682fd002c8935fad25bd3dbe4a20bd96e06027fc3726ac0709746cb6820f3"}
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866219 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866344 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866388 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"eb4c5b33-58c8-4b51-b71b-1befc165f2b1","Type":"ContainerDied","Data":"bff053145b8a60a46c397e36d3fff2a19f60020e8f6b7b60fcc48ee33442df16"}
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866472 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4h84v"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866491 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866503 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866516 29612 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866527 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"eb4c5b33-58c8-4b51-b71b-1befc165f2b1","Type":"ContainerDied","Data":"4d0a460410b93716bcb9bd13e502416121bea8d7a0c8c686d890ff048f639230"}
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866540 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d0a460410b93716bcb9bd13e502416121bea8d7a0c8c686d890ff048f639230"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866554 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866566 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" event={"ID":"3a185e40-e97b-49e9-aea3-5f170401a9e3","Type":"ContainerStarted","Data":"0e2ded69012d6d01b6db55fbbd61785c43491f5fa298adf54541882050773b91"}
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866582 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" event={"ID":"3a185e40-e97b-49e9-aea3-5f170401a9e3","Type":"ContainerStarted","Data":"bdda154b1c9b78e1ad36a436d43e4b7ab0a54ce3c77530fd643cb895217936f1"}
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866661 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866683 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 19 12:10:28.866691 master-0 kubenswrapper[29612]: I0319 12:10:28.866697 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.866734 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.866748 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.866826 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.866895 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tzckk"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.866937 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.866953 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.866973 29612 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-kube-apiserver/kube-apiserver-master-0" containerID="cri-o://76d14e466053de21cd4b44b01526664874f27eff3ad2faef8933296d68c35040"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.866983 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.867020 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.867213 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:28.867755 master-0 kubenswrapper[29612]: I0319 12:10:28.867700 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:28.868601 master-0 kubenswrapper[29612]: I0319 12:10:28.868223 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6"
Mar 19 12:10:28.868601 master-0 kubenswrapper[29612]: I0319 12:10:28.868259 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-85f7577d78-rbshp" event={"ID":"3a185e40-e97b-49e9-aea3-5f170401a9e3","Type":"ContainerStarted","Data":"d34c10b1c43979eaba12a772c86e5b04cdb550f06f5159184d7c1b781b1802f8"}
Mar 19 12:10:28.868601 master-0 kubenswrapper[29612]: I0319 12:10:28.868404 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6"
Mar 19 12:10:28.868601 master-0 kubenswrapper[29612]: I0319 12:10:28.868454 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" event={"ID":"09282e98-3140-41c9-863f-67e0e9f942cc","Type":"ContainerStarted","Data":"7b2fe012b6b7f7dc2d751bb678215da259af72548d5ad40a959bc935d0c78ee7"}
Mar 19 12:10:28.868601 master-0 kubenswrapper[29612]: I0319 12:10:28.868486 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" event={"ID":"09282e98-3140-41c9-863f-67e0e9f942cc","Type":"ContainerStarted","Data":"cd74682d52422c519cea30b441c3a0e7c9f2dc07a0f3f2728bf32cc48f37ec2b"}
Mar 19 12:10:28.868601 master-0 kubenswrapper[29612]: I0319 12:10:28.868501 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" event={"ID":"09282e98-3140-41c9-863f-67e0e9f942cc","Type":"ContainerStarted","Data":"9d99fa05c7a5c6fa4b07e9ced3544d4c3af285d39eefa7a5647c21477748c4fd"}
Mar 19 12:10:28.869967 master-0 kubenswrapper[29612]: I0319 12:10:28.869570 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:28.869967 master-0 kubenswrapper[29612]: I0319 12:10:28.869625 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 12:10:28.869967 master-0 kubenswrapper[29612]: I0319 12:10:28.869665 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b"
Mar 19 12:10:28.870133 master-0 kubenswrapper[29612]: I0319 12:10:28.870084 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 12:10:28.870133 master-0 kubenswrapper[29612]: I0319 12:10:28.870116 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5dc6c74576-npqhj" event={"ID":"09282e98-3140-41c9-863f-67e0e9f942cc","Type":"ContainerStarted","Data":"c30aed6efb2687e8c9c0787c5b85c3ecc5ac073733a122b5233bd897825e7a16"}
Mar 19 12:10:28.870218 master-0 kubenswrapper[29612]: I0319 12:10:28.870148 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 12:10:28.870218 master-0 kubenswrapper[29612]: I0319 12:10:28.870160 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerStarted","Data":"befd94b8f19fd722cf8884718907d3b2c42d5f4a6e300a5487fce3bd4b878dd3"}
Mar 19 12:10:28.870218 master-0 kubenswrapper[29612]: I0319 12:10:28.870173 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4h84v"
Mar 19 12:10:28.870218 master-0 kubenswrapper[29612]: I0319 12:10:28.870182 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerDied","Data":"63b90693b783d2344a0175569017503f87f560c1b75623dba968fbe7562bf39a"}
Mar 19 12:10:28.870218 master-0 kubenswrapper[29612]: I0319 12:10:28.870208 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 12:10:28.870218 master-0 kubenswrapper[29612]: I0319 12:10:28.870222 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerDied","Data":"47183b768a7e3ff5da057dffa4e751f7869a36ae402360723f1788dfd4f7f643"}
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870232 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerDied","Data":"76315438a4a762dabbb5d7e77ae1a8b1b6d345329e45b5b2c2feae26b7df9765"}
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870246 29612 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870264 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870273 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-67dcd4998-54pv6" event={"ID":"50adfdd3-1d4d-47d2-a35b-cd028d66b4c6","Type":"ContainerStarted","Data":"781654899d52ad01c57541fb366677101c374a60f21c51b9f34aad61be33ef43"}
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870283 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerStarted","Data":"302da25d88feed044717793a912a25df7753b2c33b9c9627185444f708131b30"}
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870302 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870310 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerDied","Data":"246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d"}
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870324 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerStarted","Data":"e3d494447990668745e14695bfe1a5e74ab5c046360e769658a99bdf18dabcb5"}
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870349 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870360 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" event={"ID":"6e8c4676-05af-4478-a44b-1bdcbefdea0f","Type":"ContainerStarted","Data":"457f9713239213c35b97b7cfbf842ed633e63353105428ab228603e6b8c79dd8"}
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870371 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" event={"ID":"6e8c4676-05af-4478-a44b-1bdcbefdea0f","Type":"ContainerStarted","Data":"c88934bb050c8e715da2068469a1a7b6e480c5b3698e644706a803df72d12f5f"}
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870398 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4h84v"
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870410 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595"}
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870446 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf"
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870478 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b"
Mar 19 12:10:28.870485 master-0 kubenswrapper[29612]: I0319 12:10:28.870490 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"8e7a82869988463543d3d8dd1f0b5fe3","Type":"ContainerStarted","Data":"8dd5c45a98fabae2b43541f63682ae158d8af993ab1c5516b535075299ccb403"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.870696 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.870725 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" event={"ID":"33f6e54f-a750-4c91-b4f5-049275d33cfb","Type":"ContainerStarted","Data":"a600bd8f0138af8f048f9cf6ac0b29d4aabccd1b0f117b61a0f3f451ef5ef8af"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.870737 29612 scope.go:117] "RemoveContainer" containerID="246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.870747 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" event={"ID":"33f6e54f-a750-4c91-b4f5-049275d33cfb","Type":"ContainerStarted","Data":"0ca8349da9d248f4a62f6d3f31584f3be56f6e92cb3e5edd7a03d771500af4eb"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871340 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871389 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-9c5679d8f-m5dt4" event={"ID":"33f6e54f-a750-4c91-b4f5-049275d33cfb","Type":"ContainerStarted","Data":"768bbdfdb22a3faa73743433476fcca355077a35b64380dacae14d6ec58d72ab"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871449 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871466 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerDied","Data":"76d14e466053de21cd4b44b01526664874f27eff3ad2faef8933296d68c35040"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871482 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871503 29612 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" containerID="cri-o://302da25d88feed044717793a912a25df7753b2c33b9c9627185444f708131b30"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871514 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871526 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871540 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871575 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871590 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871620 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871659 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerDied","Data":"d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871694 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-tz4qg"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871710 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"903b634267c4d03a793b4b69c05dfc5d8a92f5023abbef9a61cb574e83dc2bee"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871723 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" event={"ID":"eb274dd5-e98e-4485-a050-7bec560a0daa","Type":"ContainerStarted","Data":"97ebb249079488276478fc372daed13b36528171102a7e1e1aab77ef4823b2a5"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.871738 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" event={"ID":"eb274dd5-e98e-4485-a050-7bec560a0daa","Type":"ContainerStarted","Data":"145e950b874609dccdd455c8b407e894b0b805b780d645f8b5cd83d4f2e5ddb0"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.872140 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.872207 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.872591 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.872659 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pgl92"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.872672 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" event={"ID":"eb274dd5-e98e-4485-a050-7bec560a0daa","Type":"ContainerDied","Data":"ecee444cb43be9a1ddd0e2f81e61eca30a1f71fa25000dc8d0f2c58cd47a25c6"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.872977 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873143 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873166 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-84d549f6d5-ndc5t" event={"ID":"eb274dd5-e98e-4485-a050-7bec560a0daa","Type":"ContainerStarted","Data":"854a387c5ef79581e793cea9af478d1025e855bc5db72359436a67d2aec956ce"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873198 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873236 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tzckk"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873262 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873278 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873291 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"3172c0e1310fe85b6969d859dd1e6cb260c3622246f9e1dccc9e19331599cb26"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873304 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"31cfcd5f067b44ea0b87c0a5aa2636c319389034426644966114a2ca89dbdeac"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873317 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"27f0975af44b0e0c4b5a0b780624b9d03d718a16509a17ac2eda6856f398d1f3"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873331 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerDied","Data":"f043145314a3afe735f976ccbd7c9f8a4f3dcf66cb61565878dcf3c6a12fe880"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873358 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerDied","Data":"820218d0107a773549d9d05e3670e6a7a1f29fee796eba888ebf85af0aa9baa7"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873371 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"8e27b7d086edf5d2cf47b703574641d8","Type":"ContainerStarted","Data":"d9bb5c5486d0d4e5249742ebeb70024222a47fabc60cb40b1224fe9ecd95ab7a"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873382 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerStarted","Data":"a7d2563fae71ee703cf5765c15eba0a6c9d82c859e7cfd778f661a109011742f"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873398 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerDied","Data":"3ab6e11a56aeea1e1090df7b641d026b31da61efe18c78570e3645eea987cc4a"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873503 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerStarted","Data":"e566658e07da148fd8797db81c134475cfca0b856d369e98428ade2c9af55e1c"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873534 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-6f69995874-cwncj" event={"ID":"e90c10f6-36d9-44bd-a4c6-174f12135434","Type":"ContainerStarted","Data":"cd89b54332e25652976ac12f30441cd06118b9d3055fbb57b6418f135e33839a"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873547 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tzfl9" event={"ID":"57d19c68-b469-4a6f-abe0-3a0eff32639c","Type":"ContainerStarted","Data":"e933a6546f56cf1635a68ba4de166dde3fd5d42b041b6441ff790cd2b34fc43c"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873561 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tzfl9" event={"ID":"57d19c68-b469-4a6f-abe0-3a0eff32639c","Type":"ContainerStarted","Data":"40169c23a8e39dfd514592b517b187a7da1bb95cbece39d06e03420e9e9f21b3"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873572 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"c0d8ad30-feb0-453f-b652-36aafc7b24f9","Type":"ContainerDied","Data":"711f8121453bce45bc824e3d6883cc554f357db4612fdf30ce1a48bb144904ff"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873584 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"c0d8ad30-feb0-453f-b652-36aafc7b24f9","Type":"ContainerDied","Data":"ab9b7175a769b8cf6fa9915cafc7ebea7b9ffcfdeed5fdbb917a19708d7ff10e"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873595 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9b7175a769b8cf6fa9915cafc7ebea7b9ffcfdeed5fdbb917a19708d7ff10e"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873607 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"72c2ed5a-4b8d-4528-a230-fe608ea55ad3","Type":"ContainerDied","Data":"f7e4a7cf437941fedc829554051c6d66a149aed6e58ff2aaeede0de923f84868"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873620 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"72c2ed5a-4b8d-4528-a230-fe608ea55ad3","Type":"ContainerDied","Data":"7ebabbcc152c2590dc658fb1f46bd235f46220d5cc0dd8b01757f71ab4189d17"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873637 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ebabbcc152c2590dc658fb1f46bd235f46220d5cc0dd8b01757f71ab4189d17"
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873664 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"563b304ab3f329515fc5902dab431b332849f6c45ef58e2a8305deced768a9eb"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873676 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"70ee74642d29e1b42d7e6fc003b0e3b2856a367fa6f34938ca3ae34c1becb9ec"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873687 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"6eb559eb0a3ed0b8b690c290eaee82218aa692c28e9cd60b46d871d2d5f31da3"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873699 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"894668843cc379ef3af14af02bd8810c94cf8d250ae2116e3028573643bd5152"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873713 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"fbffa3403c0c76ed26f94a9db8da7c5c7eba0d12a27cdac70fa8e9b94ea5d1ad"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873724 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"77d9ff79f7e20893fe69fd62607ea644dde99bd686f4f6e583b7ceec2d7aec30"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873737 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"2ed0f3afa93a58287d466e23ce923764ef467afba9f298045ac789afa5a06a3a"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873750 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"78ffa2b3d97ad72d0c1fb4664edd94ccca3e077568d34392a2813e8d87310b58"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873763 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerDied","Data":"ffac3bdf0b92a0122fbb8d6a02eb01304705a3c19a78b9c431bf0840fcd9389f"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873802 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" event={"ID":"e617e515-a899-41e8-9961-df68e49da4d3","Type":"ContainerStarted","Data":"33f47ea2b86b83a56cf1822a75688636158a335eb806da8b836ae68dd857bbae"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873816 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" event={"ID":"4ca74427-9df7-44ca-ae91-0162f402644b","Type":"ContainerStarted","Data":"33626a3a03e494d53eb0b45e3f0e9d0fade66c1dbdbd8409a2239da74bbd981e"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873831 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" event={"ID":"4ca74427-9df7-44ca-ae91-0162f402644b","Type":"ContainerStarted","Data":"f723a5b08f2ef8557d64dea29d9c9752cf6c0d407586477022828b2cd1778ed9"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873843 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" event={"ID":"4ca74427-9df7-44ca-ae91-0162f402644b","Type":"ContainerStarted","Data":"bc3aceab17b3c160a457b9b51da1c86ee3e1e7b578b1ef933673c5b54f7b7534"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873855 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7bbc969446-dbm2x" event={"ID":"4ca74427-9df7-44ca-ae91-0162f402644b","Type":"ContainerStarted","Data":"9f7852cce265f4731f8e5a50d5da937e705ba1159369968578b0f921316baf94"}
Mar 19 12:10:28.873616 master-0 kubenswrapper[29612]: I0319 12:10:28.873867 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"4bebd688c15bc9924a6633e91d629de13286a79430204ae499ef583b4192829b"}
Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874080 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"9fb4057cedb716012c53d5b5e73188baaa2d84c2bfe9a5b0b41a9d914c68c9a9"}
Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874100 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"006c51c13746449d97f89fd32f5c66c1fd844fcadae3ad681b363c4e8d85d73a"}
Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874113 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"afb68f1312817edfafcc486849026e0a22ee56001daacaa9fb327878eb9fdf35"}
Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874125 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"3cc7ba3003b9018ab0a4e14b7866663c2d4ae06af289a9d62d0178f2b00749cf"}
Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874136 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"fc11d745bcb55f0325836aa72854476958c768ae4522a0e6c750cb0c84bb6fcd"}
Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874149 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0"
event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"52e44a998be1c2142f3874b5ff4a8f40902b1f922d23a5db895af0f402eb6218"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874161 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerDied","Data":"fc1a5002078bd201b824e0cb01e13c7c268e7dc6df2889e483abb0d9ab95ab7f"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874173 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"094204df314fe45bd5af12ca1b4622bb","Type":"ContainerStarted","Data":"ff5132ef0b49543e2892994f5f8254a3ce97fc30f20a77cbc390ce279601737b"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874199 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" event={"ID":"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0","Type":"ContainerStarted","Data":"87a98c9fff8805e14bb0e09d73eb4c758fd6b46ec3819661dced16f114d43480"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874213 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" event={"ID":"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0","Type":"ContainerDied","Data":"064b176871fcbc879687ac7e9904eb1df9dbafb853b4d8ce41a9bbb9767a5a2c"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874226 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6f97756bc8-9fvr8" event={"ID":"d52981e0-0bc4-4dfa-81d5-aeab63fa23e0","Type":"ContainerStarted","Data":"a938fec278d403193e581844106b17006b55efc9ca616fe3e7d8c448d817c442"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874239 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-tz4qg" event={"ID":"65b629a3-ea10-4517-b762-b18b22fab2a9","Type":"ContainerStarted","Data":"a1b2ed2ffd2327cf8b909688fcb33a183368625a173ed7a0fd40651897a3bae2"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874251 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tz4qg" event={"ID":"65b629a3-ea10-4517-b762-b18b22fab2a9","Type":"ContainerStarted","Data":"ef597ecc3cb239a3f4a10a1a4b99299b44a320ef76f5f894771c1233b3937cda"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874264 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerStarted","Data":"91d2d9fbd93675cb6b8588342c80fd98749d4864eec3bdb5cf0aed0f65fff704"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874279 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerDied","Data":"a0078d1886890ad6f9b53500e856885a52b81c9832530c9463ebafbbef1dc099"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874293 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bd846bfc4-5qtx7" event={"ID":"b0133cb5-e425-40af-a051-71b2362a89c3","Type":"ContainerStarted","Data":"166b56b9c6c1434919d567d8df963a4c549223c203f79070beac456f5b57a166"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874305 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-r6tsg" event={"ID":"2452a909-6aa1-4750-9399-5945d380427a","Type":"ContainerDied","Data":"ec93bf01fde32bcc52dde2502ea534c23c521da7002fa0839646b010fe2c3b0c"} Mar 19 12:10:28.888814 master-0 
kubenswrapper[29612]: I0319 12:10:28.874461 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-r6tsg" event={"ID":"2452a909-6aa1-4750-9399-5945d380427a","Type":"ContainerDied","Data":"cefa044e9278fa0f7e0364a8c119ab9567645a4d070ede851a0064c55299d2ff"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874476 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cefa044e9278fa0f7e0364a8c119ab9567645a4d070ede851a0064c55299d2ff" Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874488 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" event={"ID":"d263ddac-1ede-474f-b3bb-1f1f4f7846a3","Type":"ContainerStarted","Data":"6af67af236dd0969ccd453e6a767e729b40cda7e647e2cedf72b5516dcb3f553"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874502 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" event={"ID":"d263ddac-1ede-474f-b3bb-1f1f4f7846a3","Type":"ContainerDied","Data":"7fc3d4238bc0fa4082c65174aac64034c89fc65c891756149ed46eacde6bb87a"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874516 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8c94f4649-q46ht" event={"ID":"d263ddac-1ede-474f-b3bb-1f1f4f7846a3","Type":"ContainerStarted","Data":"7478a36d7c10b347071336898ec1420b6e593642e73df5491b35d12f5dad7fb8"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874527 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" 
event={"ID":"bcc25a4a-521a-4139-a10c-d5b8bba6e359","Type":"ContainerStarted","Data":"59e44e1482a20a989757eaeac24f590e16924fc1f22b5b1d84ceef232530f2b0"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874540 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" event={"ID":"bcc25a4a-521a-4139-a10c-d5b8bba6e359","Type":"ContainerStarted","Data":"0bd48d8d40465277e592ebfacd5396ad3f4989dcc6911b25f2efff378d12a261"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874551 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-58c9f8fc64-vc6n8" event={"ID":"bcc25a4a-521a-4139-a10c-d5b8bba6e359","Type":"ContainerStarted","Data":"7ccb1bded92273b709b711951d032cf4685c9f074ee7f4072031bba69f2eeb3f"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874562 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" event={"ID":"63e81c6a-b723-4c9a-b471-226fefa9c7a7","Type":"ContainerStarted","Data":"27a59a5a4197676669c34f3133e3dc3f21912e88cbaa5a4cb220db22fdb857c5"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874575 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" event={"ID":"63e81c6a-b723-4c9a-b471-226fefa9c7a7","Type":"ContainerStarted","Data":"df5e3a3b450407477b1d624e52a1245a526ec5f6a1d53e96d8498482a8ee4d97"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874589 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-744f9dbf77-9r2n5" event={"ID":"63e81c6a-b723-4c9a-b471-226fefa9c7a7","Type":"ContainerStarted","Data":"2ccf8a3de86952995f531b01f4ffdc152c7bc1633eadc15477217fefaa028047"} Mar 19 12:10:28.888814 master-0 
kubenswrapper[29612]: I0319 12:10:28.874603 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" event={"ID":"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c","Type":"ContainerStarted","Data":"5b15f71769cb166bbe1854fa275ab6231d488a761919449830f9713c4bfd28ed"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874617 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" event={"ID":"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c","Type":"ContainerStarted","Data":"4daf44a10b1f8c76d5c5e205f3a48dbc84e86cc93251b0495b68e80ef60f9d35"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.874631 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j2kp7" event={"ID":"f5148a50-4332-4ab4-af63-449ef7f16333","Type":"ContainerStarted","Data":"d3da346d2a31113c42b29b7bafb0dcf087a712bb2d1d78530c61e5aaded3b8b1"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875472 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j2kp7" event={"ID":"f5148a50-4332-4ab4-af63-449ef7f16333","Type":"ContainerStarted","Data":"27bfc2e7edbcc6c9da4d8e8c7defa74d634ff9325db83c99710558275c0105da"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875495 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p5wll" event={"ID":"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc","Type":"ContainerStarted","Data":"4672e75211f24f87471dd8f82717ac1ac6c96e914525bedb7525f49512ca1f3b"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875537 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p5wll" event={"ID":"0fed1dcc-6fc0-41ef-8236-e095dca4ddcc","Type":"ContainerStarted","Data":"46f181cc8f14d83105f0e997edec7eec870d5b14b4f39f46ef15583420a1117b"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 
12:10:28.875555 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec","Type":"ContainerDied","Data":"882b540d45f96873ea08eb024547ab884cf7ba3eaac3163e6d0d3525a76f6dd5"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875627 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"18c0e2ac-b5c3-447d-96b4-37d7d0d733ec","Type":"ContainerDied","Data":"23e0fe7ade4ac6399005059c86578e590b8718c5b7857c0295aaa5d158549561"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875680 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23e0fe7ade4ac6399005059c86578e590b8718c5b7857c0295aaa5d158549561" Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875696 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" event={"ID":"07804be1-d37c-474e-a179-01ac541d9705","Type":"ContainerStarted","Data":"e30aedaab1a7306da7baa3b2d889ab22c8768e0b6b637ef8386a681ef31c78eb"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875711 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-5549dc66cb-5twg4" event={"ID":"07804be1-d37c-474e-a179-01ac541d9705","Type":"ContainerStarted","Data":"8db1413c53086a5c6db2b83441d84f9f315ce4cbb909fde99b5104ffbf2df8f1"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875774 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"4d3040ecd2a7e633dfd078caf5a95302a7ae4f555f4e48340660df3b62be5004"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 
12:10:28.875793 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"1596d61ab954fe05c17045a5f0656bd06f879c1529f23b7eb004e3ce610f3ceb"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875809 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerDied","Data":"763a20d7b98822675af380cf6397cfba29e067a2eddf07fc09dc4a05e4d6fa6d"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875846 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"1249822f86f23526277d165c0d5d3c19","Type":"ContainerStarted","Data":"2204f34436396e98278c315659c7576573f08e9de65d82a40c1e4f2441bcf459"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875861 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" event={"ID":"94fa5656-0561-45ab-9ab1-a6b96b8a47d4","Type":"ContainerStarted","Data":"2aca712aa2aa40b581cebaa07826a29d6ba9e6eed682d665dfe4f86d2005740f"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875873 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" event={"ID":"94fa5656-0561-45ab-9ab1-a6b96b8a47d4","Type":"ContainerDied","Data":"344d526c0d5c832dab03c63929de32c376b56407a1d6deb8f5c0fba230eec603"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875888 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" 
event={"ID":"94fa5656-0561-45ab-9ab1-a6b96b8a47d4","Type":"ContainerStarted","Data":"0e4b6062f9bb4931c27706fc37084b485dc72bc8ae6a89fcc0aec8ff5bbecaef"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875953 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-7b95f86987-ftz5b" event={"ID":"94fa5656-0561-45ab-9ab1-a6b96b8a47d4","Type":"ContainerStarted","Data":"9dd3eeed986a63782457294b98f428fdf5ac5bd98b40cab991676824cbd680b6"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875971 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" event={"ID":"4e7b04dc-e85d-41b8-959c-87b1b24731a1","Type":"ContainerStarted","Data":"b0d21e5f0486b201b3c06d890cda932f5d618ab09a01f42f7cc5e1ab3e144300"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.875987 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-79bc6b8d76-pdjtv" event={"ID":"4e7b04dc-e85d-41b8-959c-87b1b24731a1","Type":"ContainerStarted","Data":"3e041b219895e4a22c84b8979a6ff95eaead826c80bb84644fce5e3855731eba"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876000 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" event={"ID":"b69a314d-4884-4f2c-a48e-1cf68c5aaef4","Type":"ContainerStarted","Data":"2ad7b4f8961ebed8d5885645075f44c80d0763532cb9f34aad1f7b43a34d146f"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876043 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" event={"ID":"b69a314d-4884-4f2c-a48e-1cf68c5aaef4","Type":"ContainerStarted","Data":"107d14368724098585e35bbcdeba06ce6763b263ff68a1c0db81b1b11f0c59b6"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876057 29612 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"de6765b7-3576-4171-972b-20d62d81f25d","Type":"ContainerDied","Data":"f6792bda14390e5172a259445357feef34047b3b4035a2f72b67e0cd845a049a"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876077 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"de6765b7-3576-4171-972b-20d62d81f25d","Type":"ContainerDied","Data":"503bc58279f464c4d0d4ae650c56c03affe2a73598df280a9e969caf566182ed"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876110 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="503bc58279f464c4d0d4ae650c56c03affe2a73598df280a9e969caf566182ed" Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876124 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" event={"ID":"b92c4c09-a796-4e00-821a-32fc9c8ca214","Type":"ContainerStarted","Data":"45c789365c80ca55d3226f6deecb6d6efc3f93acf4b17785efdd5266a6618954"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876137 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" event={"ID":"b92c4c09-a796-4e00-821a-32fc9c8ca214","Type":"ContainerStarted","Data":"6a323fe5657fde9c52d45dc374ed9dd1c7bbeca037d1a14be52c107d4d949809"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876150 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c8df6d4b-tbjxv" event={"ID":"b92c4c09-a796-4e00-821a-32fc9c8ca214","Type":"ContainerStarted","Data":"2eac945a64b009901141c6c6e11ad1de23d05d8f601554a1ef6d7d7a8ce9169c"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876163 29612 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" event={"ID":"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6","Type":"ContainerStarted","Data":"9b8bb7056828913959d244048b2d339d012a91e455be453f7da59e9b6a47621a"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876206 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" event={"ID":"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6","Type":"ContainerStarted","Data":"ba3633214cc572239baa8cdfe50f5588ac58b20470e03aa47725744142808ca4"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876220 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" event={"ID":"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6","Type":"ContainerDied","Data":"285371d8dc1391b2aefa34f9aa23ab25dae7de5c3d27157be1b42472a35ec621"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876233 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-b4f87c5b9-5sh6j" event={"ID":"1f7d0b6f-91bf-4e6c-a090-77705ddc8bf6","Type":"ContainerStarted","Data":"ce92bb2deaa819fea113e3e46f2d6ce13e1b77b5e6309af9c4cd6c384874e217"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876249 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerStarted","Data":"e07058ba721733098fb4279e6e48522183a341fb7e3449f5a7693aab50f42f9f"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876291 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" 
event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerDied","Data":"a8932e5c8c4dcad1c7f461d22290ccccb070a0be05cbda27e90810a35028dcec"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876306 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-64854d9cff-6dlpw" event={"ID":"0c804ea0-3483-4506-b138-3bf30016d8d8","Type":"ContainerStarted","Data":"731fd6a0a7a380afb3c5fb06f8e5e559e768934564ab7fa113864f10ea901855"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876319 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" event={"ID":"60081e3b-4fe5-42a3-bfae-0b07853c0e36","Type":"ContainerStarted","Data":"6a205ee3ed4a769ac518763e05a3f6447d834f56c960144a073306636619a961"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876359 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" event={"ID":"60081e3b-4fe5-42a3-bfae-0b07853c0e36","Type":"ContainerStarted","Data":"cddea9ae351cdfdd1985ab5fdb11762d077c83cdb329eae761816bc9c824a2a1"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876377 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" event={"ID":"60081e3b-4fe5-42a3-bfae-0b07853c0e36","Type":"ContainerDied","Data":"fbe4d38a848a8f8c8ac70c6aaafe195a243cd98f3045f416707deb69b549d12e"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876391 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-57777556ff-vxsl4" 
event={"ID":"60081e3b-4fe5-42a3-bfae-0b07853c0e36","Type":"ContainerStarted","Data":"514e629483d9ce3a471489bf8ebd7e39de0e8072f7a78d8b93a7ef5b4897dfbc"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876402 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerStarted","Data":"02b3249f9637d6e6358afd5f27698b229ac5de189524a1cbe5bec5cb0b7e2803"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876444 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerDied","Data":"9ec4a3ca5402b255a5776345ac4dec708163999648880ceab54090ff9aba763c"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876462 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-b865698dc-czzvw" event={"ID":"3f3c0234-248a-4d6f-9aca-6582a481a911","Type":"ContainerStarted","Data":"f69cbe55fe655a8ec807171d1c92ba5b87bc09fe1df87bdf00cabcd0bab163dc"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876475 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" event={"ID":"131e2573-ef74-4963-bdcf-901310e7a486","Type":"ContainerStarted","Data":"5e5f7fc3ebf0c67acc7c96961d4c2909a3186127871fb38ba8defa7ad1c517d5"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876489 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-58845fbb57-j5rwg" event={"ID":"131e2573-ef74-4963-bdcf-901310e7a486","Type":"ContainerStarted","Data":"6f9d7a4e397f58614971bf45c37c0758f1dd839eda26eeb6acda239cb4547bd3"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 
12:10:28.876531 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" event={"ID":"518e1c6f-7f46-4dda-84a3-49fec9fa1abe","Type":"ContainerStarted","Data":"4f882838c61c3b7b61dcc60feb2b4a86a11287b46f9c3c99faba9a5cff759b8d"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876548 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" event={"ID":"518e1c6f-7f46-4dda-84a3-49fec9fa1abe","Type":"ContainerDied","Data":"f7738588625037912affdf5435640d17413e9cd9036722fe4bd36750bce1dc90"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876565 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" event={"ID":"518e1c6f-7f46-4dda-84a3-49fec9fa1abe","Type":"ContainerStarted","Data":"9285536db09b88ca539506b964929ed4bab9e9f5af82602c25ae73597e6b0465"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876578 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57f769d897-lhhbj" event={"ID":"518e1c6f-7f46-4dda-84a3-49fec9fa1abe","Type":"ContainerStarted","Data":"05ec088c819ae1eeb5148aff6d70b93f3b441541316091762e2b98d617be204f"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876614 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerStarted","Data":"31b268e0c623a15f5f95903154798432e7236486fabf05d30051e9c80ebbf93c"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876676 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" 
event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerStarted","Data":"a8fbcf038f8b67376900fffa9726e27e9af1c3c213a38a857a8441750c40f894"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876696 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerStarted","Data":"35e09c0adb6fc12f6eca598398ddbcf71d34e7200dffbfb6bdf2fe668ebaf179"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876710 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerDied","Data":"1807478431eec6d4ebe9e175a729aeaa97be5330ccb6e1fb9986d071c23571b4"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876750 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerDied","Data":"79d5dba168afeffdccc6ff9b4390e9029f185749a05da6ead57ad3001b8c3183"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876768 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7dff898856-8pnq2" event={"ID":"e47d3935-d96e-4fd2-abfc-99bff7eac5e0","Type":"ContainerStarted","Data":"67caa8f56a1602e423b6488bd40c30339516056ecee5a3793a01c82042f70327"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876782 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" 
event={"ID":"5b46c6b4-f701-4a7f-bf74-90afe14ad89a","Type":"ContainerStarted","Data":"1e71a71430ecca74b1dd6f7eede647761f2774d3e5737d1b310c911f67fbd631"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876796 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" event={"ID":"5b46c6b4-f701-4a7f-bf74-90afe14ad89a","Type":"ContainerStarted","Data":"fb94aad616f605f0c5b2b89174012856582d923dd7eb37448d29cf2ee0f920af"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876839 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" event={"ID":"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0","Type":"ContainerStarted","Data":"ca4a26017dcd0ac12df116562f9aac9730227426618d755dab28005b1e97a4e7"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876853 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" event={"ID":"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0","Type":"ContainerStarted","Data":"2d1796d1dcba5c87f0e3bf9e86d786a2f91769474b390114a0045bbf4895110f"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876865 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tmgtz" event={"ID":"8f356b2d-6d9e-4ec5-bab5-9bb48322a6c0","Type":"ContainerStarted","Data":"0dda2db92925393742e73de4713061d6f57b6007ca6270c56cb0bd0c372a97ea"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876877 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ql8d" event={"ID":"d979be56-f948-4252-ab6b-9bfc48dc4187","Type":"ContainerStarted","Data":"95cccba8c9993fe9b1eac34b42b9d8d0ed1d42c6ac23c7711fd331a30c761a8d"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876915 29612 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ql8d" event={"ID":"d979be56-f948-4252-ab6b-9bfc48dc4187","Type":"ContainerDied","Data":"ff2ebf3fa6730575b9447497330c1d66a64543dc66430125663573c61143b763"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876931 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ql8d" event={"ID":"d979be56-f948-4252-ab6b-9bfc48dc4187","Type":"ContainerDied","Data":"9340a8d37e83e7908c779e917b656b3529e86ab0084fd52b34de9d95fe7b99e9"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876945 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ql8d" event={"ID":"d979be56-f948-4252-ab6b-9bfc48dc4187","Type":"ContainerStarted","Data":"b1eb508b26c0e3a04c674bb8b166ab63a387c4e22fe8322ed85b0094f61f22e8"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.876958 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" event={"ID":"58a4e3af-c5e1-4487-92a2-81dfb696695e","Type":"ContainerStarted","Data":"d82d805964cbdcb6ff5d3bc38ec895ff8e1b88cc59ea6f6156f6d90acf9375a4"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877289 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" event={"ID":"58a4e3af-c5e1-4487-92a2-81dfb696695e","Type":"ContainerDied","Data":"3d22a1705e322b07bf63dc93baa238f057583ee120dd5b2d2d08b50e9d7a1b09"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877309 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" event={"ID":"58a4e3af-c5e1-4487-92a2-81dfb696695e","Type":"ContainerStarted","Data":"8085ddcc855855663d2f5770a76d3c79d6914a60f257f2223e1616226984fe66"} Mar 19 12:10:28.888814 master-0 
kubenswrapper[29612]: I0319 12:10:28.877321 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" event={"ID":"58a4e3af-c5e1-4487-92a2-81dfb696695e","Type":"ContainerStarted","Data":"5bea1d036c9faad13c1d14003487235e6d8ac60afd384e78842d83618960ce00"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877361 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" event={"ID":"f65be54c-d851-4daa-93a0-8a3947da5e26","Type":"ContainerStarted","Data":"ea563cc47199ce68f6d63dcb8538eab7a3283fca0aa6f632680e63df560c6774"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877380 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7d58488df-4fpvq" event={"ID":"f65be54c-d851-4daa-93a0-8a3947da5e26","Type":"ContainerStarted","Data":"294f59c88ee9d0ef672e140bfaeda4c298629a073fa82f34bad6f87890b02c82"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877392 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" event={"ID":"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e","Type":"ContainerStarted","Data":"5715f8e1d312ac24abd3853bc91bc1dc47d69fdf6e3f8c0fa99413c9a22da64c"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877405 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" event={"ID":"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e","Type":"ContainerDied","Data":"f4990111c5770baec2a2ee082f291300cc7420fd0573bb028eff8f0a2e8424eb"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877439 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" 
event={"ID":"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e","Type":"ContainerDied","Data":"d8055526d32c6d4c79231e8f49c9d8851d0cddeac9c1e90af414749e5818720e"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877457 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" event={"ID":"66b03ae9-d7a4-45fa-8e41-c7b7fd64b63e","Type":"ContainerStarted","Data":"da05a78204256ccd4edc262bb9b930e6804a73e9ac544e9244d9c5d71173e2a9"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877469 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"838a469d-2c27-46db-9d6f-17bfa1c8dd09","Type":"ContainerDied","Data":"bd4cbdd654e294cf12e1078774a4064deac39d97874d1f397bebdf9625d6fe36"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877484 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"838a469d-2c27-46db-9d6f-17bfa1c8dd09","Type":"ContainerDied","Data":"7d03f7fa3a14d9d32f0d91cb4cc6bc9bd274055b488260ab0433e18c2052b2ac"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877515 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d03f7fa3a14d9d32f0d91cb4cc6bc9bd274055b488260ab0433e18c2052b2ac" Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877528 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerStarted","Data":"890c19caa46394b533dbc648540873c8fe9d3b463f97ad09cc30eec2d6354bff"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877541 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" 
event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerDied","Data":"d743fbff3bc780b360a4927f7a272f0fb5047ed94867c7848a01cf51863441ae"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877554 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-d65958b8-ptjr9" event={"ID":"84e5577e-df8b-46a2-aff7-0bf90b5010d9","Type":"ContainerStarted","Data":"9246f0557504641b26756d5dcc3facbda0dfa6d1148c91b577840d8631f86b2e"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877566 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" event={"ID":"78d3b2df-3d81-413e-b039-80dc20111eae","Type":"ContainerStarted","Data":"7a11cd309bf0aadb171ec1091d9f6745f904977929ac03bf37c565cc447050f5"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877597 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-598fbc5f8f-vcxdb" event={"ID":"78d3b2df-3d81-413e-b039-80dc20111eae","Type":"ContainerStarted","Data":"b8b47e7cae81e2e8fc5d1fdd94e1150be80c99269a023622d4d29934167c52d2"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877610 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" event={"ID":"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf","Type":"ContainerStarted","Data":"b7626e54b5b928b45b3e5c53bd0e84b9d119a3c6eb3a52b158926e0cf76cc9ef"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877624 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-68bf6ff9d6-jn2px" event={"ID":"8b4a2e47-e85a-4474-a1b7-3eb85283dfcf","Type":"ContainerStarted","Data":"f233927f2993af718960afb2ae363b672383e871d2174bdf0f0ce3e81e24474f"} Mar 19 12:10:28.888814 master-0 
kubenswrapper[29612]: I0319 12:10:28.877663 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9w65f" event={"ID":"07eef031-edcc-415c-91d7-f9f3bd0a45e3","Type":"ContainerStarted","Data":"4e8446e16d9a3735605f6f6ec51d20f491b3eb101b6c3fd3ca5dfd89335292d1"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877678 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9w65f" event={"ID":"07eef031-edcc-415c-91d7-f9f3bd0a45e3","Type":"ContainerStarted","Data":"b09ba92792d71b97e085101b3f5f0d26504a7ce27d6d2f696d8e3319844365b2"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877689 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j6n2j" event={"ID":"64136215-8539-4349-977f-6d59deb132ea","Type":"ContainerStarted","Data":"eaa4320256fd1e5782890c1a2e18f3c707f378b1f8c9a5b0ab85a3c02fed3649"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877700 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j6n2j" event={"ID":"64136215-8539-4349-977f-6d59deb132ea","Type":"ContainerStarted","Data":"46de87bd542fc48b4c2304420a056b910916afb6d58cd22606b6fcceab380367"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877712 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" event={"ID":"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a","Type":"ContainerStarted","Data":"a23a6b5358beeaf5e645643ffb405a80fb4cf54ade8b5a4f38d294751cca2a76"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877749 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" 
event={"ID":"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a","Type":"ContainerDied","Data":"7fc5e15dbb622b807625a944b8961634a70f1aecfe4e4c996f59e0fe79999af9"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877766 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" event={"ID":"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a","Type":"ContainerStarted","Data":"ac74b38fd11c535ddb03b556aacc67450651b8de55a732a49658d786ebe0a734"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877777 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5c6485487f-b2w5m" event={"ID":"81ba5bc0-fec9-41e5-b3d9-df17d7f1bd2a","Type":"ContainerStarted","Data":"4f20fa38da6728fdc70e1b66426e311b675e090aa8e87afaac1c37a5210cd1c8"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877789 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" event={"ID":"fbde1a69-b56c-467e-a291-9a83c448346f","Type":"ContainerStarted","Data":"5e1c65e27dc3f0a6a757a9d44783d533d108753111fcd05bc2b04cd4929ec893"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877852 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" event={"ID":"fbde1a69-b56c-467e-a291-9a83c448346f","Type":"ContainerStarted","Data":"d9f2dee995f42cd5f80f8d3478e2f0c9d499af83bd740001cdc2fee3f2f3dbde"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877868 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-6fbb6cf6f9-c99xj" event={"ID":"fbde1a69-b56c-467e-a291-9a83c448346f","Type":"ContainerStarted","Data":"0c58fd5ea7dfc487f516ab9f02025ffdb09d90a9ab5d62ff3f2b99a062549935"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877907 29612 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" event={"ID":"162b513f-2f67-4946-816b-48f6232dfca0","Type":"ContainerStarted","Data":"416d98dbc9819bb0bc11ccd6915d47895190d1c6a05b913f67a9dc85ee34a4a3"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877926 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" event={"ID":"162b513f-2f67-4946-816b-48f6232dfca0","Type":"ContainerDied","Data":"2ae789390c2dc2661b322d4ce95cf71fe928a1384e00118bd9c25a6d60f35352"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877938 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5885bfd7f4-vt498" event={"ID":"162b513f-2f67-4946-816b-48f6232dfca0","Type":"ContainerStarted","Data":"28c9e269179e8f2b2c8470e0dc5902c0174b162f0c6e40253ff15a0c8e960364"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877951 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nkd77" event={"ID":"3e1b71a7-c9fd-4f03-8d0c-f092dac80989","Type":"ContainerStarted","Data":"a526f0ca6aa3cc2a180780d0441d60c91cb569e7abaf7f74b844db9d23815ca8"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.877989 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nkd77" event={"ID":"3e1b71a7-c9fd-4f03-8d0c-f092dac80989","Type":"ContainerStarted","Data":"779dc5b5369bd95c8f8ad0170a9271f13ea8126f017abb36186ce64bcd027dfb"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878004 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nkd77" 
event={"ID":"3e1b71a7-c9fd-4f03-8d0c-f092dac80989","Type":"ContainerStarted","Data":"e5b8532c133a52bef4f3c34e53a76d2b97e7bbb39e5937c556c2dbe0cf940780"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878019 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" event={"ID":"4705c8f3-89d4-42f2-ad14-e591c5380b9e","Type":"ContainerStarted","Data":"6247fd87460d07ebeb5aa88a818055c6dd5c137e669ed6cad1d8a519f682618d"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878033 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" event={"ID":"4705c8f3-89d4-42f2-ad14-e591c5380b9e","Type":"ContainerDied","Data":"8cf7c92799c042639046f9139208de62b93f5c3c4be8bc7881686ff116d251d3"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878069 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" event={"ID":"4705c8f3-89d4-42f2-ad14-e591c5380b9e","Type":"ContainerDied","Data":"b8135311023b73f1dff2f33a6eb221f0fcb0bdc9e1cd016fb390a5b4192441f6"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878084 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" event={"ID":"4705c8f3-89d4-42f2-ad14-e591c5380b9e","Type":"ContainerStarted","Data":"153273eefe31c48a690ffc5c0ad5ae91d19fbcce6af4622e890f0127bcac992e"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878096 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" event={"ID":"a5d009a3-5ff2-49f9-9cea-004b99729b77","Type":"ContainerStarted","Data":"7181b8b01999fa52cdbfd11c75a2b16463e9e25a0cd9e66237c938d0d2ec42d5"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 
12:10:28.878112 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-7d87854d6-qzxkm" event={"ID":"a5d009a3-5ff2-49f9-9cea-004b99729b77","Type":"ContainerStarted","Data":"414a48d3cf2977fd972a56aeae2697291bc4403c105ce776bfadef125a18facb"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878144 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h84v" event={"ID":"24bd6a9b-9934-4cf9-b591-1ab892c8014c","Type":"ContainerStarted","Data":"ffb9eefebe45bd6f6b90e5cfb48390a803f89852e6c65d7be29a92976dc945b7"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878160 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h84v" event={"ID":"24bd6a9b-9934-4cf9-b591-1ab892c8014c","Type":"ContainerDied","Data":"7b4959df4babfe207d0b7474ab31deaecbea2e2b9d22f6c7f79a7c10a529fc14"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878172 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h84v" event={"ID":"24bd6a9b-9934-4cf9-b591-1ab892c8014c","Type":"ContainerDied","Data":"237540f70e2c29f3039bdaf2115302712ba9278f06779f1d068dbc0f51f6e8cd"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878182 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h84v" event={"ID":"24bd6a9b-9934-4cf9-b591-1ab892c8014c","Type":"ContainerStarted","Data":"d318202a66572196e0681affd0ab554e8fa6ff748647c2d4a3aedba6fee2ec96"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878194 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerStarted","Data":"30addaa4982d26bdfb258a1bbf60fe2e386d618f7ada4f3702760789c65504d1"} Mar 
19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878228 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"61ceef49c4b3e3c92de2f92ba07df5cc9f3437ed0a986fcdd8cd9a85e9afbe78"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878243 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"c0d30e67fd289a28538ef92cb1f3d25d05768acb2306cf21bc6d90958cdbdf09"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878258 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"ff56157a2c5ab60201123fe865c3891754204ecbb4e8ba10184f66ac03510978"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878269 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"20f4956468238d356cba6c4e8281a37386fede658fd1e93306b89426a80c8130"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878300 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"57e2b9e5df1e4dbc4f600dfa7e9acded55e04fe34916a6250fcf8bdcd8821c9c"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878314 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" 
event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerDied","Data":"7e880bfa7477300228e9f011e131c05d9e3d415f1d3d007729e1d9e7b578b4cc"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878326 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n5wqf" event={"ID":"c479beae-6d34-4d49-81c8-39f6a53ff6ca","Type":"ContainerStarted","Data":"cc5d3dd72022502d64692c766ea5e10f49edacc520e37b4968118ca3c399a379"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878337 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" event={"ID":"8117f6b0-0b9f-4a1d-9807-bf73c19f388b","Type":"ContainerStarted","Data":"79954f84dfc2314904bbb3f98c0c06dea0bbe35428696cef252929fc38f3689e"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878350 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-69c6b55594-52ww6" event={"ID":"8117f6b0-0b9f-4a1d-9807-bf73c19f388b","Type":"ContainerStarted","Data":"444bcf401c29be212027c38d60d3155ce14ee9b37c07f012dc81ebccfc8c5a7d"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878382 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" event={"ID":"bea42722-d7af-4938-9f3a-da57bd056f96","Type":"ContainerStarted","Data":"fc31578b7ae316f99c92805e4da52c101e78e491450534381d082992bf61e52a"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878401 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" event={"ID":"bea42722-d7af-4938-9f3a-da57bd056f96","Type":"ContainerDied","Data":"15fc833de80b2cae87e3ef1229c5aae30a9051318a9e0831bf339fc664c1dd1c"} Mar 19 12:10:28.888814 
master-0 kubenswrapper[29612]: I0319 12:10:28.878413 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-dddff6458-hgwgj" event={"ID":"bea42722-d7af-4938-9f3a-da57bd056f96","Type":"ContainerStarted","Data":"4cea9c5260823fbd81d53a50c86a7838135f8c5ff405a05f4d12cdbb4fd9069e"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878425 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"01529a13-e1b1-40a1-9fff-8a20f1cba68c","Type":"ContainerDied","Data":"825551a6cbd6e2d9a62ce681600e2897018550eae2ad6b2a4c579d2b878da6e9"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878437 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"01529a13-e1b1-40a1-9fff-8a20f1cba68c","Type":"ContainerDied","Data":"9c2f6ee2f4f5a8055122de400fd9243193e98b0769b1970bf4b2bfb820983158"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878468 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c2f6ee2f4f5a8055122de400fd9243193e98b0769b1970bf4b2bfb820983158" Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878481 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" event={"ID":"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc","Type":"ContainerStarted","Data":"6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878493 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" event={"ID":"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc","Type":"ContainerDied","Data":"da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b"} Mar 19 12:10:28.888814 master-0 
kubenswrapper[29612]: I0319 12:10:28.878504 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" event={"ID":"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc","Type":"ContainerStarted","Data":"9bc0a922f5672d138d9b5c38908ff9271f1747dee23c3b328889da1f8a25634f"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878515 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zxlv" event={"ID":"4d625e92-793e-4942-a826-d30d21e3ca0c","Type":"ContainerStarted","Data":"f7ff1ad801ba3c97ab9cdbd502c5a44b3579c7a5686bfae47cda73c6ece4216d"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878548 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zxlv" event={"ID":"4d625e92-793e-4942-a826-d30d21e3ca0c","Type":"ContainerDied","Data":"023b8459d54078f874c794a42aed9a6fd6fbb4ddfa6be97808a092ec4ffe3187"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878561 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zxlv" event={"ID":"4d625e92-793e-4942-a826-d30d21e3ca0c","Type":"ContainerDied","Data":"67dee3634bf8afc5cf33505c6686845e34de0d09fbd040277b4cc0507631dc5b"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878572 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zxlv" event={"ID":"4d625e92-793e-4942-a826-d30d21e3ca0c","Type":"ContainerStarted","Data":"fed06e56f643611f5a7965ed4f8766bb3618f49ec311e6c29d1c5641de5af6fe"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878588 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzckk" 
event={"ID":"003e840c-1b07-4624-8225-e4ffdab74213","Type":"ContainerStarted","Data":"531e1af91168e7c40c25cd45e3b2a8c46d00c5c08a0f607a449f4726fdf3817b"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878618 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzckk" event={"ID":"003e840c-1b07-4624-8225-e4ffdab74213","Type":"ContainerDied","Data":"e2e764dd4d611abac99b99d586297942d72b2e3f1434058cd7e0a11729f21ba8"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878675 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzckk" event={"ID":"003e840c-1b07-4624-8225-e4ffdab74213","Type":"ContainerDied","Data":"0d16f994259c86f779a870061760bfaa564ec639686f9c38a22c5ca129c5726f"} Mar 19 12:10:28.888814 master-0 kubenswrapper[29612]: I0319 12:10:28.878691 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzckk" event={"ID":"003e840c-1b07-4624-8225-e4ffdab74213","Type":"ContainerStarted","Data":"cad025d780875a4ff5ddd5b35a2f470eaade918be2cd73b0ab7727feccc93241"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.878703 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d48mb" event={"ID":"5f2d1d03-7427-45a0-a917-173bf9df9902","Type":"ContainerStarted","Data":"20c63945e832497a7fd0819ea7c26a11a69792158beab08ae38dda25ad0ee6af"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.878717 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d48mb" event={"ID":"5f2d1d03-7427-45a0-a917-173bf9df9902","Type":"ContainerStarted","Data":"9c450dc363933ea47014a9ef4addea98317470b78816983a716d19f7d723e91f"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.878788 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-d48mb" event={"ID":"5f2d1d03-7427-45a0-a917-173bf9df9902","Type":"ContainerDied","Data":"3b0c232c84654528ea64bdd18d2136082e46bb45cce0608d6a300d9ff0fe9984"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.878830 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d48mb" event={"ID":"5f2d1d03-7427-45a0-a917-173bf9df9902","Type":"ContainerStarted","Data":"64f8ff5be7bb724395a941290a38553d4d3a994e6e3322cf64e60c9441115924"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.878846 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" event={"ID":"367ea4e8-8172-4730-8f26-499ed7c167bb","Type":"ContainerStarted","Data":"6c401036ffac2cfdbd3366c4981a259963d196c9de9faa702ed7fb448793228c"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882408 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" event={"ID":"367ea4e8-8172-4730-8f26-499ed7c167bb","Type":"ContainerDied","Data":"ec872b5060e715523dcf7612ce03c05df430a828612dcdbf8f83f43994a2ae84"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882464 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-8b68b9d9b-4tqrh" event={"ID":"367ea4e8-8172-4730-8f26-499ed7c167bb","Type":"ContainerStarted","Data":"f66b3d91451352dfa89636c0bda6df9983fe04ab531bc054759d8ab19287dc35"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882627 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tz4qg" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882680 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" event={"ID":"f93ee7db-f0ed-470f-a458-86c54e6d064a","Type":"ContainerStarted","Data":"d568f722cd6e7340e98eb715f4b1221131491d92ea5892b562e44ef11c0035d7"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882735 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882812 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-95bf4f4d-9bqjr" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882828 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-b4bf74f6-5jjzp" event={"ID":"f93ee7db-f0ed-470f-a458-86c54e6d064a","Type":"ContainerStarted","Data":"e4eb5ff21caa542685bdf5f202cb97c48c846bd09976e5d64727ae00a75371c6"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882844 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" event={"ID":"aab47a39-a18b-4350-b141-4d5de2d8e96f","Type":"ContainerStarted","Data":"2c48d09a5f376ea01a7df4083c720e54288e0edcd8e095a1b16617d5d890a246"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882859 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" event={"ID":"aab47a39-a18b-4350-b141-4d5de2d8e96f","Type":"ContainerDied","Data":"3385d26311ce81e00023dfaf9b5057451bdc1320414436c96c2f92229792ca5f"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882872 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-8544cbcf9c-bj75g" 
event={"ID":"aab47a39-a18b-4350-b141-4d5de2d8e96f","Type":"ContainerStarted","Data":"b179219ef24884bfcb110bf155456a2ca0b90c733780b9ebce585f74be18971b"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882886 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerDied","Data":"2585e3c8a21fe77ef0a54c935bf49a872dc17099314a87be4e929dac1b0cd111"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882903 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerStarted","Data":"0e52d93d005b0e420b3ffb2e9181ac41115554d0386bf658491e57c893340a8e"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882916 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerStarted","Data":"ea9e9d1680b5aa838fb75a46daa08b37fa0baf3eb1a7aa70313d288a46e0ae6c"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882927 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" event={"ID":"6d017409-a362-4e58-87af-d02b3834fdd4","Type":"ContainerStarted","Data":"54294092abe32456d3b18b9bdddc46fd2b3ebcf51048219020a08440353bf3ae"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882939 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" event={"ID":"6d017409-a362-4e58-87af-d02b3834fdd4","Type":"ContainerDied","Data":"016ff43860287ef9ea9c7834d6813f132931ea72467cbcc206c5b8bf5a2a1e8b"} Mar 19 
12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882957 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6bb5bfb6fd-plmfv" event={"ID":"6d017409-a362-4e58-87af-d02b3834fdd4","Type":"ContainerStarted","Data":"d0c18b672eeccf56227de74bfddbdb7bc277e8432c83dff62ae2a878becfbd41"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882969 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pgl92" event={"ID":"a5cf6404-1640-43d3-8018-03c62c8338c5","Type":"ContainerStarted","Data":"d42385be491bfadd3c061b3f36ad26a198c0870f8b5d2bbe763f20ae41fee79a"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882983 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pgl92" event={"ID":"a5cf6404-1640-43d3-8018-03c62c8338c5","Type":"ContainerStarted","Data":"0d98adb678ff894fda362481253a39cc289070f56a3c0fa8ab4f7475629152b0"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.882994 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pgl92" event={"ID":"a5cf6404-1640-43d3-8018-03c62c8338c5","Type":"ContainerStarted","Data":"a77515f16c17046f407c9f07e5609964e9814a68e8f45a75bb351406b4dcfc20"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883513 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x66j8" event={"ID":"a49a8a5e-23fb-46fb-b184-e09da4817028","Type":"ContainerStarted","Data":"a8c7d4a5bc868c6b8326143c35e5db20aa58db16a5dbc92a5b41ae595b41f4c0"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883538 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x66j8" 
event={"ID":"a49a8a5e-23fb-46fb-b184-e09da4817028","Type":"ContainerStarted","Data":"a0c37fe14e6ead412d8c2590ada12be7a2da79d1d1f7b5edc5d8494f1a42a084"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883551 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" event={"ID":"819c8a76-019b-4124-a67f-fb5838cbec5d","Type":"ContainerStarted","Data":"f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883592 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" event={"ID":"819c8a76-019b-4124-a67f-fb5838cbec5d","Type":"ContainerDied","Data":"171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883608 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" event={"ID":"819c8a76-019b-4124-a67f-fb5838cbec5d","Type":"ContainerStarted","Data":"a297c2fb1d857d07bfeb6ac6ca57bb65e15ebd85864af6b5b40028961499f08b"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883621 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerStarted","Data":"7972279b24eb7f8c4dea473b490207bcfc9b6d387850efeb2be0506a615c093f"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883672 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerDied","Data":"6efb9a3f210b061f188c57eb6303722d321f0f70ab8c3b6975d1007d3830e3ef"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883690 29612 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerStarted","Data":"02574594c73f86272f0873c80450519dc1df034fda7734dbd2138fffd21a3c3b"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883702 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-47j68" event={"ID":"51e2c1d1-a50f-4a0d-8273-440e699b3907","Type":"ContainerStarted","Data":"a0a1f456d58f5fe3ea6e2a83bb8132e1e74b27ab4f7de8d0d24d1771a1a6887f"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883715 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2" event={"ID":"4d790b42-2182-4b9c-8391-285e938715cc","Type":"ContainerStarted","Data":"7d607bc604afb38bbd70378afd53bf1d6787e2bd2b574ea5a541d63fd9c62121"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883759 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2" event={"ID":"4d790b42-2182-4b9c-8391-285e938715cc","Type":"ContainerStarted","Data":"c0cb9d988f8fe88f88f11a1292927add52343057f8dfc3f622fbc5e70249c9ae"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883774 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-8487694857-52lv2" event={"ID":"4d790b42-2182-4b9c-8391-285e938715cc","Type":"ContainerStarted","Data":"6b5887fbc5e32ebc2f94544f4a59e3db7e9426484412d0d7e1662fd8761f75d2"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883787 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp" 
event={"ID":"99ffcc9a-4b01-439a-93e0-0674bdfd32ec","Type":"ContainerStarted","Data":"a148f6c3b7285c6b23a088afec6a5093a43e8ac079cf43adfcb33eec79989b03"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.883802 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5f5d689c6b-8nptp" event={"ID":"99ffcc9a-4b01-439a-93e0-0674bdfd32ec","Type":"ContainerStarted","Data":"fc7608745c820a5982e84b8da2f7efb84edcce40d138c9eddb94055840313768"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884060 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerStarted","Data":"ab7fdc4501825667c1d30afdaf806675afdeb81ef5c5029c7fc21f6a98742e71"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884213 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerDied","Data":"c8521feb576750f4b4cf92dbf1c496a19af50a44e812267935ec3c28f71bb348"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884230 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-ff989d6cc-8xkv5" event={"ID":"ac946049-4b93-4d8c-bfa6-eacf83160ed1","Type":"ContainerStarted","Data":"d8f0e6a1ec121bbc7ed73597f1167b65da5617d6a824495c565577e16dc06d7d"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884242 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"44d258c9-f8b3-4233-847f-bc594a2aa4f9","Type":"ContainerDied","Data":"64567b7ee5dbb1259b892f9e1f4fd87cb14b22e497b596ff7c977ffde93bc8bd"} Mar 19 
12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884257 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"44d258c9-f8b3-4233-847f-bc594a2aa4f9","Type":"ContainerDied","Data":"31ec56fb2316131201c360b4c07233afd130ca3089ceb18077f9033121d6ab17"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884680 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ec56fb2316131201c360b4c07233afd130ca3089ceb18077f9033121d6ab17" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884701 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"85ed437129c34727828994007c76166519f0d49369f91dbf9d5e4dc1693434d5"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884770 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884785 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884822 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"dfc697cd77881315e28c4429b102a58e90448d7960e79b2437cb6fdd1f387a62"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884841 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"61ee084e983f277942e90341d08d9836dacf57a63fdca8d267638b871e818f19"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884853 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884866 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"4d4b4168db53bd2fbee264c121d01b11e0ff09bed322f776f7a2db8af96a040f"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884900 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"088cbe88ab52727a6e61abb80a2ff6d59de2b9b87fe822e87c7612a6a67923bc"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884916 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" event={"ID":"39f479b3-45ff-43f9-ae85-083554a54918","Type":"ContainerStarted","Data":"fbdc790b03bdff951c38ccc0431bdc0b9ab80a50a61e36054192578c5f30e4de"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884931 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" event={"ID":"39f479b3-45ff-43f9-ae85-083554a54918","Type":"ContainerDied","Data":"9d5a5757181aa7140885d0b2bdc37c14b8a849c385bb89cb00edfe5781ce2cdf"} Mar 19 12:10:28.897900 
master-0 kubenswrapper[29612]: I0319 12:10:28.884944 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-89ccd998f-4h6xn" event={"ID":"39f479b3-45ff-43f9-ae85-083554a54918","Type":"ContainerStarted","Data":"d405214a0101e6f5383ea0ec9dd8d0328f92f63d0275413db715a759bd43a1b8"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884955 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"1ec99218-1644-461e-abb2-14d0c5369e96","Type":"ContainerDied","Data":"16da51ad34fc7222c4ea68656e0145a3d1eba549a2e561b21ed00a07283c919f"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.884992 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"1ec99218-1644-461e-abb2-14d0c5369e96","Type":"ContainerDied","Data":"f8905320489e881a81a42f634618f6b131a28e4ffcb6733be10fc6117bce6479"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.885002 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8905320489e881a81a42f634618f6b131a28e4ffcb6733be10fc6117bce6479" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.885013 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" event={"ID":"0f2278c1-05cf-469c-a13c-762b7a790b38","Type":"ContainerStarted","Data":"4cbcc11104b1dd8e7e5c1754bbcb53f46a0043fca70876d066f039a1531022a4"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.885032 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" event={"ID":"0f2278c1-05cf-469c-a13c-762b7a790b38","Type":"ContainerStarted","Data":"553238f28fef1cda6c72bfa4bf9821b26a728290573ff1064951e4a70d55ddbf"} Mar 19 12:10:28.897900 master-0 
kubenswrapper[29612]: I0319 12:10:28.885070 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-866dc4744-l2whx" event={"ID":"0f2278c1-05cf-469c-a13c-762b7a790b38","Type":"ContainerStarted","Data":"a99d279c1652e2b6be126b985d6ebabfca8ecedc501010b1c82126f8611b9e4b"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.885084 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.885098 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerDied","Data":"302da25d88feed044717793a912a25df7753b2c33b9c9627185444f708131b30"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.885115 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b45ea2ef1cf2bc9d1d994d6538ae0a64","Type":"ContainerStarted","Data":"5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.885154 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"1044262ff415484653556cf7ff298d5c42e6df054a66a5ea6fc06f00ef84999d"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.885437 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" 
event={"ID":"46108693-3c9e-4804-beb0-d6b8f8df7625","Type":"ContainerStarted","Data":"c35336e0693b1b7bd899f762b48a3f5a062ef0d2ac8ccda500bb69c9b7df2281"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.885464 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-66b84d69b-59ptg" event={"ID":"f38df702-a256-4f83-90bb-0c0e477a2ce6","Type":"ContainerStarted","Data":"3db515de43356c32e909b833d71cb7634a25f514027cb6e1a0acaeb3d82c7298"} Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.886112 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5c9796789-jsp5b" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.886584 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68f85b4d6c-qldxf" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.886629 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-6864dc98f7-4np9g" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.887114 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pgl92" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.889343 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z" Mar 19 12:10:28.897900 master-0 kubenswrapper[29612]: I0319 12:10:28.893848 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-86c8b5dddf-j95tg" Mar 19 12:10:28.901523 master-0 kubenswrapper[29612]: I0319 12:10:28.898204 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:10:28.928214 master-0 kubenswrapper[29612]: I0319 12:10:28.928149 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:28.931441 master-0 kubenswrapper[29612]: I0319 12:10:28.931286 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:28.942808 master-0 kubenswrapper[29612]: I0319 12:10:28.942768 29612 scope.go:117] "RemoveContainer" containerID="694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885" Mar 19 12:10:28.946968 master-0 kubenswrapper[29612]: I0319 12:10:28.946899 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tzckk" Mar 19 12:10:28.959223 master-0 kubenswrapper[29612]: I0319 12:10:28.959172 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6q6" Mar 19 12:10:28.959424 master-0 kubenswrapper[29612]: I0319 12:10:28.959242 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6zxlv" Mar 19 12:10:28.970958 master-0 kubenswrapper[29612]: I0319 12:10:28.963315 29612 scope.go:117] "RemoveContainer" containerID="400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072" Mar 19 12:10:29.001228 master-0 kubenswrapper[29612]: I0319 12:10:28.998056 29612 scope.go:117] "RemoveContainer" containerID="450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4" Mar 19 12:10:29.015388 master-0 kubenswrapper[29612]: I0319 12:10:29.014520 29612 scope.go:117] "RemoveContainer" containerID="694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885" Mar 19 12:10:29.015388 master-0 kubenswrapper[29612]: E0319 12:10:29.015054 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885\": container with ID starting with 694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885 not found: ID does not exist" containerID="694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885" Mar 19 12:10:29.015388 master-0 kubenswrapper[29612]: I0319 12:10:29.015100 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885"} err="failed to get container status \"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885\": rpc error: code = NotFound desc = could not find container \"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885\": container with ID starting with 694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885 not found: ID does not exist" Mar 19 12:10:29.015388 master-0 kubenswrapper[29612]: I0319 12:10:29.015253 29612 scope.go:117] "RemoveContainer" containerID="400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: E0319 12:10:29.015529 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072\": container with ID starting with 400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072 not found: ID does not exist" containerID="400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.015554 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072"} err="failed to get container status \"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072\": rpc error: code = NotFound desc = could not 
find container \"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072\": container with ID starting with 400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072 not found: ID does not exist" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.015572 29612 scope.go:117] "RemoveContainer" containerID="450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: E0319 12:10:29.015781 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4\": container with ID starting with 450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4 not found: ID does not exist" containerID="450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.015802 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4"} err="failed to get container status \"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4\": rpc error: code = NotFound desc = could not find container \"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4\": container with ID starting with 450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4 not found: ID does not exist" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.015820 29612 scope.go:117] "RemoveContainer" containerID="694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.016007 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885"} err="failed to get container status 
\"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885\": rpc error: code = NotFound desc = could not find container \"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885\": container with ID starting with 694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885 not found: ID does not exist" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.016023 29612 scope.go:117] "RemoveContainer" containerID="400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.016173 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072"} err="failed to get container status \"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072\": rpc error: code = NotFound desc = could not find container \"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072\": container with ID starting with 400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072 not found: ID does not exist" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.016194 29612 scope.go:117] "RemoveContainer" containerID="450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.016414 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4"} err="failed to get container status \"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4\": rpc error: code = NotFound desc = could not find container \"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4\": container with ID starting with 450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4 not found: ID does not exist" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 
12:10:29.016433 29612 scope.go:117] "RemoveContainer" containerID="694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.016659 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885"} err="failed to get container status \"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885\": rpc error: code = NotFound desc = could not find container \"694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885\": container with ID starting with 694700ac69f55baa5b052d498e3bf6ff8ed1cf62bf4f1aa7bd0950954b37a885 not found: ID does not exist" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.016679 29612 scope.go:117] "RemoveContainer" containerID="400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.016854 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072"} err="failed to get container status \"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072\": rpc error: code = NotFound desc = could not find container \"400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072\": container with ID starting with 400ece4a2da6d3f37d8dba843caa6809a9ba737c8548558e70c18325885b3072 not found: ID does not exist" Mar 19 12:10:29.016964 master-0 kubenswrapper[29612]: I0319 12:10:29.016875 29612 scope.go:117] "RemoveContainer" containerID="450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4" Mar 19 12:10:29.018180 master-0 kubenswrapper[29612]: I0319 12:10:29.018089 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4"} err="failed to get 
container status \"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4\": rpc error: code = NotFound desc = could not find container \"450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4\": container with ID starting with 450324de6898abf67a817eac28d1d86ead0de12518fb2994d13239322dc70bb4 not found: ID does not exist" Mar 19 12:10:29.018180 master-0 kubenswrapper[29612]: I0319 12:10:29.018112 29612 scope.go:117] "RemoveContainer" containerID="246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d" Mar 19 12:10:29.018422 master-0 kubenswrapper[29612]: E0319 12:10:29.018364 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d\": container with ID starting with 246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d not found: ID does not exist" containerID="246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d" Mar 19 12:10:29.018422 master-0 kubenswrapper[29612]: I0319 12:10:29.018407 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d"} err="failed to get container status \"246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d\": rpc error: code = NotFound desc = could not find container \"246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d\": container with ID starting with 246e602447d142fba954b12783d2f20b0c716af5c661aa5e586116fd136ed56d not found: ID does not exist" Mar 19 12:10:29.147438 master-0 kubenswrapper[29612]: I0319 12:10:29.147380 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7dcf5569b5-47tqf" Mar 19 12:10:29.154518 master-0 kubenswrapper[29612]: I0319 12:10:29.154378 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:10:29.600392 master-0 kubenswrapper[29612]: I0319 12:10:29.600262 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:29.758721 master-0 kubenswrapper[29612]: I0319 12:10:29.758650 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:10:29.766576 master-0 kubenswrapper[29612]: I0319 12:10:29.766470 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7dcf5569b5-47tqf"
Mar 19 12:10:30.906669 master-0 kubenswrapper[29612]: I0319 12:10:30.905893 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=6.905878498 podStartE2EDuration="6.905878498s" podCreationTimestamp="2026-03-19 12:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:10:30.905313612 +0000 UTC m=+38.219482633" watchObservedRunningTime="2026-03-19 12:10:30.905878498 +0000 UTC m=+38.220047519"
Mar 19 12:10:31.735082 master-0 kubenswrapper[29612]: I0319 12:10:31.734996 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=7.734975174 podStartE2EDuration="7.734975174s" podCreationTimestamp="2026-03-19 12:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:10:31.732824623 +0000 UTC m=+39.046993664" watchObservedRunningTime="2026-03-19 12:10:31.734975174 +0000 UTC m=+39.049144195"
Mar 19 12:10:33.143701 master-0 kubenswrapper[29612]: I0319 12:10:33.143654 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-66dcf44d9d-46w4r"
Mar 19 12:10:33.144425 master-0 kubenswrapper[29612]: I0319 12:10:33.144398 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-5c7d7c6bb8-gnz2z"
Mar 19 12:10:34.580719 master-0 kubenswrapper[29612]: I0319 12:10:34.580474 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:34.582570 master-0 kubenswrapper[29612]: I0319 12:10:34.582522 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:10:34.598842 master-0 kubenswrapper[29612]: I0319 12:10:34.598787 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0"
Mar 19 12:10:37.908006 master-0 kubenswrapper[29612]: I0319 12:10:37.907725 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6ql8d"
Mar 19 12:10:37.921673 master-0 kubenswrapper[29612]: I0319 12:10:37.921561 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4h84v"
Mar 19 12:10:38.533180 master-0 kubenswrapper[29612]: I0319 12:10:38.533116 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6zxlv"
Mar 19 12:10:38.813674 master-0 kubenswrapper[29612]: I0319 12:10:38.813503 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tzckk"
Mar 19 12:10:43.170453 master-0 kubenswrapper[29612]: I0319 12:10:43.170399 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-dc5db69db-jnd29"]
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170687 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="838a469d-2c27-46db-9d6f-17bfa1c8dd09" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170703 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="838a469d-2c27-46db-9d6f-17bfa1c8dd09" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170730 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2452a909-6aa1-4750-9399-5945d380427a" containerName="assisted-installer-controller"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170738 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2452a909-6aa1-4750-9399-5945d380427a" containerName="assisted-installer-controller"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170753 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170762 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170773 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6765b7-3576-4171-972b-20d62d81f25d" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170780 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6765b7-3576-4171-972b-20d62d81f25d" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170794 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec99218-1644-461e-abb2-14d0c5369e96" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170812 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec99218-1644-461e-abb2-14d0c5369e96" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170824 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170832 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170847 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170855 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170868 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01529a13-e1b1-40a1-9fff-8a20f1cba68c" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170875 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="01529a13-e1b1-40a1-9fff-8a20f1cba68c" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170886 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c2ed5a-4b8d-4528-a230-fe608ea55ad3" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170894 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c2ed5a-4b8d-4528-a230-fe608ea55ad3" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170911 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d8ad30-feb0-453f-b652-36aafc7b24f9" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170918 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d8ad30-feb0-453f-b652-36aafc7b24f9" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170929 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d258c9-f8b3-4233-847f-bc594a2aa4f9" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170935 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d258c9-f8b3-4233-847f-bc594a2aa4f9" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170947 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4c5b33-58c8-4b51-b71b-1befc165f2b1" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170953 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4c5b33-58c8-4b51-b71b-1befc165f2b1" containerName="installer"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: E0319 12:10:43.170966 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.170971 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 19 12:10:43.171051 master-0 kubenswrapper[29612]: I0319 12:10:43.171069 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d258c9-f8b3-4233-847f-bc594a2aa4f9" containerName="installer"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171083 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4c5b33-58c8-4b51-b71b-1befc165f2b1" containerName="installer"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171094 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="01529a13-e1b1-40a1-9fff-8a20f1cba68c" containerName="installer"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171103 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d8ad30-feb0-453f-b652-36aafc7b24f9" containerName="installer"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171110 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c0e2ac-b5c3-447d-96b4-37d7d0d733ec" containerName="installer"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171119 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec99218-1644-461e-abb2-14d0c5369e96" containerName="installer"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171129 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171145 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171156 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fac1b46a11e49501805e891baae4a9" containerName="setup"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171165 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="838a469d-2c27-46db-9d6f-17bfa1c8dd09" containerName="installer"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171173 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6765b7-3576-4171-972b-20d62d81f25d" containerName="installer"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171184 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="2452a909-6aa1-4750-9399-5945d380427a" containerName="assisted-installer-controller"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171192 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="72c2ed5a-4b8d-4528-a230-fe608ea55ad3" containerName="installer"
Mar 19 12:10:43.171888 master-0 kubenswrapper[29612]: I0319 12:10:43.171534 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-dc5db69db-jnd29"
Mar 19 12:10:43.174721 master-0 kubenswrapper[29612]: I0319 12:10:43.174687 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-f7v29"
Mar 19 12:10:43.174885 master-0 kubenswrapper[29612]: I0319 12:10:43.174865 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 19 12:10:43.184165 master-0 kubenswrapper[29612]: I0319 12:10:43.184110 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-dc5db69db-jnd29"]
Mar 19 12:10:43.307353 master-0 kubenswrapper[29612]: I0319 12:10:43.307312 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/aac614b2-ad99-43df-845b-f5a34e9a11ea-monitoring-plugin-cert\") pod \"monitoring-plugin-dc5db69db-jnd29\" (UID: \"aac614b2-ad99-43df-845b-f5a34e9a11ea\") " pod="openshift-monitoring/monitoring-plugin-dc5db69db-jnd29"
Mar 19 12:10:43.409424 master-0 kubenswrapper[29612]: I0319 12:10:43.409335 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/aac614b2-ad99-43df-845b-f5a34e9a11ea-monitoring-plugin-cert\") pod \"monitoring-plugin-dc5db69db-jnd29\" (UID: \"aac614b2-ad99-43df-845b-f5a34e9a11ea\") " pod="openshift-monitoring/monitoring-plugin-dc5db69db-jnd29"
Mar 19 12:10:43.410092 master-0 kubenswrapper[29612]: I0319 12:10:43.410042 29612 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 19 12:10:43.414687 master-0 kubenswrapper[29612]: I0319 12:10:43.414618 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/aac614b2-ad99-43df-845b-f5a34e9a11ea-monitoring-plugin-cert\") pod \"monitoring-plugin-dc5db69db-jnd29\" (UID: \"aac614b2-ad99-43df-845b-f5a34e9a11ea\") " pod="openshift-monitoring/monitoring-plugin-dc5db69db-jnd29"
Mar 19 12:10:43.502741 master-0 kubenswrapper[29612]: I0319 12:10:43.502572 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-dc5db69db-jnd29"
Mar 19 12:10:43.980581 master-0 kubenswrapper[29612]: I0319 12:10:43.980518 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-dc5db69db-jnd29"]
Mar 19 12:10:43.984134 master-0 kubenswrapper[29612]: W0319 12:10:43.984080 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaac614b2_ad99_43df_845b_f5a34e9a11ea.slice/crio-90773adbfe841a9e2a9db496d9e9c02c42974fc67029ff4be605354f0ce5a37d WatchSource:0}: Error finding container 90773adbfe841a9e2a9db496d9e9c02c42974fc67029ff4be605354f0ce5a37d: Status 404 returned error can't find the container with id 90773adbfe841a9e2a9db496d9e9c02c42974fc67029ff4be605354f0ce5a37d
Mar 19 12:10:43.987801 master-0 kubenswrapper[29612]: I0319 12:10:43.987763 29612 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 12:10:44.603381 master-0 kubenswrapper[29612]: I0319 12:10:44.603326 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:10:44.872934 master-0 kubenswrapper[29612]: I0319 12:10:44.872737 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-dc5db69db-jnd29" event={"ID":"aac614b2-ad99-43df-845b-f5a34e9a11ea","Type":"ContainerStarted","Data":"90773adbfe841a9e2a9db496d9e9c02c42974fc67029ff4be605354f0ce5a37d"}
Mar 19 12:10:46.888602 master-0 kubenswrapper[29612]: I0319 12:10:46.888538 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-dc5db69db-jnd29" event={"ID":"aac614b2-ad99-43df-845b-f5a34e9a11ea","Type":"ContainerStarted","Data":"a74aa6f93df1bc218f014900d4466b13a77b1fcba93f4cf3830b17d79f396494"}
Mar 19 12:10:46.889946 master-0 kubenswrapper[29612]: I0319 12:10:46.889901 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-dc5db69db-jnd29"
Mar 19 12:10:46.911683 master-0 kubenswrapper[29612]: I0319 12:10:46.911619 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-dc5db69db-jnd29" podStartSLOduration=26.082128791 podStartE2EDuration="27.91160466s" podCreationTimestamp="2026-03-19 12:10:19 +0000 UTC" firstStartedPulling="2026-03-19 12:10:43.987681305 +0000 UTC m=+51.301850326" lastFinishedPulling="2026-03-19 12:10:45.817157144 +0000 UTC m=+53.131326195" observedRunningTime="2026-03-19 12:10:46.911426555 +0000 UTC m=+54.225595626" watchObservedRunningTime="2026-03-19 12:10:46.91160466 +0000 UTC m=+54.225773681"
Mar 19 12:10:46.914497 master-0 kubenswrapper[29612]: I0319 12:10:46.914460 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-dc5db69db-jnd29"
Mar 19 12:10:48.835972 master-0 kubenswrapper[29612]: I0319 12:10:48.835905 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:10:49.073272 master-0 kubenswrapper[29612]: I0319 12:10:49.073202 29612 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:10:49.073551 master-0 kubenswrapper[29612]: I0319 12:10:49.073474 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" containerID="cri-o://c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595" gracePeriod=5
Mar 19 12:10:54.669165 master-0 kubenswrapper[29612]: I0319 12:10:54.669025 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_8e7a82869988463543d3d8dd1f0b5fe3/startup-monitor/0.log"
Mar 19 12:10:54.669165 master-0 kubenswrapper[29612]: I0319 12:10:54.669121 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:54.770086 master-0 kubenswrapper[29612]: I0319 12:10:54.770034 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 12:10:54.770428 master-0 kubenswrapper[29612]: I0319 12:10:54.770169 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock" (OuterVolumeSpecName: "var-lock") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:10:54.770586 master-0 kubenswrapper[29612]: I0319 12:10:54.770555 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 12:10:54.770818 master-0 kubenswrapper[29612]: I0319 12:10:54.770792 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 12:10:54.771006 master-0 kubenswrapper[29612]: I0319 12:10:54.770623 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests" (OuterVolumeSpecName: "manifests") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:10:54.771006 master-0 kubenswrapper[29612]: I0319 12:10:54.770916 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log" (OuterVolumeSpecName: "var-log") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:10:54.771218 master-0 kubenswrapper[29612]: I0319 12:10:54.771195 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 12:10:54.771346 master-0 kubenswrapper[29612]: I0319 12:10:54.771329 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") pod \"8e7a82869988463543d3d8dd1f0b5fe3\" (UID: \"8e7a82869988463543d3d8dd1f0b5fe3\") "
Mar 19 12:10:54.771467 master-0 kubenswrapper[29612]: I0319 12:10:54.771430 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:10:54.771945 master-0 kubenswrapper[29612]: I0319 12:10:54.771916 29612 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-log\") on node \"master-0\" DevicePath \"\""
Mar 19 12:10:54.772097 master-0 kubenswrapper[29612]: I0319 12:10:54.772074 29612 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:10:54.772208 master-0 kubenswrapper[29612]: I0319 12:10:54.772189 29612 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 12:10:54.772328 master-0 kubenswrapper[29612]: I0319 12:10:54.772309 29612 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-manifests\") on node \"master-0\" DevicePath \"\""
Mar 19 12:10:54.775967 master-0 kubenswrapper[29612]: I0319 12:10:54.775894 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "8e7a82869988463543d3d8dd1f0b5fe3" (UID: "8e7a82869988463543d3d8dd1f0b5fe3"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:10:54.873477 master-0 kubenswrapper[29612]: I0319 12:10:54.873409 29612 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e7a82869988463543d3d8dd1f0b5fe3-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:10:54.919809 master-0 kubenswrapper[29612]: I0319 12:10:54.919663 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7a82869988463543d3d8dd1f0b5fe3" path="/var/lib/kubelet/pods/8e7a82869988463543d3d8dd1f0b5fe3/volumes"
Mar 19 12:10:54.920132 master-0 kubenswrapper[29612]: I0319 12:10:54.920049 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID=""
Mar 19 12:10:54.938391 master-0 kubenswrapper[29612]: I0319 12:10:54.938334 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:10:54.938391 master-0 kubenswrapper[29612]: I0319 12:10:54.938375 29612 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f013194e-2e0c-4e40-b87e-01ea64b3f12e"
Mar 19 12:10:54.942601 master-0 kubenswrapper[29612]: I0319 12:10:54.942521 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:10:54.942860 master-0 kubenswrapper[29612]: I0319 12:10:54.942829 29612 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f013194e-2e0c-4e40-b87e-01ea64b3f12e"
Mar 19 12:10:54.968088 master-0 kubenswrapper[29612]: I0319 12:10:54.968038 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_8e7a82869988463543d3d8dd1f0b5fe3/startup-monitor/0.log"
Mar 19 12:10:54.968279 master-0 kubenswrapper[29612]: I0319 12:10:54.968094 29612 generic.go:334] "Generic (PLEG): container finished" podID="8e7a82869988463543d3d8dd1f0b5fe3" containerID="c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595" exitCode=137
Mar 19 12:10:54.968279 master-0 kubenswrapper[29612]: I0319 12:10:54.968136 29612 scope.go:117] "RemoveContainer" containerID="c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595"
Mar 19 12:10:54.968279 master-0 kubenswrapper[29612]: I0319 12:10:54.968208 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:10:54.985605 master-0 kubenswrapper[29612]: I0319 12:10:54.985546 29612 scope.go:117] "RemoveContainer" containerID="c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595"
Mar 19 12:10:54.986212 master-0 kubenswrapper[29612]: E0319 12:10:54.986150 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595\": container with ID starting with c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595 not found: ID does not exist" containerID="c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595"
Mar 19 12:10:54.986298 master-0 kubenswrapper[29612]: I0319 12:10:54.986210 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595"} err="failed to get container status \"c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595\": rpc error: code = NotFound desc = could not find container \"c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595\": container with ID starting with c9e73de8d37f9782491df2e34ef79e893615d28f4d1392de882e6d20263ca595 not found: ID does not exist"
Mar 19 12:11:00.417523 master-0 kubenswrapper[29612]: I0319 12:11:00.417470 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79589db566-gb7tq"]
Mar 19 12:11:00.418517 master-0 kubenswrapper[29612]: I0319 12:11:00.418464 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" podUID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerName="controller-manager" containerID="cri-o://f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf" gracePeriod=30
Mar 19 12:11:00.445071 master-0 kubenswrapper[29612]: I0319 12:11:00.445004 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"]
Mar 19 12:11:00.445356 master-0 kubenswrapper[29612]: I0319 12:11:00.445310 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" containerID="cri-o://6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72" gracePeriod=30
Mar 19 12:11:01.005785 master-0 kubenswrapper[29612]: I0319 12:11:01.005739 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 12:11:01.013313 master-0 kubenswrapper[29612]: I0319 12:11:01.013271 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 12:11:01.017695 master-0 kubenswrapper[29612]: I0319 12:11:01.017600 29612 generic.go:334] "Generic (PLEG): container finished" podID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerID="f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf" exitCode=0
Mar 19 12:11:01.017802 master-0 kubenswrapper[29612]: I0319 12:11:01.017694 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" event={"ID":"819c8a76-019b-4124-a67f-fb5838cbec5d","Type":"ContainerDied","Data":"f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf"}
Mar 19 12:11:01.017802 master-0 kubenswrapper[29612]: I0319 12:11:01.017727 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq" event={"ID":"819c8a76-019b-4124-a67f-fb5838cbec5d","Type":"ContainerDied","Data":"a297c2fb1d857d07bfeb6ac6ca57bb65e15ebd85864af6b5b40028961499f08b"}
Mar 19 12:11:01.017802 master-0 kubenswrapper[29612]: I0319 12:11:01.017749 29612 scope.go:117] "RemoveContainer" containerID="f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf"
Mar 19 12:11:01.017802 master-0 kubenswrapper[29612]: I0319 12:11:01.017697 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79589db566-gb7tq"
Mar 19 12:11:01.020796 master-0 kubenswrapper[29612]: I0319 12:11:01.020536 29612 generic.go:334] "Generic (PLEG): container finished" podID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerID="6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72" exitCode=0
Mar 19 12:11:01.020796 master-0 kubenswrapper[29612]: I0319 12:11:01.020580 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" event={"ID":"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc","Type":"ContainerDied","Data":"6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72"}
Mar 19 12:11:01.020796 master-0 kubenswrapper[29612]: I0319 12:11:01.020607 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"
Mar 19 12:11:01.020796 master-0 kubenswrapper[29612]: I0319 12:11:01.020608 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt" event={"ID":"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc","Type":"ContainerDied","Data":"9bc0a922f5672d138d9b5c38908ff9271f1747dee23c3b328889da1f8a25634f"}
Mar 19 12:11:01.041583 master-0 kubenswrapper[29612]: I0319 12:11:01.041534 29612 scope.go:117] "RemoveContainer" containerID="171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464"
Mar 19 12:11:01.059750 master-0 kubenswrapper[29612]: I0319 12:11:01.056764 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64sr2\" (UniqueName: \"kubernetes.io/projected/819c8a76-019b-4124-a67f-fb5838cbec5d-kube-api-access-64sr2\") pod \"819c8a76-019b-4124-a67f-fb5838cbec5d\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") "
Mar 19 12:11:01.059750 master-0 kubenswrapper[29612]: I0319 12:11:01.056872 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert\") pod \"819c8a76-019b-4124-a67f-fb5838cbec5d\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") "
Mar 19 12:11:01.059750 master-0 kubenswrapper[29612]: I0319 12:11:01.056954 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles\") pod \"819c8a76-019b-4124-a67f-fb5838cbec5d\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") "
Mar 19 12:11:01.059750 master-0 kubenswrapper[29612]: I0319 12:11:01.057029 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwmkf\" (UniqueName: \"kubernetes.io/projected/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-kube-api-access-nwmkf\") pod \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") "
Mar 19 12:11:01.059750 master-0 kubenswrapper[29612]: I0319 12:11:01.057084 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca\") pod \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") "
Mar 19 12:11:01.059750 master-0 kubenswrapper[29612]: I0319 12:11:01.057162 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert\") pod \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") "
Mar 19 12:11:01.059750 master-0 kubenswrapper[29612]: I0319 12:11:01.057190 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config\") pod \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\" (UID: \"12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc\") "
Mar 19 12:11:01.059750 master-0 kubenswrapper[29612]: I0319 12:11:01.057264 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca\") pod \"819c8a76-019b-4124-a67f-fb5838cbec5d\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") "
Mar 19 12:11:01.059750 master-0 kubenswrapper[29612]: I0319 12:11:01.057324 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config\") pod \"819c8a76-019b-4124-a67f-fb5838cbec5d\" (UID: \"819c8a76-019b-4124-a67f-fb5838cbec5d\") "
Mar 19 12:11:01.059750 master-0 kubenswrapper[29612]: I0319 12:11:01.058530 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config" (OuterVolumeSpecName: "config") pod "819c8a76-019b-4124-a67f-fb5838cbec5d" (UID: "819c8a76-019b-4124-a67f-fb5838cbec5d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:11:01.061228 master-0 kubenswrapper[29612]: I0319 12:11:01.060010 29612 scope.go:117] "RemoveContainer" containerID="f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf"
Mar 19 12:11:01.061228 master-0 kubenswrapper[29612]: I0319 12:11:01.060575 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca" (OuterVolumeSpecName: "client-ca") pod "819c8a76-019b-4124-a67f-fb5838cbec5d" (UID: "819c8a76-019b-4124-a67f-fb5838cbec5d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:11:01.061228 master-0 kubenswrapper[29612]: E0319 12:11:01.060653 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf\": container with ID starting with f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf not found: ID does not exist" containerID="f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf"
Mar 19 12:11:01.061228 master-0 kubenswrapper[29612]: I0319 12:11:01.060674 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "819c8a76-019b-4124-a67f-fb5838cbec5d" (UID: "819c8a76-019b-4124-a67f-fb5838cbec5d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:11:01.061228 master-0 kubenswrapper[29612]: I0319 12:11:01.060711 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf"} err="failed to get container status \"f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf\": rpc error: code = NotFound desc = could not find container \"f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf\": container with ID starting with f99686009ae6cfcf85f2d4e9ab1594433b96a814084aaf1ceab0841dbc389ccf not found: ID does not exist"
Mar 19 12:11:01.061228 master-0 kubenswrapper[29612]: I0319 12:11:01.060750 29612 scope.go:117] "RemoveContainer" containerID="171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464"
Mar 19 12:11:01.061581 master-0 kubenswrapper[29612]: I0319 12:11:01.061336 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "819c8a76-019b-4124-a67f-fb5838cbec5d" (UID: "819c8a76-019b-4124-a67f-fb5838cbec5d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:11:01.061581 master-0 kubenswrapper[29612]: E0319 12:11:01.061343 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464\": container with ID starting with 171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464 not found: ID does not exist" containerID="171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464"
Mar 19 12:11:01.061581 master-0 kubenswrapper[29612]: I0319 12:11:01.061374 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464"} err="failed to get container status \"171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464\": rpc error: code = NotFound desc = could not find container \"171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464\": container with ID starting with 171481e8d2a8763042fabce01e9c35b8c017d73c4cb5487a9cf2c00e6f983464 not found: ID does not exist"
Mar 19 12:11:01.061581 master-0 kubenswrapper[29612]: I0319 12:11:01.061399 29612 scope.go:117] "RemoveContainer" containerID="6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72"
Mar 19 12:11:01.061581 master-0 kubenswrapper[29612]: I0319 12:11:01.061512 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca" (OuterVolumeSpecName: "client-ca") pod "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" (UID: "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc"). InnerVolumeSpecName "client-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:11:01.061879 master-0 kubenswrapper[29612]: I0319 12:11:01.061795 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config" (OuterVolumeSpecName: "config") pod "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" (UID: "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:11:01.064457 master-0 kubenswrapper[29612]: I0319 12:11:01.064405 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" (UID: "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:11:01.065818 master-0 kubenswrapper[29612]: I0319 12:11:01.065765 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-kube-api-access-nwmkf" (OuterVolumeSpecName: "kube-api-access-nwmkf") pod "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" (UID: "12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc"). InnerVolumeSpecName "kube-api-access-nwmkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:11:01.069858 master-0 kubenswrapper[29612]: I0319 12:11:01.069817 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/819c8a76-019b-4124-a67f-fb5838cbec5d-kube-api-access-64sr2" (OuterVolumeSpecName: "kube-api-access-64sr2") pod "819c8a76-019b-4124-a67f-fb5838cbec5d" (UID: "819c8a76-019b-4124-a67f-fb5838cbec5d"). InnerVolumeSpecName "kube-api-access-64sr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:11:01.095069 master-0 kubenswrapper[29612]: I0319 12:11:01.094819 29612 scope.go:117] "RemoveContainer" containerID="da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b" Mar 19 12:11:01.107584 master-0 kubenswrapper[29612]: I0319 12:11:01.107333 29612 scope.go:117] "RemoveContainer" containerID="6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72" Mar 19 12:11:01.107912 master-0 kubenswrapper[29612]: E0319 12:11:01.107766 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72\": container with ID starting with 6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72 not found: ID does not exist" containerID="6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72" Mar 19 12:11:01.107912 master-0 kubenswrapper[29612]: I0319 12:11:01.107799 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72"} err="failed to get container status \"6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72\": rpc error: code = NotFound desc = could not find container \"6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72\": container with ID starting with 6357c00fd114a4257e8733d51780999251a436ebbdbe2364ec0059055e371e72 not found: ID does not exist" Mar 19 12:11:01.107912 master-0 kubenswrapper[29612]: I0319 12:11:01.107825 29612 scope.go:117] "RemoveContainer" containerID="da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b" Mar 19 12:11:01.108214 master-0 kubenswrapper[29612]: E0319 12:11:01.108047 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b\": 
container with ID starting with da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b not found: ID does not exist" containerID="da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b" Mar 19 12:11:01.108214 master-0 kubenswrapper[29612]: I0319 12:11:01.108071 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b"} err="failed to get container status \"da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b\": rpc error: code = NotFound desc = could not find container \"da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b\": container with ID starting with da4b5490bb88d45760a06d9c63d1646d6075a48027128dc470038b387110196b not found: ID does not exist" Mar 19 12:11:01.159212 master-0 kubenswrapper[29612]: I0319 12:11:01.159058 29612 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/819c8a76-019b-4124-a67f-fb5838cbec5d-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:01.159212 master-0 kubenswrapper[29612]: I0319 12:11:01.159095 29612 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:01.159212 master-0 kubenswrapper[29612]: I0319 12:11:01.159110 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwmkf\" (UniqueName: \"kubernetes.io/projected/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-kube-api-access-nwmkf\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:01.159212 master-0 kubenswrapper[29612]: I0319 12:11:01.159147 29612 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:01.159212 master-0 
kubenswrapper[29612]: I0319 12:11:01.159156 29612 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:01.159212 master-0 kubenswrapper[29612]: I0319 12:11:01.159164 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:01.159212 master-0 kubenswrapper[29612]: I0319 12:11:01.159172 29612 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:01.159212 master-0 kubenswrapper[29612]: I0319 12:11:01.159181 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819c8a76-019b-4124-a67f-fb5838cbec5d-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:01.159212 master-0 kubenswrapper[29612]: I0319 12:11:01.159189 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64sr2\" (UniqueName: \"kubernetes.io/projected/819c8a76-019b-4124-a67f-fb5838cbec5d-kube-api-access-64sr2\") on node \"master-0\" DevicePath \"\"" Mar 19 12:11:01.364997 master-0 kubenswrapper[29612]: I0319 12:11:01.364653 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"] Mar 19 12:11:01.374778 master-0 kubenswrapper[29612]: I0319 12:11:01.374717 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bcb8645f8-5z6xt"] Mar 19 12:11:01.396948 master-0 kubenswrapper[29612]: I0319 12:11:01.396891 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-79589db566-gb7tq"] Mar 19 12:11:01.402121 master-0 kubenswrapper[29612]: I0319 12:11:01.402085 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-79589db566-gb7tq"] Mar 19 12:11:02.181308 master-0 kubenswrapper[29612]: I0319 12:11:02.181234 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c85478bc8-c6zdl"] Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: E0319 12:11:02.181556 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerName="controller-manager" Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: I0319 12:11:02.181572 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerName="controller-manager" Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: E0319 12:11:02.181588 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: I0319 12:11:02.181595 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: E0319 12:11:02.181608 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: I0319 12:11:02.181616 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: E0319 12:11:02.181703 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" 
Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: I0319 12:11:02.181714 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: I0319 12:11:02.181859 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7a82869988463543d3d8dd1f0b5fe3" containerName="startup-monitor" Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: I0319 12:11:02.181870 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" Mar 19 12:11:02.182205 master-0 kubenswrapper[29612]: I0319 12:11:02.181889 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerName="controller-manager" Mar 19 12:11:02.183040 master-0 kubenswrapper[29612]: I0319 12:11:02.182342 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.185039 master-0 kubenswrapper[29612]: I0319 12:11:02.185000 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-m26t4" Mar 19 12:11:02.185153 master-0 kubenswrapper[29612]: I0319 12:11:02.185017 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 12:11:02.185756 master-0 kubenswrapper[29612]: I0319 12:11:02.185718 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 12:11:02.186460 master-0 kubenswrapper[29612]: I0319 12:11:02.186420 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 12:11:02.186595 master-0 kubenswrapper[29612]: I0319 12:11:02.186557 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 12:11:02.187252 master-0 kubenswrapper[29612]: I0319 12:11:02.187070 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 12:11:02.200656 master-0 kubenswrapper[29612]: I0319 12:11:02.200576 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 12:11:02.201710 master-0 kubenswrapper[29612]: I0319 12:11:02.201659 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx"] Mar 19 12:11:02.202180 master-0 kubenswrapper[29612]: E0319 12:11:02.202144 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerName="controller-manager" Mar 19 12:11:02.202180 master-0 kubenswrapper[29612]: I0319 12:11:02.202171 
29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerName="controller-manager" Mar 19 12:11:02.202494 master-0 kubenswrapper[29612]: I0319 12:11:02.202466 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" containerName="route-controller-manager" Mar 19 12:11:02.202570 master-0 kubenswrapper[29612]: I0319 12:11:02.202506 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="819c8a76-019b-4124-a67f-fb5838cbec5d" containerName="controller-manager" Mar 19 12:11:02.204761 master-0 kubenswrapper[29612]: I0319 12:11:02.203828 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.206590 master-0 kubenswrapper[29612]: I0319 12:11:02.206259 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 12:11:02.206742 master-0 kubenswrapper[29612]: I0319 12:11:02.206716 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-62ztt" Mar 19 12:11:02.206804 master-0 kubenswrapper[29612]: I0319 12:11:02.206755 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 12:11:02.206804 master-0 kubenswrapper[29612]: I0319 12:11:02.206767 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 12:11:02.209505 master-0 kubenswrapper[29612]: I0319 12:11:02.209438 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 12:11:02.209951 master-0 kubenswrapper[29612]: I0319 12:11:02.209925 29612 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Mar 19 12:11:02.214233 master-0 kubenswrapper[29612]: I0319 12:11:02.211940 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c85478bc8-c6zdl"] Mar 19 12:11:02.222192 master-0 kubenswrapper[29612]: I0319 12:11:02.222133 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx"] Mar 19 12:11:02.274684 master-0 kubenswrapper[29612]: I0319 12:11:02.274240 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72f145c7-6429-4c69-8520-0c33ef2ec0f6-client-ca\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.274684 master-0 kubenswrapper[29612]: I0319 12:11:02.274369 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72f145c7-6429-4c69-8520-0c33ef2ec0f6-proxy-ca-bundles\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.274684 master-0 kubenswrapper[29612]: I0319 12:11:02.274403 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f145c7-6429-4c69-8520-0c33ef2ec0f6-config\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.274684 master-0 kubenswrapper[29612]: I0319 12:11:02.274457 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/cc5b6d7b-797f-4d93-9198-180701b4dda5-config\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.274684 master-0 kubenswrapper[29612]: I0319 12:11:02.274481 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxhxp\" (UniqueName: \"kubernetes.io/projected/72f145c7-6429-4c69-8520-0c33ef2ec0f6-kube-api-access-vxhxp\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.274684 master-0 kubenswrapper[29612]: I0319 12:11:02.274505 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwbl9\" (UniqueName: \"kubernetes.io/projected/cc5b6d7b-797f-4d93-9198-180701b4dda5-kube-api-access-kwbl9\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.274684 master-0 kubenswrapper[29612]: I0319 12:11:02.274549 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f145c7-6429-4c69-8520-0c33ef2ec0f6-serving-cert\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.274684 master-0 kubenswrapper[29612]: I0319 12:11:02.274578 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc5b6d7b-797f-4d93-9198-180701b4dda5-client-ca\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: 
\"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.274684 master-0 kubenswrapper[29612]: I0319 12:11:02.274603 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d7b-797f-4d93-9198-180701b4dda5-serving-cert\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.376513 master-0 kubenswrapper[29612]: I0319 12:11:02.376430 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f145c7-6429-4c69-8520-0c33ef2ec0f6-serving-cert\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.376751 master-0 kubenswrapper[29612]: I0319 12:11:02.376624 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc5b6d7b-797f-4d93-9198-180701b4dda5-client-ca\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.376751 master-0 kubenswrapper[29612]: I0319 12:11:02.376668 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d7b-797f-4d93-9198-180701b4dda5-serving-cert\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.376751 master-0 kubenswrapper[29612]: I0319 12:11:02.376698 29612 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72f145c7-6429-4c69-8520-0c33ef2ec0f6-client-ca\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.376751 master-0 kubenswrapper[29612]: I0319 12:11:02.376738 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72f145c7-6429-4c69-8520-0c33ef2ec0f6-proxy-ca-bundles\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.376886 master-0 kubenswrapper[29612]: I0319 12:11:02.376764 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f145c7-6429-4c69-8520-0c33ef2ec0f6-config\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.376886 master-0 kubenswrapper[29612]: I0319 12:11:02.376817 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc5b6d7b-797f-4d93-9198-180701b4dda5-config\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.376886 master-0 kubenswrapper[29612]: I0319 12:11:02.376844 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxhxp\" (UniqueName: \"kubernetes.io/projected/72f145c7-6429-4c69-8520-0c33ef2ec0f6-kube-api-access-vxhxp\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: 
\"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.376886 master-0 kubenswrapper[29612]: I0319 12:11:02.376868 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwbl9\" (UniqueName: \"kubernetes.io/projected/cc5b6d7b-797f-4d93-9198-180701b4dda5-kube-api-access-kwbl9\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.378200 master-0 kubenswrapper[29612]: I0319 12:11:02.378080 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc5b6d7b-797f-4d93-9198-180701b4dda5-client-ca\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.378506 master-0 kubenswrapper[29612]: I0319 12:11:02.378243 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72f145c7-6429-4c69-8520-0c33ef2ec0f6-proxy-ca-bundles\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.379116 master-0 kubenswrapper[29612]: I0319 12:11:02.378973 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc5b6d7b-797f-4d93-9198-180701b4dda5-config\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.380732 master-0 kubenswrapper[29612]: I0319 12:11:02.380150 29612 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f145c7-6429-4c69-8520-0c33ef2ec0f6-config\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.381535 master-0 kubenswrapper[29612]: I0319 12:11:02.381485 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f145c7-6429-4c69-8520-0c33ef2ec0f6-serving-cert\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.383088 master-0 kubenswrapper[29612]: I0319 12:11:02.383045 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72f145c7-6429-4c69-8520-0c33ef2ec0f6-client-ca\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.388617 master-0 kubenswrapper[29612]: I0319 12:11:02.388547 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc5b6d7b-797f-4d93-9198-180701b4dda5-serving-cert\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.399401 master-0 kubenswrapper[29612]: I0319 12:11:02.399309 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxhxp\" (UniqueName: \"kubernetes.io/projected/72f145c7-6429-4c69-8520-0c33ef2ec0f6-kube-api-access-vxhxp\") pod \"controller-manager-c85478bc8-c6zdl\" (UID: \"72f145c7-6429-4c69-8520-0c33ef2ec0f6\") " 
pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.405813 master-0 kubenswrapper[29612]: I0319 12:11:02.405746 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwbl9\" (UniqueName: \"kubernetes.io/projected/cc5b6d7b-797f-4d93-9198-180701b4dda5-kube-api-access-kwbl9\") pod \"route-controller-manager-dd6cf8ddb-ghlfx\" (UID: \"cc5b6d7b-797f-4d93-9198-180701b4dda5\") " pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.517456 master-0 kubenswrapper[29612]: I0319 12:11:02.517311 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:02.542702 master-0 kubenswrapper[29612]: I0319 12:11:02.542207 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:02.920063 master-0 kubenswrapper[29612]: I0319 12:11:02.919985 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc" path="/var/lib/kubelet/pods/12cbb2fe-3077-4b69-a6e1-ee5663fe7bbc/volumes" Mar 19 12:11:02.920747 master-0 kubenswrapper[29612]: I0319 12:11:02.920715 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="819c8a76-019b-4124-a67f-fb5838cbec5d" path="/var/lib/kubelet/pods/819c8a76-019b-4124-a67f-fb5838cbec5d/volumes" Mar 19 12:11:02.948129 master-0 kubenswrapper[29612]: I0319 12:11:02.948085 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c85478bc8-c6zdl"] Mar 19 12:11:02.958742 master-0 kubenswrapper[29612]: W0319 12:11:02.958681 29612 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72f145c7_6429_4c69_8520_0c33ef2ec0f6.slice/crio-8791cb20d1072227349725585d2c47eebf5a4e661945b8d1f6fee8738714cdcb WatchSource:0}: Error finding container 8791cb20d1072227349725585d2c47eebf5a4e661945b8d1f6fee8738714cdcb: Status 404 returned error can't find the container with id 8791cb20d1072227349725585d2c47eebf5a4e661945b8d1f6fee8738714cdcb Mar 19 12:11:03.041367 master-0 kubenswrapper[29612]: I0319 12:11:03.041302 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" event={"ID":"72f145c7-6429-4c69-8520-0c33ef2ec0f6","Type":"ContainerStarted","Data":"8791cb20d1072227349725585d2c47eebf5a4e661945b8d1f6fee8738714cdcb"} Mar 19 12:11:03.067473 master-0 kubenswrapper[29612]: I0319 12:11:03.067408 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx"] Mar 19 12:11:03.074276 master-0 kubenswrapper[29612]: W0319 12:11:03.074242 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc5b6d7b_797f_4d93_9198_180701b4dda5.slice/crio-621bea2d972cd060a360d9560eab646c277aa4fd3478572e7710c7ea084ecf19 WatchSource:0}: Error finding container 621bea2d972cd060a360d9560eab646c277aa4fd3478572e7710c7ea084ecf19: Status 404 returned error can't find the container with id 621bea2d972cd060a360d9560eab646c277aa4fd3478572e7710c7ea084ecf19 Mar 19 12:11:04.050003 master-0 kubenswrapper[29612]: I0319 12:11:04.049932 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" event={"ID":"72f145c7-6429-4c69-8520-0c33ef2ec0f6","Type":"ContainerStarted","Data":"e1e0b9a77d7fadc2b18716548fdd44c8e16a44e3ae88e75648c0cb5d10c66eb6"} Mar 19 12:11:04.050546 master-0 kubenswrapper[29612]: I0319 12:11:04.050154 29612 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:04.051698 master-0 kubenswrapper[29612]: I0319 12:11:04.051656 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" event={"ID":"cc5b6d7b-797f-4d93-9198-180701b4dda5","Type":"ContainerStarted","Data":"e8587e01ba0a3cf49fe606573d96af140104ba52a74a077170a0bbd64a0ce5f7"} Mar 19 12:11:04.051698 master-0 kubenswrapper[29612]: I0319 12:11:04.051702 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" event={"ID":"cc5b6d7b-797f-4d93-9198-180701b4dda5","Type":"ContainerStarted","Data":"621bea2d972cd060a360d9560eab646c277aa4fd3478572e7710c7ea084ecf19"} Mar 19 12:11:04.051926 master-0 kubenswrapper[29612]: I0319 12:11:04.051894 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:04.054953 master-0 kubenswrapper[29612]: I0319 12:11:04.054911 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" Mar 19 12:11:04.057139 master-0 kubenswrapper[29612]: I0319 12:11:04.057094 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" Mar 19 12:11:04.074353 master-0 kubenswrapper[29612]: I0319 12:11:04.074216 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c85478bc8-c6zdl" podStartSLOduration=4.074182173 podStartE2EDuration="4.074182173s" podCreationTimestamp="2026-03-19 12:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
12:11:04.070320613 +0000 UTC m=+71.384489634" watchObservedRunningTime="2026-03-19 12:11:04.074182173 +0000 UTC m=+71.388351234" Mar 19 12:11:04.096521 master-0 kubenswrapper[29612]: I0319 12:11:04.096157 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dd6cf8ddb-ghlfx" podStartSLOduration=4.096131419 podStartE2EDuration="4.096131419s" podCreationTimestamp="2026-03-19 12:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:11:04.094146312 +0000 UTC m=+71.408315333" watchObservedRunningTime="2026-03-19 12:11:04.096131419 +0000 UTC m=+71.410300440" Mar 19 12:11:15.220985 master-0 kubenswrapper[29612]: I0319 12:11:15.219692 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 19 12:11:15.220985 master-0 kubenswrapper[29612]: I0319 12:11:15.220492 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:15.223253 master-0 kubenswrapper[29612]: I0319 12:11:15.223217 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 12:11:15.223480 master-0 kubenswrapper[29612]: I0319 12:11:15.223421 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-hzxzv" Mar 19 12:11:15.237869 master-0 kubenswrapper[29612]: I0319 12:11:15.237776 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 19 12:11:15.257525 master-0 kubenswrapper[29612]: I0319 12:11:15.257461 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-var-lock\") pod \"installer-4-master-0\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:15.257756 master-0 kubenswrapper[29612]: I0319 12:11:15.257556 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kube-api-access\") pod \"installer-4-master-0\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:15.257756 master-0 kubenswrapper[29612]: I0319 12:11:15.257662 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:15.358588 master-0 kubenswrapper[29612]: I0319 12:11:15.358523 29612 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-var-lock\") pod \"installer-4-master-0\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:15.358848 master-0 kubenswrapper[29612]: I0319 12:11:15.358681 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kube-api-access\") pod \"installer-4-master-0\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:15.358848 master-0 kubenswrapper[29612]: I0319 12:11:15.358730 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-var-lock\") pod \"installer-4-master-0\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:15.358960 master-0 kubenswrapper[29612]: I0319 12:11:15.358866 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:15.359096 master-0 kubenswrapper[29612]: I0319 12:11:15.359037 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:15.387072 master-0 kubenswrapper[29612]: I0319 12:11:15.386992 29612 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kube-api-access\") pod \"installer-4-master-0\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:15.552251 master-0 kubenswrapper[29612]: I0319 12:11:15.552117 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:11:16.027784 master-0 kubenswrapper[29612]: I0319 12:11:16.027701 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 19 12:11:16.030413 master-0 kubenswrapper[29612]: W0319 12:11:16.030295 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod48468fd9_0fe4_4c84_bcea_ffb1fe8be7b5.slice/crio-e9dbd2c0d1762a454fe48c9cbc22a765ca7dc5ff9869dc248c785c5f88731546 WatchSource:0}: Error finding container e9dbd2c0d1762a454fe48c9cbc22a765ca7dc5ff9869dc248c785c5f88731546: Status 404 returned error can't find the container with id e9dbd2c0d1762a454fe48c9cbc22a765ca7dc5ff9869dc248c785c5f88731546 Mar 19 12:11:16.135420 master-0 kubenswrapper[29612]: I0319 12:11:16.135356 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5","Type":"ContainerStarted","Data":"e9dbd2c0d1762a454fe48c9cbc22a765ca7dc5ff9869dc248c785c5f88731546"} Mar 19 12:11:17.150767 master-0 kubenswrapper[29612]: I0319 12:11:17.150714 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5","Type":"ContainerStarted","Data":"8dab0cf1e2c1c1f036a617fac0dc7750b69b2d889fcb2d2acd9a4252534fc69f"} Mar 19 12:11:17.180830 master-0 kubenswrapper[29612]: I0319 12:11:17.178961 29612 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=2.178934335 podStartE2EDuration="2.178934335s" podCreationTimestamp="2026-03-19 12:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:11:17.177109693 +0000 UTC m=+84.491278784" watchObservedRunningTime="2026-03-19 12:11:17.178934335 +0000 UTC m=+84.493103386" Mar 19 12:11:54.080697 master-0 kubenswrapper[29612]: I0319 12:11:54.080619 29612 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 19 12:11:54.081675 master-0 kubenswrapper[29612]: I0319 12:11:54.081627 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.082732 master-0 kubenswrapper[29612]: I0319 12:11:54.082697 29612 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 12:11:54.082998 master-0 kubenswrapper[29612]: I0319 12:11:54.082962 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" containerID="cri-o://bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd" gracePeriod=15 Mar 19 12:11:54.083198 master-0 kubenswrapper[29612]: I0319 12:11:54.083160 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" containerID="cri-o://5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07" gracePeriod=15 Mar 19 12:11:54.083264 master-0 kubenswrapper[29612]: I0319 12:11:54.083241 29612 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467" gracePeriod=15 Mar 19 12:11:54.083312 master-0 kubenswrapper[29612]: I0319 12:11:54.083292 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a" gracePeriod=15 Mar 19 12:11:54.083360 master-0 kubenswrapper[29612]: I0319 12:11:54.083337 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071" gracePeriod=15 Mar 19 12:11:54.084342 master-0 kubenswrapper[29612]: I0319 12:11:54.083997 29612 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 12:11:54.084846 master-0 kubenswrapper[29612]: E0319 12:11:54.084818 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" Mar 19 12:11:54.084846 master-0 kubenswrapper[29612]: I0319 12:11:54.084844 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" Mar 19 12:11:54.084959 master-0 kubenswrapper[29612]: E0319 12:11:54.084864 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" Mar 19 12:11:54.084959 master-0 kubenswrapper[29612]: I0319 12:11:54.084873 29612 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" Mar 19 12:11:54.084959 master-0 kubenswrapper[29612]: E0319 12:11:54.084883 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="setup" Mar 19 12:11:54.084959 master-0 kubenswrapper[29612]: I0319 12:11:54.084891 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="setup" Mar 19 12:11:54.084959 master-0 kubenswrapper[29612]: E0319 12:11:54.084905 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" Mar 19 12:11:54.084959 master-0 kubenswrapper[29612]: I0319 12:11:54.084913 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" Mar 19 12:11:54.085188 master-0 kubenswrapper[29612]: E0319 12:11:54.084936 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 12:11:54.085188 master-0 kubenswrapper[29612]: I0319 12:11:54.085000 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 12:11:54.085188 master-0 kubenswrapper[29612]: E0319 12:11:54.085019 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" Mar 19 12:11:54.085188 master-0 kubenswrapper[29612]: I0319 12:11:54.085028 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" Mar 19 12:11:54.085188 master-0 kubenswrapper[29612]: E0319 12:11:54.085036 29612 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" Mar 19 12:11:54.085188 master-0 kubenswrapper[29612]: I0319 12:11:54.085044 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" Mar 19 12:11:54.085448 master-0 kubenswrapper[29612]: I0319 12:11:54.085199 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" Mar 19 12:11:54.085448 master-0 kubenswrapper[29612]: I0319 12:11:54.085224 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-syncer" Mar 19 12:11:54.085448 master-0 kubenswrapper[29612]: I0319 12:11:54.085248 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-insecure-readyz" Mar 19 12:11:54.085448 master-0 kubenswrapper[29612]: I0319 12:11:54.085258 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 12:11:54.085448 master-0 kubenswrapper[29612]: I0319 12:11:54.085271 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" Mar 19 12:11:54.085757 master-0 kubenswrapper[29612]: I0319 12:11:54.085577 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" Mar 19 12:11:54.122778 master-0 kubenswrapper[29612]: I0319 12:11:54.122717 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" 
(UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.122899 master-0 kubenswrapper[29612]: I0319 12:11:54.122790 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:11:54.122899 master-0 kubenswrapper[29612]: I0319 12:11:54.122827 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:11:54.122899 master-0 kubenswrapper[29612]: I0319 12:11:54.122850 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.122899 master-0 kubenswrapper[29612]: I0319 12:11:54.122897 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.123047 master-0 kubenswrapper[29612]: I0319 12:11:54.122931 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.123047 master-0 kubenswrapper[29612]: I0319 12:11:54.122963 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:11:54.123047 master-0 kubenswrapper[29612]: I0319 12:11:54.123012 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.223653 master-0 kubenswrapper[29612]: I0319 12:11:54.223606 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.223774 master-0 kubenswrapper[29612]: I0319 12:11:54.223669 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.223834 master-0 
kubenswrapper[29612]: I0319 12:11:54.223787 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.223921 master-0 kubenswrapper[29612]: I0319 12:11:54.223880 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:11:54.223966 master-0 kubenswrapper[29612]: I0319 12:11:54.223914 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.224013 master-0 kubenswrapper[29612]: I0319 12:11:54.223987 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:11:54.224049 master-0 kubenswrapper[29612]: I0319 12:11:54.224029 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
Mar 19 12:11:54.224165 master-0 kubenswrapper[29612]: I0319 12:11:54.224144 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.224165 master-0 kubenswrapper[29612]: I0319 12:11:54.224149 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.224230 master-0 kubenswrapper[29612]: I0319 12:11:54.224185 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.224230 master-0 kubenswrapper[29612]: I0319 12:11:54.224197 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:11:54.224230 master-0 kubenswrapper[29612]: I0319 12:11:54.224224 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:11:54.224320 master-0 kubenswrapper[29612]: I0319 12:11:54.224241 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.224320 master-0 kubenswrapper[29612]: I0319 12:11:54.224296 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:11:54.224320 master-0 kubenswrapper[29612]: I0319 12:11:54.224224 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:11:54.224410 master-0 kubenswrapper[29612]: I0319 12:11:54.224318 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:11:54.432116 master-0 kubenswrapper[29612]: I0319 12:11:54.432040 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-check-endpoints/0.log" Mar 19 12:11:54.433310 master-0 kubenswrapper[29612]: I0319 
12:11:54.433269 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 19 12:11:54.434390 master-0 kubenswrapper[29612]: I0319 12:11:54.434334 29612 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07" exitCode=0 Mar 19 12:11:54.434390 master-0 kubenswrapper[29612]: I0319 12:11:54.434374 29612 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467" exitCode=0 Mar 19 12:11:54.434390 master-0 kubenswrapper[29612]: I0319 12:11:54.434390 29612 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a" exitCode=0 Mar 19 12:11:54.434390 master-0 kubenswrapper[29612]: I0319 12:11:54.434406 29612 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071" exitCode=2 Mar 19 12:11:54.434765 master-0 kubenswrapper[29612]: I0319 12:11:54.434440 29612 scope.go:117] "RemoveContainer" containerID="76d14e466053de21cd4b44b01526664874f27eff3ad2faef8933296d68c35040" Mar 19 12:11:54.593215 master-0 kubenswrapper[29612]: I0319 12:11:54.593062 29612 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" start-of-body= Mar 19 12:11:54.593215 master-0 kubenswrapper[29612]: I0319 12:11:54.593171 29612 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: I0319 12:11:54.596480 29612 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]log ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]api-openshift-apiserver-available ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]api-openshift-oauth-apiserver-available ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]informer-sync ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/priority-and-fairness-filter ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: 
[+]poststarthook/storage-object-count-tracker-hook ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/start-apiextensions-informers ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/start-apiextensions-controllers ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/crd-informer-synced ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/start-system-namespaces-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/rbac/bootstrap-roles ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/bootstrap-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/start-kube-aggregator-informers ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: 
[+]poststarthook/apiservice-registration-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/apiservice-discovery-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]autoregister-completion ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/apiservice-openapi-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: [-]shutdown failed: reason withheld Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: readyz check failed Mar 19 12:11:54.596522 master-0 kubenswrapper[29612]: I0319 12:11:54.596521 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 12:11:55.446790 master-0 kubenswrapper[29612]: I0319 12:11:55.446752 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 19 12:11:59.481834 master-0 kubenswrapper[29612]: I0319 12:11:59.481784 29612 generic.go:334] "Generic (PLEG): container finished" podID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" containerID="8dab0cf1e2c1c1f036a617fac0dc7750b69b2d889fcb2d2acd9a4252534fc69f" exitCode=0 Mar 19 12:11:59.482468 master-0 kubenswrapper[29612]: I0319 12:11:59.481843 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" 
event={"ID":"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5","Type":"ContainerDied","Data":"8dab0cf1e2c1c1f036a617fac0dc7750b69b2d889fcb2d2acd9a4252534fc69f"} Mar 19 12:12:00.861742 master-0 kubenswrapper[29612]: I0319 12:12:00.861674 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:12:00.926958 master-0 kubenswrapper[29612]: I0319 12:12:00.926904 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kube-api-access\") pod \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " Mar 19 12:12:00.927477 master-0 kubenswrapper[29612]: I0319 12:12:00.927419 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-var-lock\") pod \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " Mar 19 12:12:00.927540 master-0 kubenswrapper[29612]: I0319 12:12:00.927500 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kubelet-dir\") pod \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\" (UID: \"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5\") " Mar 19 12:12:00.927585 master-0 kubenswrapper[29612]: I0319 12:12:00.927535 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-var-lock" (OuterVolumeSpecName: "var-lock") pod "48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" (UID: "48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:12:00.927769 master-0 kubenswrapper[29612]: I0319 12:12:00.927710 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" (UID: "48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:12:00.928413 master-0 kubenswrapper[29612]: I0319 12:12:00.928169 29612 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:00.928413 master-0 kubenswrapper[29612]: I0319 12:12:00.928220 29612 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:00.930473 master-0 kubenswrapper[29612]: I0319 12:12:00.930428 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" (UID: "48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:12:01.030361 master-0 kubenswrapper[29612]: I0319 12:12:01.030259 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:01.497674 master-0 kubenswrapper[29612]: I0319 12:12:01.497562 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5","Type":"ContainerDied","Data":"e9dbd2c0d1762a454fe48c9cbc22a765ca7dc5ff9869dc248c785c5f88731546"} Mar 19 12:12:01.497674 master-0 kubenswrapper[29612]: I0319 12:12:01.497622 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9dbd2c0d1762a454fe48c9cbc22a765ca7dc5ff9869dc248c785c5f88731546" Mar 19 12:12:01.497974 master-0 kubenswrapper[29612]: I0319 12:12:01.497681 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 19 12:12:03.996048 master-0 kubenswrapper[29612]: E0319 12:12:03.995994 29612 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45ea2ef1cf2bc9d1d994d6538ae0a64.slice/crio-bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd.scope\": RecentStats: unable to find data in memory cache]" Mar 19 12:12:04.145534 master-0 kubenswrapper[29612]: I0319 12:12:04.145479 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 19 12:12:04.147227 master-0 kubenswrapper[29612]: I0319 12:12:04.147095 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:04.159835 master-0 kubenswrapper[29612]: E0319 12:12:04.159757 29612 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:12:04.160473 master-0 kubenswrapper[29612]: I0319 12:12:04.160441 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:12:04.179076 master-0 kubenswrapper[29612]: I0319 12:12:04.179014 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 19 12:12:04.179537 master-0 kubenswrapper[29612]: I0319 12:12:04.179512 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 19 12:12:04.179762 master-0 kubenswrapper[29612]: I0319 12:12:04.179155 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:12:04.179831 master-0 kubenswrapper[29612]: I0319 12:12:04.179561 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:12:04.180085 master-0 kubenswrapper[29612]: I0319 12:12:04.180068 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") pod \"b45ea2ef1cf2bc9d1d994d6538ae0a64\" (UID: \"b45ea2ef1cf2bc9d1d994d6538ae0a64\") " Mar 19 12:12:04.180249 master-0 kubenswrapper[29612]: I0319 12:12:04.180186 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "b45ea2ef1cf2bc9d1d994d6538ae0a64" (UID: "b45ea2ef1cf2bc9d1d994d6538ae0a64"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:12:04.180520 master-0 kubenswrapper[29612]: I0319 12:12:04.180504 29612 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:04.180592 master-0 kubenswrapper[29612]: I0319 12:12:04.180580 29612 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:04.180670 master-0 kubenswrapper[29612]: I0319 12:12:04.180660 29612 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45ea2ef1cf2bc9d1d994d6538ae0a64-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:04.195121 master-0 kubenswrapper[29612]: W0319 12:12:04.195029 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fb4ea7f83036d9c6adf3454fc7e9db.slice/crio-e2faf2d178041429171fbe7c9189607d6296f1a2fcaa9203e8f0398a1226038c WatchSource:0}: Error finding container e2faf2d178041429171fbe7c9189607d6296f1a2fcaa9203e8f0398a1226038c: Status 404 returned error can't find the container with id e2faf2d178041429171fbe7c9189607d6296f1a2fcaa9203e8f0398a1226038c Mar 19 12:12:04.489779 master-0 kubenswrapper[29612]: I0319 12:12:04.489624 29612 status_manager.go:851] "Failed to get status for pod" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:04.490469 master-0 kubenswrapper[29612]: I0319 12:12:04.490399 29612 status_manager.go:851] "Failed to get status for pod" 
podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:04.491083 master-0 kubenswrapper[29612]: I0319 12:12:04.491040 29612 status_manager.go:851] "Failed to get status for pod" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:04.496067 master-0 kubenswrapper[29612]: I0319 12:12:04.496001 29612 status_manager.go:851] "Failed to get status for pod" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:04.496711 master-0 kubenswrapper[29612]: I0319 12:12:04.496668 29612 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:04.528348 master-0 kubenswrapper[29612]: I0319 12:12:04.525388 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"16fb4ea7f83036d9c6adf3454fc7e9db","Type":"ContainerStarted","Data":"f5802453660f7622c84d0e0575545ad505a33d3b447324f5697422bc60903811"} Mar 19 12:12:04.528348 master-0 kubenswrapper[29612]: I0319 12:12:04.525441 29612 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"16fb4ea7f83036d9c6adf3454fc7e9db","Type":"ContainerStarted","Data":"e2faf2d178041429171fbe7c9189607d6296f1a2fcaa9203e8f0398a1226038c"} Mar 19 12:12:04.528348 master-0 kubenswrapper[29612]: E0319 12:12:04.527225 29612 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:12:04.528348 master-0 kubenswrapper[29612]: I0319 12:12:04.527337 29612 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:04.528700 master-0 kubenswrapper[29612]: I0319 12:12:04.528667 29612 status_manager.go:851] "Failed to get status for pod" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:04.530082 master-0 kubenswrapper[29612]: I0319 12:12:04.530048 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b45ea2ef1cf2bc9d1d994d6538ae0a64/kube-apiserver-cert-syncer/0.log" Mar 19 12:12:04.531203 master-0 kubenswrapper[29612]: I0319 12:12:04.531170 29612 generic.go:334] "Generic (PLEG): container finished" podID="b45ea2ef1cf2bc9d1d994d6538ae0a64" containerID="bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd" exitCode=0 Mar 19 12:12:04.531274 master-0 kubenswrapper[29612]: 
I0319 12:12:04.531220 29612 scope.go:117] "RemoveContainer" containerID="5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07" Mar 19 12:12:04.531274 master-0 kubenswrapper[29612]: I0319 12:12:04.531256 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:04.549509 master-0 kubenswrapper[29612]: I0319 12:12:04.548056 29612 scope.go:117] "RemoveContainer" containerID="ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467" Mar 19 12:12:04.549589 master-0 kubenswrapper[29612]: I0319 12:12:04.549551 29612 status_manager.go:851] "Failed to get status for pod" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:04.550371 master-0 kubenswrapper[29612]: I0319 12:12:04.550303 29612 status_manager.go:851] "Failed to get status for pod" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:04.561359 master-0 kubenswrapper[29612]: I0319 12:12:04.561309 29612 scope.go:117] "RemoveContainer" containerID="5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a" Mar 19 12:12:04.574944 master-0 kubenswrapper[29612]: I0319 12:12:04.574888 29612 scope.go:117] "RemoveContainer" containerID="f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071" Mar 19 12:12:04.597822 master-0 kubenswrapper[29612]: I0319 12:12:04.597724 29612 scope.go:117] "RemoveContainer" containerID="bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd" Mar 19 12:12:04.607318 master-0 
kubenswrapper[29612]: E0319 12:12:04.606523 29612 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event=< Mar 19 12:12:04.607318 master-0 kubenswrapper[29612]: &Event{ObjectMeta:{kube-apiserver-master-0.189e3cf24c76a29a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:b45ea2ef1cf2bc9d1d994d6538ae0a64,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.32.10:17697/healthz": dial tcp 192.168.32.10:17697: connect: connection refused Mar 19 12:12:04.607318 master-0 kubenswrapper[29612]: body: Mar 19 12:12:04.607318 master-0 kubenswrapper[29612]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:11:54.593149594 +0000 UTC m=+121.907318625,LastTimestamp:2026-03-19 12:11:54.593149594 +0000 UTC m=+121.907318625,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 19 12:12:04.607318 master-0 kubenswrapper[29612]: > Mar 19 12:12:04.612456 master-0 kubenswrapper[29612]: I0319 12:12:04.612404 29612 scope.go:117] "RemoveContainer" containerID="d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363" Mar 19 12:12:04.634809 master-0 kubenswrapper[29612]: I0319 12:12:04.634175 29612 scope.go:117] "RemoveContainer" containerID="5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07" Mar 19 12:12:04.634809 master-0 kubenswrapper[29612]: E0319 12:12:04.634756 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07\": container with ID starting with 5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07 not found: ID does not exist" containerID="5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07" Mar 19 12:12:04.634809 master-0 kubenswrapper[29612]: I0319 12:12:04.634793 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07"} err="failed to get container status \"5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07\": rpc error: code = NotFound desc = could not find container \"5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07\": container with ID starting with 5b8d08217eacb016c362e812462e3090c6480e11a213448cd3ed2df9c0089e07 not found: ID does not exist" Mar 19 12:12:04.635035 master-0 kubenswrapper[29612]: I0319 12:12:04.634819 29612 scope.go:117] "RemoveContainer" containerID="ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467" Mar 19 12:12:04.635311 master-0 kubenswrapper[29612]: E0319 12:12:04.635145 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467\": container with ID starting with ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467 not found: ID does not exist" containerID="ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467" Mar 19 12:12:04.635699 master-0 kubenswrapper[29612]: I0319 12:12:04.635383 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467"} err="failed to get container status \"ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467\": rpc error: code = NotFound desc = could not find container 
\"ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467\": container with ID starting with ebf05a5fbfe9d38e93c3d9711bd3fce60a2a0bf9693c955763a56f16125f2467 not found: ID does not exist" Mar 19 12:12:04.635773 master-0 kubenswrapper[29612]: I0319 12:12:04.635697 29612 scope.go:117] "RemoveContainer" containerID="5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a" Mar 19 12:12:04.636180 master-0 kubenswrapper[29612]: E0319 12:12:04.636125 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a\": container with ID starting with 5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a not found: ID does not exist" containerID="5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a" Mar 19 12:12:04.636243 master-0 kubenswrapper[29612]: I0319 12:12:04.636173 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a"} err="failed to get container status \"5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a\": rpc error: code = NotFound desc = could not find container \"5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a\": container with ID starting with 5e517e390edfd6317d97d411efcf7b26699fb30d318b32a3786ff39439aeb84a not found: ID does not exist" Mar 19 12:12:04.636243 master-0 kubenswrapper[29612]: I0319 12:12:04.636196 29612 scope.go:117] "RemoveContainer" containerID="f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071" Mar 19 12:12:04.636522 master-0 kubenswrapper[29612]: E0319 12:12:04.636474 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071\": container with ID starting with 
f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071 not found: ID does not exist" containerID="f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071" Mar 19 12:12:04.636522 master-0 kubenswrapper[29612]: I0319 12:12:04.636510 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071"} err="failed to get container status \"f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071\": rpc error: code = NotFound desc = could not find container \"f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071\": container with ID starting with f13ca089004e0423145ff8ff990ff191782794b3d6c211723d5df762af1c4071 not found: ID does not exist" Mar 19 12:12:04.636723 master-0 kubenswrapper[29612]: I0319 12:12:04.636529 29612 scope.go:117] "RemoveContainer" containerID="bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd" Mar 19 12:12:04.636893 master-0 kubenswrapper[29612]: E0319 12:12:04.636847 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd\": container with ID starting with bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd not found: ID does not exist" containerID="bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd" Mar 19 12:12:04.636893 master-0 kubenswrapper[29612]: I0319 12:12:04.636880 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd"} err="failed to get container status \"bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd\": rpc error: code = NotFound desc = could not find container \"bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd\": container with ID starting with 
bfb99646e04e2236126a47ee69dbe9fff0489e9eca7bf264b1a98ebb89eb2cfd not found: ID does not exist" Mar 19 12:12:04.637011 master-0 kubenswrapper[29612]: I0319 12:12:04.636898 29612 scope.go:117] "RemoveContainer" containerID="d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363" Mar 19 12:12:04.637327 master-0 kubenswrapper[29612]: E0319 12:12:04.637283 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363\": container with ID starting with d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363 not found: ID does not exist" containerID="d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363" Mar 19 12:12:04.637327 master-0 kubenswrapper[29612]: I0319 12:12:04.637312 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363"} err="failed to get container status \"d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363\": rpc error: code = NotFound desc = could not find container \"d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363\": container with ID starting with d78ec536dae79fd1b357aebe85c60b9926928ed0074fdeeba69dffd540334363 not found: ID does not exist" Mar 19 12:12:04.915619 master-0 kubenswrapper[29612]: I0319 12:12:04.915568 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45ea2ef1cf2bc9d1d994d6538ae0a64" path="/var/lib/kubelet/pods/b45ea2ef1cf2bc9d1d994d6538ae0a64/volumes" Mar 19 12:12:05.811309 master-0 kubenswrapper[29612]: E0319 12:12:05.811143 29612 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Mar 19 12:12:05.812276 master-0 kubenswrapper[29612]: E0319 12:12:05.812204 
29612 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:05.813077 master-0 kubenswrapper[29612]: E0319 12:12:05.813020 29612 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:05.813788 master-0 kubenswrapper[29612]: E0319 12:12:05.813736 29612 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:05.814649 master-0 kubenswrapper[29612]: E0319 12:12:05.814577 29612 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:05.814725 master-0 kubenswrapper[29612]: I0319 12:12:05.814628 29612 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 12:12:05.815406 master-0 kubenswrapper[29612]: E0319 12:12:05.815352 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 19 12:12:06.017793 master-0 kubenswrapper[29612]: E0319 12:12:06.017705 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 19 12:12:06.418588 master-0 kubenswrapper[29612]: E0319 12:12:06.418499 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 19 12:12:07.219969 master-0 kubenswrapper[29612]: E0319 12:12:07.219860 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 19 12:12:07.558028 master-0 kubenswrapper[29612]: I0319 12:12:07.557900 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log" Mar 19 12:12:07.559147 master-0 kubenswrapper[29612]: I0319 12:12:07.559118 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/1.log" Mar 19 12:12:07.560337 master-0 kubenswrapper[29612]: I0319 12:12:07.560308 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/0.log" Mar 19 12:12:07.561001 master-0 kubenswrapper[29612]: I0319 12:12:07.560954 29612 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/0.log" Mar 19 12:12:07.561176 master-0 kubenswrapper[29612]: I0319 12:12:07.560999 29612 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="dfc697cd77881315e28c4429b102a58e90448d7960e79b2437cb6fdd1f387a62" exitCode=1 Mar 19 12:12:07.561176 master-0 kubenswrapper[29612]: I0319 12:12:07.561037 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"dfc697cd77881315e28c4429b102a58e90448d7960e79b2437cb6fdd1f387a62"} Mar 19 12:12:07.561176 master-0 kubenswrapper[29612]: I0319 12:12:07.561079 29612 scope.go:117] "RemoveContainer" containerID="4d4b4168db53bd2fbee264c121d01b11e0ff09bed322f776f7a2db8af96a040f" Mar 19 12:12:07.561770 master-0 kubenswrapper[29612]: I0319 12:12:07.561717 29612 scope.go:117] "RemoveContainer" containerID="dfc697cd77881315e28c4429b102a58e90448d7960e79b2437cb6fdd1f387a62" Mar 19 12:12:07.563155 master-0 kubenswrapper[29612]: I0319 12:12:07.562667 29612 status_manager.go:851] "Failed to get status for pod" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:07.563513 master-0 kubenswrapper[29612]: I0319 12:12:07.563459 29612 status_manager.go:851] "Failed to get status for pod" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: 
connection refused" Mar 19 12:12:07.906115 master-0 kubenswrapper[29612]: I0319 12:12:07.906067 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:07.907197 master-0 kubenswrapper[29612]: I0319 12:12:07.907136 29612 status_manager.go:851] "Failed to get status for pod" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:07.907660 master-0 kubenswrapper[29612]: I0319 12:12:07.907609 29612 status_manager.go:851] "Failed to get status for pod" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:07.925510 master-0 kubenswrapper[29612]: I0319 12:12:07.925458 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="eac093e0-76f6-4b95-bf0b-d0b09c39b0a3" Mar 19 12:12:07.925510 master-0 kubenswrapper[29612]: I0319 12:12:07.925498 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="eac093e0-76f6-4b95-bf0b-d0b09c39b0a3" Mar 19 12:12:07.926094 master-0 kubenswrapper[29612]: E0319 12:12:07.926048 29612 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:07.926545 master-0 kubenswrapper[29612]: I0319 12:12:07.926518 
29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:07.957716 master-0 kubenswrapper[29612]: W0319 12:12:07.957652 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d5ce05b3d592e63f1f92202d52b9635.slice/crio-7e70f7d81b377551558dffcda9f976f413b5d18fa209f494bf92548ba9b18d3a WatchSource:0}: Error finding container 7e70f7d81b377551558dffcda9f976f413b5d18fa209f494bf92548ba9b18d3a: Status 404 returned error can't find the container with id 7e70f7d81b377551558dffcda9f976f413b5d18fa209f494bf92548ba9b18d3a Mar 19 12:12:08.571020 master-0 kubenswrapper[29612]: I0319 12:12:08.570868 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log" Mar 19 12:12:08.572227 master-0 kubenswrapper[29612]: I0319 12:12:08.572173 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/1.log" Mar 19 12:12:08.573247 master-0 kubenswrapper[29612]: I0319 12:12:08.573205 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/0.log" Mar 19 12:12:08.573357 master-0 kubenswrapper[29612]: I0319 12:12:08.573321 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"617ff00631713680d2e186b6f3e8d34195bcb762d9f9e3d35f2fc08ecebe5a13"} Mar 19 12:12:08.575087 master-0 kubenswrapper[29612]: I0319 12:12:08.574991 29612 status_manager.go:851] "Failed to get status for pod" 
podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:08.575930 master-0 kubenswrapper[29612]: I0319 12:12:08.575880 29612 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa" exitCode=0 Mar 19 12:12:08.575930 master-0 kubenswrapper[29612]: I0319 12:12:08.575922 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerDied","Data":"3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa"} Mar 19 12:12:08.576050 master-0 kubenswrapper[29612]: I0319 12:12:08.575944 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"7e70f7d81b377551558dffcda9f976f413b5d18fa209f494bf92548ba9b18d3a"} Mar 19 12:12:08.576172 master-0 kubenswrapper[29612]: I0319 12:12:08.576116 29612 status_manager.go:851] "Failed to get status for pod" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:08.576233 master-0 kubenswrapper[29612]: I0319 12:12:08.576209 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="eac093e0-76f6-4b95-bf0b-d0b09c39b0a3" Mar 19 12:12:08.576233 master-0 kubenswrapper[29612]: I0319 12:12:08.576227 29612 mirror_client.go:130] "Deleting 
a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="eac093e0-76f6-4b95-bf0b-d0b09c39b0a3" Mar 19 12:12:08.577042 master-0 kubenswrapper[29612]: I0319 12:12:08.576998 29612 status_manager.go:851] "Failed to get status for pod" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:08.577161 master-0 kubenswrapper[29612]: E0319 12:12:08.577070 29612 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:08.577726 master-0 kubenswrapper[29612]: I0319 12:12:08.577627 29612 status_manager.go:851] "Failed to get status for pod" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:12:08.822200 master-0 kubenswrapper[29612]: E0319 12:12:08.822033 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 19 12:12:09.605886 master-0 kubenswrapper[29612]: I0319 12:12:09.605780 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8"} Mar 19 12:12:09.605886 master-0 kubenswrapper[29612]: I0319 12:12:09.605855 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e"} Mar 19 12:12:09.605886 master-0 kubenswrapper[29612]: I0319 12:12:09.605867 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834"} Mar 19 12:12:10.617154 master-0 kubenswrapper[29612]: I0319 12:12:10.617103 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded"} Mar 19 12:12:10.617154 master-0 kubenswrapper[29612]: I0319 12:12:10.617149 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7d5ce05b3d592e63f1f92202d52b9635","Type":"ContainerStarted","Data":"a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3"} Mar 19 12:12:10.617862 master-0 kubenswrapper[29612]: I0319 12:12:10.617371 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="eac093e0-76f6-4b95-bf0b-d0b09c39b0a3" Mar 19 12:12:10.617862 master-0 kubenswrapper[29612]: I0319 12:12:10.617386 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="eac093e0-76f6-4b95-bf0b-d0b09c39b0a3" Mar 19 12:12:10.617862 master-0 kubenswrapper[29612]: I0319 
12:12:10.617583 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:12.927698 master-0 kubenswrapper[29612]: I0319 12:12:12.927615 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:12.927698 master-0 kubenswrapper[29612]: I0319 12:12:12.927703 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:12.934829 master-0 kubenswrapper[29612]: I0319 12:12:12.934770 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:14.577217 master-0 kubenswrapper[29612]: I0319 12:12:14.577111 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:12:14.578271 master-0 kubenswrapper[29612]: I0319 12:12:14.577223 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:12:14.588297 master-0 kubenswrapper[29612]: I0319 12:12:14.588219 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:12:15.637470 master-0 kubenswrapper[29612]: I0319 12:12:15.637426 29612 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:12:15.711568 master-0 kubenswrapper[29612]: I0319 12:12:15.711488 29612 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7d5ce05b3d592e63f1f92202d52b9635" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc" Mar 19 12:12:16.658405 master-0 kubenswrapper[29612]: I0319 
12:12:16.658315 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="eac093e0-76f6-4b95-bf0b-d0b09c39b0a3" Mar 19 12:12:16.658405 master-0 kubenswrapper[29612]: I0319 12:12:16.658363 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="eac093e0-76f6-4b95-bf0b-d0b09c39b0a3" Mar 19 12:12:16.661622 master-0 kubenswrapper[29612]: I0319 12:12:16.661570 29612 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7d5ce05b3d592e63f1f92202d52b9635" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc" Mar 19 12:12:18.825175 master-0 kubenswrapper[29612]: I0319 12:12:18.825109 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 19 12:12:19.141725 master-0 kubenswrapper[29612]: I0319 12:12:19.141498 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 12:12:19.178213 master-0 kubenswrapper[29612]: I0319 12:12:19.178166 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 19 12:12:19.255516 master-0 kubenswrapper[29612]: I0319 12:12:19.255470 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-xsv5c" Mar 19 12:12:19.331989 master-0 kubenswrapper[29612]: I0319 12:12:19.331939 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 12:12:19.331989 master-0 kubenswrapper[29612]: I0319 12:12:19.331952 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-zgw2m" Mar 19 12:12:19.407124 master-0 kubenswrapper[29612]: I0319 
12:12:19.407053 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 12:12:19.479297 master-0 kubenswrapper[29612]: I0319 12:12:19.479240 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 19 12:12:19.505526 master-0 kubenswrapper[29612]: I0319 12:12:19.505459 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 12:12:19.563714 master-0 kubenswrapper[29612]: I0319 12:12:19.563584 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-wn2jn" Mar 19 12:12:19.617954 master-0 kubenswrapper[29612]: I0319 12:12:19.617894 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-2pvrw" Mar 19 12:12:19.631959 master-0 kubenswrapper[29612]: I0319 12:12:19.631884 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 19 12:12:19.635032 master-0 kubenswrapper[29612]: I0319 12:12:19.634935 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 12:12:19.643233 master-0 kubenswrapper[29612]: I0319 12:12:19.643176 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 12:12:19.705179 master-0 kubenswrapper[29612]: I0319 12:12:19.705031 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 19 12:12:19.740723 master-0 kubenswrapper[29612]: I0319 12:12:19.740625 29612 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 12:12:19.856462 master-0 kubenswrapper[29612]: I0319 12:12:19.856405 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 12:12:19.901017 master-0 kubenswrapper[29612]: I0319 12:12:19.900955 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 12:12:19.948313 master-0 kubenswrapper[29612]: I0319 12:12:19.948239 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 12:12:19.964571 master-0 kubenswrapper[29612]: I0319 12:12:19.964461 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 12:12:19.964880 master-0 kubenswrapper[29612]: I0319 12:12:19.964760 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 19 12:12:19.972425 master-0 kubenswrapper[29612]: I0319 12:12:19.972372 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 19 12:12:20.001821 master-0 kubenswrapper[29612]: I0319 12:12:20.001783 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 12:12:20.029337 master-0 kubenswrapper[29612]: I0319 12:12:20.029280 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 12:12:20.055175 master-0 kubenswrapper[29612]: I0319 12:12:20.055128 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 12:12:20.096064 master-0 kubenswrapper[29612]: I0319 
12:12:20.096007 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 12:12:20.150371 master-0 kubenswrapper[29612]: I0319 12:12:20.150275 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 12:12:20.169959 master-0 kubenswrapper[29612]: I0319 12:12:20.169920 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 19 12:12:20.283420 master-0 kubenswrapper[29612]: I0319 12:12:20.283267 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 12:12:20.381340 master-0 kubenswrapper[29612]: I0319 12:12:20.381242 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 12:12:20.411784 master-0 kubenswrapper[29612]: I0319 12:12:20.409694 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 12:12:20.440503 master-0 kubenswrapper[29612]: I0319 12:12:20.440437 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-f27d9" Mar 19 12:12:20.467476 master-0 kubenswrapper[29612]: I0319 12:12:20.467426 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-nghrc" Mar 19 12:12:20.486877 master-0 kubenswrapper[29612]: I0319 12:12:20.486809 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 12:12:20.518308 master-0 kubenswrapper[29612]: I0319 12:12:20.518257 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 19 12:12:20.586473 master-0 
kubenswrapper[29612]: I0319 12:12:20.585379 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 12:12:20.591710 master-0 kubenswrapper[29612]: I0319 12:12:20.591659 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 19 12:12:20.629472 master-0 kubenswrapper[29612]: I0319 12:12:20.629298 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 12:12:20.650470 master-0 kubenswrapper[29612]: I0319 12:12:20.650417 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-f7v29" Mar 19 12:12:20.701627 master-0 kubenswrapper[29612]: I0319 12:12:20.701577 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 12:12:20.804628 master-0 kubenswrapper[29612]: I0319 12:12:20.804578 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 12:12:20.900657 master-0 kubenswrapper[29612]: I0319 12:12:20.900577 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 12:12:21.011651 master-0 kubenswrapper[29612]: I0319 12:12:21.011578 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-gjszs" Mar 19 12:12:21.053416 master-0 kubenswrapper[29612]: I0319 12:12:21.053350 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 19 12:12:21.095467 master-0 kubenswrapper[29612]: I0319 12:12:21.095414 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 
19 12:12:21.101281 master-0 kubenswrapper[29612]: I0319 12:12:21.101243 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-k64d2" Mar 19 12:12:21.113369 master-0 kubenswrapper[29612]: I0319 12:12:21.113301 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 19 12:12:21.141986 master-0 kubenswrapper[29612]: I0319 12:12:21.141917 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 12:12:21.146295 master-0 kubenswrapper[29612]: I0319 12:12:21.146268 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-62ztt" Mar 19 12:12:21.158403 master-0 kubenswrapper[29612]: I0319 12:12:21.158289 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 12:12:21.220853 master-0 kubenswrapper[29612]: I0319 12:12:21.220794 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-56gmj" Mar 19 12:12:21.272501 master-0 kubenswrapper[29612]: I0319 12:12:21.272448 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 12:12:21.274470 master-0 kubenswrapper[29612]: I0319 12:12:21.274436 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 19 12:12:21.291917 master-0 kubenswrapper[29612]: I0319 12:12:21.291866 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 12:12:21.309578 master-0 kubenswrapper[29612]: I0319 12:12:21.309522 29612 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 19 12:12:21.360440 master-0 kubenswrapper[29612]: I0319 12:12:21.360378 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 12:12:21.362184 master-0 kubenswrapper[29612]: I0319 12:12:21.362150 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 19 12:12:21.419722 master-0 kubenswrapper[29612]: I0319 12:12:21.419602 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 12:12:21.482280 master-0 kubenswrapper[29612]: I0319 12:12:21.482241 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 12:12:21.486422 master-0 kubenswrapper[29612]: I0319 12:12:21.486384 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 19 12:12:21.501654 master-0 kubenswrapper[29612]: I0319 12:12:21.501583 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 12:12:21.533266 master-0 kubenswrapper[29612]: I0319 12:12:21.533204 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 12:12:21.544258 master-0 kubenswrapper[29612]: I0319 12:12:21.544227 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 12:12:21.600864 master-0 kubenswrapper[29612]: I0319 12:12:21.600821 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 12:12:21.668011 master-0 kubenswrapper[29612]: I0319 12:12:21.667953 29612 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 12:12:21.755356 master-0 kubenswrapper[29612]: I0319 12:12:21.755226 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 12:12:21.810513 master-0 kubenswrapper[29612]: I0319 12:12:21.808311 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 12:12:21.810513 master-0 kubenswrapper[29612]: I0319 12:12:21.809189 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 12:12:21.850951 master-0 kubenswrapper[29612]: I0319 12:12:21.850882 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 12:12:21.866719 master-0 kubenswrapper[29612]: I0319 12:12:21.866668 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 12:12:21.994174 master-0 kubenswrapper[29612]: I0319 12:12:21.994123 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 12:12:22.085789 master-0 kubenswrapper[29612]: I0319 12:12:22.085611 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 12:12:22.089105 master-0 kubenswrapper[29612]: I0319 12:12:22.089060 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 12:12:22.157790 master-0 kubenswrapper[29612]: I0319 12:12:22.157741 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 12:12:22.160735 master-0 kubenswrapper[29612]: I0319 12:12:22.160694 29612 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 12:12:22.173421 master-0 kubenswrapper[29612]: I0319 12:12:22.173362 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-xm92n"
Mar 19 12:12:22.175871 master-0 kubenswrapper[29612]: I0319 12:12:22.175815 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 12:12:22.195899 master-0 kubenswrapper[29612]: I0319 12:12:22.195852 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 12:12:22.203020 master-0 kubenswrapper[29612]: I0319 12:12:22.202979 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 12:12:22.239878 master-0 kubenswrapper[29612]: I0319 12:12:22.239810 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-nkt4p"
Mar 19 12:12:22.295653 master-0 kubenswrapper[29612]: I0319 12:12:22.295562 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 12:12:22.329374 master-0 kubenswrapper[29612]: I0319 12:12:22.329289 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 19 12:12:22.394106 master-0 kubenswrapper[29612]: I0319 12:12:22.393914 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 12:12:22.483166 master-0 kubenswrapper[29612]: I0319 12:12:22.483109 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 12:12:22.564061 master-0 kubenswrapper[29612]: I0319 12:12:22.564009 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 12:12:22.624515 master-0 kubenswrapper[29612]: I0319 12:12:22.624453 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 19 12:12:22.635920 master-0 kubenswrapper[29612]: I0319 12:12:22.635884 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 12:12:22.647979 master-0 kubenswrapper[29612]: I0319 12:12:22.647929 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 12:12:22.708610 master-0 kubenswrapper[29612]: I0319 12:12:22.708407 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 12:12:22.724162 master-0 kubenswrapper[29612]: I0319 12:12:22.724116 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 12:12:22.739229 master-0 kubenswrapper[29612]: I0319 12:12:22.739189 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 12:12:22.763157 master-0 kubenswrapper[29612]: I0319 12:12:22.763107 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9ww6r"
Mar 19 12:12:22.776733 master-0 kubenswrapper[29612]: I0319 12:12:22.772170 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 19 12:12:22.778156 master-0 kubenswrapper[29612]: I0319 12:12:22.778084 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 12:12:22.780312 master-0 kubenswrapper[29612]: I0319 12:12:22.780256 29612 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 12:12:22.787742 master-0 kubenswrapper[29612]: I0319 12:12:22.786593 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 12:12:22.787742 master-0 kubenswrapper[29612]: I0319 12:12:22.786691 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 12:12:22.787976 master-0 kubenswrapper[29612]: I0319 12:12:22.787948 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 12:12:22.790576 master-0 kubenswrapper[29612]: I0319 12:12:22.790533 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:12:22.792856 master-0 kubenswrapper[29612]: I0319 12:12:22.792825 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:12:22.848562 master-0 kubenswrapper[29612]: I0319 12:12:22.848500 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 12:12:22.867336 master-0 kubenswrapper[29612]: I0319 12:12:22.867293 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 12:12:22.879048 master-0 kubenswrapper[29612]: I0319 12:12:22.879020 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 12:12:22.905148 master-0 kubenswrapper[29612]: I0319 12:12:22.905001 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=7.904980913 podStartE2EDuration="7.904980913s" podCreationTimestamp="2026-03-19 12:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:12:22.899821926 +0000 UTC m=+150.213990967" watchObservedRunningTime="2026-03-19 12:12:22.904980913 +0000 UTC m=+150.219149954"
Mar 19 12:12:22.947056 master-0 kubenswrapper[29612]: I0319 12:12:22.947004 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 19 12:12:23.101756 master-0 kubenswrapper[29612]: I0319 12:12:23.101687 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 12:12:23.123086 master-0 kubenswrapper[29612]: I0319 12:12:23.123014 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 19 12:12:23.135812 master-0 kubenswrapper[29612]: I0319 12:12:23.135762 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 19 12:12:23.157555 master-0 kubenswrapper[29612]: I0319 12:12:23.157355 29612 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 12:12:23.190091 master-0 kubenswrapper[29612]: I0319 12:12:23.190024 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 12:12:23.196347 master-0 kubenswrapper[29612]: I0319 12:12:23.196317 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 12:12:23.211064 master-0 kubenswrapper[29612]: I0319 12:12:23.211016 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 12:12:23.212983 master-0 kubenswrapper[29612]: I0319 12:12:23.212955 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 12:12:23.223047 master-0 kubenswrapper[29612]: I0319 12:12:23.222990 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 12:12:23.233698 master-0 kubenswrapper[29612]: I0319 12:12:23.233604 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-m26t4"
Mar 19 12:12:23.272237 master-0 kubenswrapper[29612]: I0319 12:12:23.272136 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 12:12:23.296512 master-0 kubenswrapper[29612]: I0319 12:12:23.296439 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 12:12:23.324708 master-0 kubenswrapper[29612]: I0319 12:12:23.324610 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 19 12:12:23.325343 master-0 kubenswrapper[29612]: I0319 12:12:23.325295 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 19 12:12:23.373521 master-0 kubenswrapper[29612]: I0319 12:12:23.373320 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 12:12:23.378214 master-0 kubenswrapper[29612]: I0319 12:12:23.378160 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 12:12:23.469767 master-0 kubenswrapper[29612]: I0319 12:12:23.469600 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 12:12:23.509148 master-0 kubenswrapper[29612]: I0319 12:12:23.509110 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 12:12:23.540710 master-0 kubenswrapper[29612]: I0319 12:12:23.540668 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 12:12:23.611143 master-0 kubenswrapper[29612]: I0319 12:12:23.601752 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 12:12:23.612412 master-0 kubenswrapper[29612]: I0319 12:12:23.612389 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 12:12:23.633831 master-0 kubenswrapper[29612]: I0319 12:12:23.633772 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 12:12:23.637079 master-0 kubenswrapper[29612]: I0319 12:12:23.637052 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 12:12:23.642595 master-0 kubenswrapper[29612]: I0319 12:12:23.641991 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 12:12:23.670021 master-0 kubenswrapper[29612]: I0319 12:12:23.669983 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 12:12:23.699159 master-0 kubenswrapper[29612]: I0319 12:12:23.699102 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 12:12:23.725336 master-0 kubenswrapper[29612]: I0319 12:12:23.725092 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 12:12:23.732593 master-0 kubenswrapper[29612]: I0319 12:12:23.732526 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 12:12:23.733705 master-0 kubenswrapper[29612]: I0319 12:12:23.733654 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 12:12:23.734881 master-0 kubenswrapper[29612]: I0319 12:12:23.734826 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 12:12:23.734996 master-0 kubenswrapper[29612]: I0319 12:12:23.734967 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 12:12:23.739997 master-0 kubenswrapper[29612]: I0319 12:12:23.739935 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-qzhdr"
Mar 19 12:12:23.773863 master-0 kubenswrapper[29612]: I0319 12:12:23.773771 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 12:12:23.794549 master-0 kubenswrapper[29612]: I0319 12:12:23.794492 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:12:23.923605 master-0 kubenswrapper[29612]: I0319 12:12:23.923540 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 12:12:23.953774 master-0 kubenswrapper[29612]: I0319 12:12:23.953702 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 19 12:12:23.976675 master-0 kubenswrapper[29612]: I0319 12:12:23.976529 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-pshw6"
Mar 19 12:12:24.029739 master-0 kubenswrapper[29612]: I0319 12:12:24.029671 29612 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 12:12:24.036606 master-0 kubenswrapper[29612]: I0319 12:12:24.036559 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 12:12:24.088719 master-0 kubenswrapper[29612]: I0319 12:12:24.088607 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 12:12:24.195247 master-0 kubenswrapper[29612]: I0319 12:12:24.194807 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 12:12:24.211259 master-0 kubenswrapper[29612]: I0319 12:12:24.211171 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 19 12:12:24.314823 master-0 kubenswrapper[29612]: I0319 12:12:24.314723 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 19 12:12:24.355487 master-0 kubenswrapper[29612]: I0319 12:12:24.355446 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 12:12:24.422493 master-0 kubenswrapper[29612]: I0319 12:12:24.422311 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 12:12:24.428142 master-0 kubenswrapper[29612]: I0319 12:12:24.428096 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 19 12:12:24.520059 master-0 kubenswrapper[29612]: I0319 12:12:24.519993 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 19 12:12:24.539275 master-0 kubenswrapper[29612]: I0319 12:12:24.539224 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 19 12:12:24.585205 master-0 kubenswrapper[29612]: I0319 12:12:24.585006 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:12:24.621286 master-0 kubenswrapper[29612]: I0319 12:12:24.621083 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 12:12:24.621657 master-0 kubenswrapper[29612]: I0319 12:12:24.621570 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 12:12:24.659757 master-0 kubenswrapper[29612]: I0319 12:12:24.658601 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config"
Mar 19 12:12:24.667339 master-0 kubenswrapper[29612]: I0319 12:12:24.667287 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 12:12:24.694013 master-0 kubenswrapper[29612]: I0319 12:12:24.693942 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-7v2tf"
Mar 19 12:12:24.696575 master-0 kubenswrapper[29612]: I0319 12:12:24.696522 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 12:12:24.733510 master-0 kubenswrapper[29612]: I0319 12:12:24.733473 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 19 12:12:24.767174 master-0 kubenswrapper[29612]: I0319 12:12:24.767139 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 12:12:24.873413 master-0 kubenswrapper[29612]: I0319 12:12:24.873155 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 12:12:24.963437 master-0 kubenswrapper[29612]: I0319 12:12:24.963384 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 12:12:24.996251 master-0 kubenswrapper[29612]: I0319 12:12:24.996212 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 19 12:12:25.007831 master-0 kubenswrapper[29612]: I0319 12:12:25.007787 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 12:12:25.023424 master-0 kubenswrapper[29612]: I0319 12:12:25.023402 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 12:12:25.058486 master-0 kubenswrapper[29612]: I0319 12:12:25.058386 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-kl9xx"
Mar 19 12:12:25.097288 master-0 kubenswrapper[29612]: I0319 12:12:25.097183 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 12:12:25.181426 master-0 kubenswrapper[29612]: I0319 12:12:25.181373 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 12:12:25.260923 master-0 kubenswrapper[29612]: I0319 12:12:25.260842 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 12:12:25.281499 master-0 kubenswrapper[29612]: I0319 12:12:25.281463 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-g78hg"
Mar 19 12:12:25.317273 master-0 kubenswrapper[29612]: I0319 12:12:25.317235 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 19 12:12:25.319584 master-0 kubenswrapper[29612]: I0319 12:12:25.319550 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 12:12:25.344283 master-0 kubenswrapper[29612]: I0319 12:12:25.344203 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 12:12:25.379485 master-0 kubenswrapper[29612]: I0319 12:12:25.379429 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 12:12:25.402380 master-0 kubenswrapper[29612]: I0319 12:12:25.402345 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 19 12:12:25.411255 master-0 kubenswrapper[29612]: I0319 12:12:25.411220 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 19 12:12:25.456543 master-0 kubenswrapper[29612]: I0319 12:12:25.456414 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 12:12:25.482946 master-0 kubenswrapper[29612]: I0319 12:12:25.482754 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 12:12:25.525865 master-0 kubenswrapper[29612]: I0319 12:12:25.525799 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4lzgv"
Mar 19 12:12:25.564246 master-0 kubenswrapper[29612]: I0319 12:12:25.564168 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 12:12:25.567920 master-0 kubenswrapper[29612]: I0319 12:12:25.567870 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 12:12:25.594012 master-0 kubenswrapper[29612]: I0319 12:12:25.591713 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 12:12:25.621088 master-0 kubenswrapper[29612]: I0319 12:12:25.621017 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 12:12:25.678427 master-0 kubenswrapper[29612]: I0319 12:12:25.678380 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 12:12:25.709921 master-0 kubenswrapper[29612]: I0319 12:12:25.709621 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 12:12:25.790707 master-0 kubenswrapper[29612]: I0319 12:12:25.790585 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 12:12:25.872885 master-0 kubenswrapper[29612]: I0319 12:12:25.872819 29612 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 12:12:25.914739 master-0 kubenswrapper[29612]: I0319 12:12:25.914655 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 12:12:25.948764 master-0 kubenswrapper[29612]: I0319 12:12:25.948682 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 12:12:25.950815 master-0 kubenswrapper[29612]: I0319 12:12:25.950759 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:12:25.982381 master-0 kubenswrapper[29612]: I0319 12:12:25.982266 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 12:12:26.021399 master-0 kubenswrapper[29612]: I0319 12:12:26.021350 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 12:12:26.043454 master-0 kubenswrapper[29612]: I0319 12:12:26.043375 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-d7x5g"
Mar 19 12:12:26.097489 master-0 kubenswrapper[29612]: I0319 12:12:26.097401 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 19 12:12:26.144250 master-0 kubenswrapper[29612]: I0319 12:12:26.144015 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 19 12:12:26.147801 master-0 kubenswrapper[29612]: I0319 12:12:26.147704 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 12:12:26.151725 master-0 kubenswrapper[29612]: I0319 12:12:26.151619 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 19 12:12:26.152937 master-0 kubenswrapper[29612]: I0319 12:12:26.152895 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 19 12:12:26.197217 master-0 kubenswrapper[29612]: I0319 12:12:26.197142 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 19 12:12:26.199548 master-0 kubenswrapper[29612]: I0319 12:12:26.198467 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 12:12:26.209441 master-0 kubenswrapper[29612]: I0319 12:12:26.209377 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-tsgvs"
Mar 19 12:12:26.261438 master-0 kubenswrapper[29612]: I0319 12:12:26.261292 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 12:12:26.301507 master-0 kubenswrapper[29612]: I0319 12:12:26.301453 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 12:12:26.349543 master-0 kubenswrapper[29612]: I0319 12:12:26.349491 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 12:12:26.482247 master-0 kubenswrapper[29612]: I0319 12:12:26.482174 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-2dl2n"
Mar 19 12:12:26.507064 master-0 kubenswrapper[29612]: I0319 12:12:26.507012 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 12:12:26.513023 master-0 kubenswrapper[29612]: I0319 12:12:26.512904 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-bgrvv"
Mar 19 12:12:26.601351 master-0 kubenswrapper[29612]: I0319 12:12:26.601298 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 12:12:26.603430 master-0 kubenswrapper[29612]: I0319 12:12:26.603362 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 12:12:26.627750 master-0 kubenswrapper[29612]: I0319 12:12:26.627719 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 12:12:26.749398 master-0 kubenswrapper[29612]: I0319 12:12:26.749282 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 12:12:26.822547 master-0 kubenswrapper[29612]: I0319 12:12:26.822390 29612 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 12:12:26.918919 master-0 kubenswrapper[29612]: I0319 12:12:26.918839 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-8zr85"
Mar 19 12:12:26.942013 master-0 kubenswrapper[29612]: I0319 12:12:26.941937 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 19 12:12:26.957024 master-0 kubenswrapper[29612]: I0319 12:12:26.956965 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 19 12:12:26.992862 master-0 kubenswrapper[29612]: I0319 12:12:26.992818 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 12:12:27.011737 master-0 kubenswrapper[29612]: I0319 12:12:27.011677 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 12:12:27.048394 master-0 kubenswrapper[29612]: I0319 12:12:27.048270 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 12:12:27.095113 master-0 kubenswrapper[29612]: I0319 12:12:27.094999 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 12:12:27.174339 master-0 kubenswrapper[29612]: I0319 12:12:27.174295 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 12:12:27.196187 master-0 kubenswrapper[29612]: I0319 12:12:27.196118 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 12:12:27.259338 master-0 kubenswrapper[29612]: I0319 12:12:27.243087 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 12:12:27.353210 master-0 kubenswrapper[29612]: I0319 12:12:27.353059 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 19 12:12:27.389212 master-0 kubenswrapper[29612]: I0319 12:12:27.389127 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 12:12:27.486147 master-0 kubenswrapper[29612]: I0319 12:12:27.486058 29612 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:12:27.488062 master-0 kubenswrapper[29612]: I0319 12:12:27.486505 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor" containerID="cri-o://f5802453660f7622c84d0e0575545ad505a33d3b447324f5697422bc60903811" gracePeriod=5
Mar 19 12:12:27.603689 master-0 kubenswrapper[29612]: I0319 12:12:27.603505 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 12:12:27.603946 master-0 kubenswrapper[29612]: I0319 12:12:27.603838 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 12:12:27.626406 master-0 kubenswrapper[29612]: I0319 12:12:27.626344 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 12:12:27.663573 master-0 kubenswrapper[29612]: I0319 12:12:27.663518 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-atris4eemsst"
Mar 19 12:12:27.696496 master-0 kubenswrapper[29612]: I0319 12:12:27.696452 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 19 12:12:27.715827 master-0 kubenswrapper[29612]: I0319 12:12:27.715726 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 12:12:27.872026 master-0 kubenswrapper[29612]: I0319 12:12:27.871881 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 12:12:27.887764 master-0 kubenswrapper[29612]: I0319 12:12:27.887726 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 12:12:27.913396 master-0 kubenswrapper[29612]: I0319 12:12:27.913319 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 12:12:27.918347 master-0 kubenswrapper[29612]: I0319 12:12:27.915884 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 19 12:12:27.936898 master-0 kubenswrapper[29612]: I0319 12:12:27.936829 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 19 12:12:28.042081 master-0 kubenswrapper[29612]: I0319 12:12:28.042008 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 12:12:28.077171 master-0 kubenswrapper[29612]: I0319 12:12:28.077088 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 12:12:28.107389 master-0 kubenswrapper[29612]: I0319 12:12:28.107335 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 12:12:28.177249 master-0 kubenswrapper[29612]: I0319 12:12:28.177169 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 12:12:28.279744 master-0 kubenswrapper[29612]: I0319 12:12:28.279666 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 12:12:28.334743 master-0 kubenswrapper[29612]: I0319 12:12:28.334663 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 12:12:28.375874 master-0 kubenswrapper[29612]: I0319 12:12:28.375789 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 12:12:28.380242 master-0 kubenswrapper[29612]: I0319 12:12:28.380181 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 19 12:12:28.385113 master-0 kubenswrapper[29612]: I0319 12:12:28.385062 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 12:12:28.429458 master-0 kubenswrapper[29612]: I0319 12:12:28.429286 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 12:12:28.528130 master-0 kubenswrapper[29612]: I0319 12:12:28.528064 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-h5fcf"
Mar 19 12:12:28.528319 master-0 kubenswrapper[29612]: I0319 12:12:28.528073 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 12:12:28.662377 master-0 kubenswrapper[29612]: I0319 12:12:28.662285 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 12:12:28.720913 master-0 kubenswrapper[29612]: I0319 12:12:28.720488 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 12:12:28.752291 master-0 kubenswrapper[29612]: I0319 12:12:28.752232 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-v6j8n"
Mar 19 12:12:28.799510 master-0 kubenswrapper[29612]: I0319 12:12:28.796352 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 12:12:28.810619 master-0 kubenswrapper[29612]: I0319 12:12:28.810575 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 12:12:28.938284 master-0 kubenswrapper[29612]: I0319 12:12:28.938181 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 12:12:28.969019 master-0 kubenswrapper[29612]: I0319 12:12:28.968944 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 19 12:12:28.995942 master-0 kubenswrapper[29612]: I0319 12:12:28.995757 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 12:12:29.189844 master-0 kubenswrapper[29612]: I0319 12:12:29.189745 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 12:12:29.319434 master-0 kubenswrapper[29612]: I0319 12:12:29.319321 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 12:12:29.338748 master-0 kubenswrapper[29612]: I0319 12:12:29.338512 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 12:12:32.782143 master-0 kubenswrapper[29612]: I0319 12:12:32.781994 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_16fb4ea7f83036d9c6adf3454fc7e9db/startup-monitor/0.log"
Mar 19 12:12:32.782143 master-0 kubenswrapper[29612]: I0319 12:12:32.782069 29612 generic.go:334] "Generic (PLEG): container finished" podID="16fb4ea7f83036d9c6adf3454fc7e9db" containerID="f5802453660f7622c84d0e0575545ad505a33d3b447324f5697422bc60903811" exitCode=137
Mar 19 12:12:32.828432 master-0 kubenswrapper[29612]: I0319 12:12:32.828366 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-w9xx8"]
Mar 19 12:12:32.828701 master-0 kubenswrapper[29612]: E0319 12:12:32.828681 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" containerName="installer"
Mar 19 12:12:32.828701 master-0 kubenswrapper[29612]: I0319 12:12:32.828699 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" containerName="installer"
Mar 19 12:12:32.828808 master-0 kubenswrapper[29612]: E0319 12:12:32.828719 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor"
Mar 19 12:12:32.828808 master-0 kubenswrapper[29612]: I0319 12:12:32.828728 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor"
Mar 19 12:12:32.828931 master-0 kubenswrapper[29612]: I0319 12:12:32.828893 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="48468fd9-0fe4-4c84-bcea-ffb1fe8be7b5" containerName="installer"
Mar 19 12:12:32.828931 master-0 kubenswrapper[29612]: I0319 12:12:32.828922 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" containerName="startup-monitor"
Mar 19 12:12:32.829393 master-0 kubenswrapper[29612]: I0319 12:12:32.829356 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-w9xx8"
Mar 19 12:12:32.832286 master-0 kubenswrapper[29612]: I0319 12:12:32.832236 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 19 12:12:32.833035 master-0 kubenswrapper[29612]: I0319 12:12:32.832899 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 19 12:12:32.833035 master-0 kubenswrapper[29612]: I0319 12:12:32.832922 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 12:12:32.833305 master-0 kubenswrapper[29612]: I0319 12:12:32.833277 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 12:12:32.834287 master-0 kubenswrapper[29612]: I0319 12:12:32.834128 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-6p6sh"
Mar 19 12:12:32.839722 master-0 kubenswrapper[29612]: I0319 12:12:32.839622 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 12:12:32.902521 master-0 kubenswrapper[29612]: I0319 12:12:32.902468 29612 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr"] Mar 19 12:12:32.903511 master-0 kubenswrapper[29612]: I0319 12:12:32.903491 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" Mar 19 12:12:32.905986 master-0 kubenswrapper[29612]: I0319 12:12:32.905948 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 12:12:32.921338 master-0 kubenswrapper[29612]: I0319 12:12:32.921280 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 12:12:32.924992 master-0 kubenswrapper[29612]: I0319 12:12:32.924163 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-w9xx8"] Mar 19 12:12:32.935277 master-0 kubenswrapper[29612]: I0319 12:12:32.935193 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54007dc4-db54-457c-b83c-446d3fcd4e03-trusted-ca\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:32.935277 master-0 kubenswrapper[29612]: I0319 12:12:32.935278 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54007dc4-db54-457c-b83c-446d3fcd4e03-serving-cert\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:32.935559 master-0 kubenswrapper[29612]: I0319 12:12:32.935305 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/54007dc4-db54-457c-b83c-446d3fcd4e03-config\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:32.935559 master-0 kubenswrapper[29612]: I0319 12:12:32.935462 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwsbw\" (UniqueName: \"kubernetes.io/projected/54007dc4-db54-457c-b83c-446d3fcd4e03-kube-api-access-qwsbw\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:32.987395 master-0 kubenswrapper[29612]: I0319 12:12:32.986945 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr"] Mar 19 12:12:33.037307 master-0 kubenswrapper[29612]: I0319 12:12:33.037168 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9583c25b-80eb-401a-981a-dfad3a4d6bae-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-xjbxr\" (UID: \"9583c25b-80eb-401a-981a-dfad3a4d6bae\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" Mar 19 12:12:33.037307 master-0 kubenswrapper[29612]: I0319 12:12:33.037266 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54007dc4-db54-457c-b83c-446d3fcd4e03-trusted-ca\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:33.037562 master-0 kubenswrapper[29612]: I0319 12:12:33.037436 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/54007dc4-db54-457c-b83c-446d3fcd4e03-serving-cert\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:33.037562 master-0 kubenswrapper[29612]: I0319 12:12:33.037475 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54007dc4-db54-457c-b83c-446d3fcd4e03-config\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:33.037562 master-0 kubenswrapper[29612]: I0319 12:12:33.037505 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwsbw\" (UniqueName: \"kubernetes.io/projected/54007dc4-db54-457c-b83c-446d3fcd4e03-kube-api-access-qwsbw\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:33.037735 master-0 kubenswrapper[29612]: I0319 12:12:33.037577 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9583c25b-80eb-401a-981a-dfad3a4d6bae-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-xjbxr\" (UID: \"9583c25b-80eb-401a-981a-dfad3a4d6bae\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" Mar 19 12:12:33.038902 master-0 kubenswrapper[29612]: I0319 12:12:33.038867 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54007dc4-db54-457c-b83c-446d3fcd4e03-config\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 
19 12:12:33.038999 master-0 kubenswrapper[29612]: I0319 12:12:33.038946 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54007dc4-db54-457c-b83c-446d3fcd4e03-trusted-ca\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:33.040504 master-0 kubenswrapper[29612]: I0319 12:12:33.040459 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54007dc4-db54-457c-b83c-446d3fcd4e03-serving-cert\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:33.133081 master-0 kubenswrapper[29612]: I0319 12:12:33.132978 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_16fb4ea7f83036d9c6adf3454fc7e9db/startup-monitor/0.log" Mar 19 12:12:33.133310 master-0 kubenswrapper[29612]: I0319 12:12:33.133115 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:12:33.138905 master-0 kubenswrapper[29612]: I0319 12:12:33.138832 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9583c25b-80eb-401a-981a-dfad3a4d6bae-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-xjbxr\" (UID: \"9583c25b-80eb-401a-981a-dfad3a4d6bae\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" Mar 19 12:12:33.139006 master-0 kubenswrapper[29612]: I0319 12:12:33.138946 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9583c25b-80eb-401a-981a-dfad3a4d6bae-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-xjbxr\" (UID: \"9583c25b-80eb-401a-981a-dfad3a4d6bae\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" Mar 19 12:12:33.144024 master-0 kubenswrapper[29612]: I0319 12:12:33.143916 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9583c25b-80eb-401a-981a-dfad3a4d6bae-nginx-conf\") pod \"networking-console-plugin-7c6b76c555-xjbxr\" (UID: \"9583c25b-80eb-401a-981a-dfad3a4d6bae\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" Mar 19 12:12:33.144143 master-0 kubenswrapper[29612]: I0319 12:12:33.144031 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9583c25b-80eb-401a-981a-dfad3a4d6bae-networking-console-plugin-cert\") pod \"networking-console-plugin-7c6b76c555-xjbxr\" (UID: \"9583c25b-80eb-401a-981a-dfad3a4d6bae\") " pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" Mar 19 12:12:33.222859 master-0 kubenswrapper[29612]: I0319 12:12:33.222793 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwsbw\" (UniqueName: \"kubernetes.io/projected/54007dc4-db54-457c-b83c-446d3fcd4e03-kube-api-access-qwsbw\") pod \"console-operator-76b6568d85-w9xx8\" (UID: \"54007dc4-db54-457c-b83c-446d3fcd4e03\") " pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:33.228505 master-0 kubenswrapper[29612]: I0319 12:12:33.228458 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" Mar 19 12:12:33.239703 master-0 kubenswrapper[29612]: I0319 12:12:33.239657 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " Mar 19 12:12:33.239788 master-0 kubenswrapper[29612]: I0319 12:12:33.239763 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log" (OuterVolumeSpecName: "var-log") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:12:33.239788 master-0 kubenswrapper[29612]: I0319 12:12:33.239777 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " Mar 19 12:12:33.239871 master-0 kubenswrapper[29612]: I0319 12:12:33.239850 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " Mar 19 12:12:33.239909 master-0 kubenswrapper[29612]: I0319 12:12:33.239883 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " Mar 19 12:12:33.239951 master-0 kubenswrapper[29612]: I0319 12:12:33.239934 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") pod \"16fb4ea7f83036d9c6adf3454fc7e9db\" (UID: \"16fb4ea7f83036d9c6adf3454fc7e9db\") " Mar 19 12:12:33.240123 master-0 kubenswrapper[29612]: I0319 12:12:33.240082 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests" (OuterVolumeSpecName: "manifests") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:12:33.240235 master-0 kubenswrapper[29612]: I0319 12:12:33.240203 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:12:33.240235 master-0 kubenswrapper[29612]: I0319 12:12:33.240085 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock" (OuterVolumeSpecName: "var-lock") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:12:33.240235 master-0 kubenswrapper[29612]: I0319 12:12:33.240180 29612 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-log\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:33.244686 master-0 kubenswrapper[29612]: I0319 12:12:33.244627 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "16fb4ea7f83036d9c6adf3454fc7e9db" (UID: "16fb4ea7f83036d9c6adf3454fc7e9db"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:12:33.350154 master-0 kubenswrapper[29612]: I0319 12:12:33.350109 29612 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-manifests\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:33.350154 master-0 kubenswrapper[29612]: I0319 12:12:33.350152 29612 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:33.350154 master-0 kubenswrapper[29612]: I0319 12:12:33.350163 29612 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:33.350396 master-0 kubenswrapper[29612]: I0319 12:12:33.350172 29612 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/16fb4ea7f83036d9c6adf3454fc7e9db-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:12:33.451116 master-0 kubenswrapper[29612]: I0319 12:12:33.451053 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:33.788966 master-0 kubenswrapper[29612]: I0319 12:12:33.788896 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_16fb4ea7f83036d9c6adf3454fc7e9db/startup-monitor/0.log" Mar 19 12:12:33.789492 master-0 kubenswrapper[29612]: I0319 12:12:33.789024 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 19 12:12:33.789492 master-0 kubenswrapper[29612]: I0319 12:12:33.789129 29612 scope.go:117] "RemoveContainer" containerID="f5802453660f7622c84d0e0575545ad505a33d3b447324f5697422bc60903811" Mar 19 12:12:33.822290 master-0 kubenswrapper[29612]: I0319 12:12:33.822186 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr"] Mar 19 12:12:33.857491 master-0 kubenswrapper[29612]: E0319 12:12:33.857421 29612 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fb4ea7f83036d9c6adf3454fc7e9db.slice/crio-e2faf2d178041429171fbe7c9189607d6296f1a2fcaa9203e8f0398a1226038c\": RecentStats: unable to find data in memory cache]" Mar 19 12:12:34.189151 master-0 kubenswrapper[29612]: I0319 12:12:34.189073 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b6568d85-w9xx8"] Mar 19 12:12:34.244620 master-0 kubenswrapper[29612]: W0319 12:12:34.244551 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54007dc4_db54_457c_b83c_446d3fcd4e03.slice/crio-d9dedbfdfd00df5f9324b01b53de91c0e06389df3f318fe2237a5a13f922bc52 WatchSource:0}: Error finding container d9dedbfdfd00df5f9324b01b53de91c0e06389df3f318fe2237a5a13f922bc52: Status 404 returned error can't find the container with id d9dedbfdfd00df5f9324b01b53de91c0e06389df3f318fe2237a5a13f922bc52 Mar 19 12:12:34.806917 master-0 kubenswrapper[29612]: I0319 12:12:34.806692 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" 
event={"ID":"54007dc4-db54-457c-b83c-446d3fcd4e03","Type":"ContainerStarted","Data":"d9dedbfdfd00df5f9324b01b53de91c0e06389df3f318fe2237a5a13f922bc52"} Mar 19 12:12:34.810371 master-0 kubenswrapper[29612]: I0319 12:12:34.810322 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" event={"ID":"9583c25b-80eb-401a-981a-dfad3a4d6bae","Type":"ContainerStarted","Data":"9775ac6bd5546c1e403f7e6730925825a1f8e49ea21d15b8a2240f9cde02d053"} Mar 19 12:12:34.917214 master-0 kubenswrapper[29612]: I0319 12:12:34.917176 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fb4ea7f83036d9c6adf3454fc7e9db" path="/var/lib/kubelet/pods/16fb4ea7f83036d9c6adf3454fc7e9db/volumes" Mar 19 12:12:36.862672 master-0 kubenswrapper[29612]: I0319 12:12:36.861798 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" event={"ID":"9583c25b-80eb-401a-981a-dfad3a4d6bae","Type":"ContainerStarted","Data":"777b115970b1529977ea92657d8bb217fd90d7b16b05955316c8a17b7e158858"} Mar 19 12:12:36.906581 master-0 kubenswrapper[29612]: I0319 12:12:36.901284 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-7c6b76c555-xjbxr" podStartSLOduration=3.06686283 podStartE2EDuration="4.901260274s" podCreationTimestamp="2026-03-19 12:12:32 +0000 UTC" firstStartedPulling="2026-03-19 12:12:33.849343136 +0000 UTC m=+161.163512157" lastFinishedPulling="2026-03-19 12:12:35.68374057 +0000 UTC m=+162.997909601" observedRunningTime="2026-03-19 12:12:36.900584865 +0000 UTC m=+164.214753886" watchObservedRunningTime="2026-03-19 12:12:36.901260274 +0000 UTC m=+164.215429295" Mar 19 12:12:37.895318 master-0 kubenswrapper[29612]: I0319 12:12:37.894937 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" 
event={"ID":"54007dc4-db54-457c-b83c-446d3fcd4e03","Type":"ContainerStarted","Data":"4b29ac896ddace41c95e66d1475f3ea875b07ec4c2eb632e0783b94d49bd5c6d"} Mar 19 12:12:37.923204 master-0 kubenswrapper[29612]: I0319 12:12:37.923118 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" podStartSLOduration=2.760072434 podStartE2EDuration="5.92309463s" podCreationTimestamp="2026-03-19 12:12:32 +0000 UTC" firstStartedPulling="2026-03-19 12:12:34.247094667 +0000 UTC m=+161.561263698" lastFinishedPulling="2026-03-19 12:12:37.410116873 +0000 UTC m=+164.724285894" observedRunningTime="2026-03-19 12:12:37.913701142 +0000 UTC m=+165.227870173" watchObservedRunningTime="2026-03-19 12:12:37.92309463 +0000 UTC m=+165.237263661" Mar 19 12:12:38.243299 master-0 kubenswrapper[29612]: I0319 12:12:38.243226 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-66b8ffb895-9fpsq"] Mar 19 12:12:38.244289 master-0 kubenswrapper[29612]: I0319 12:12:38.244258 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-66b8ffb895-9fpsq" Mar 19 12:12:38.246474 master-0 kubenswrapper[29612]: I0319 12:12:38.246432 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 12:12:38.246579 master-0 kubenswrapper[29612]: I0319 12:12:38.246491 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-825dz" Mar 19 12:12:38.246757 master-0 kubenswrapper[29612]: I0319 12:12:38.246718 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 12:12:38.266784 master-0 kubenswrapper[29612]: I0319 12:12:38.266707 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-66b8ffb895-9fpsq"] Mar 19 12:12:38.325959 master-0 kubenswrapper[29612]: I0319 12:12:38.325874 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjlh6\" (UniqueName: \"kubernetes.io/projected/957aff58-940f-442b-8682-f8583562300a-kube-api-access-sjlh6\") pod \"downloads-66b8ffb895-9fpsq\" (UID: \"957aff58-940f-442b-8682-f8583562300a\") " pod="openshift-console/downloads-66b8ffb895-9fpsq" Mar 19 12:12:38.427770 master-0 kubenswrapper[29612]: I0319 12:12:38.427670 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjlh6\" (UniqueName: \"kubernetes.io/projected/957aff58-940f-442b-8682-f8583562300a-kube-api-access-sjlh6\") pod \"downloads-66b8ffb895-9fpsq\" (UID: \"957aff58-940f-442b-8682-f8583562300a\") " pod="openshift-console/downloads-66b8ffb895-9fpsq" Mar 19 12:12:38.459844 master-0 kubenswrapper[29612]: I0319 12:12:38.459758 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjlh6\" (UniqueName: \"kubernetes.io/projected/957aff58-940f-442b-8682-f8583562300a-kube-api-access-sjlh6\") pod 
\"downloads-66b8ffb895-9fpsq\" (UID: \"957aff58-940f-442b-8682-f8583562300a\") " pod="openshift-console/downloads-66b8ffb895-9fpsq" Mar 19 12:12:38.567214 master-0 kubenswrapper[29612]: I0319 12:12:38.567036 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-66b8ffb895-9fpsq" Mar 19 12:12:38.900742 master-0 kubenswrapper[29612]: I0319 12:12:38.900674 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:38.905527 master-0 kubenswrapper[29612]: I0319 12:12:38.905488 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-76b6568d85-w9xx8" Mar 19 12:12:38.975189 master-0 kubenswrapper[29612]: I0319 12:12:38.975125 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-66b8ffb895-9fpsq"] Mar 19 12:12:38.980688 master-0 kubenswrapper[29612]: W0319 12:12:38.980590 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod957aff58_940f_442b_8682_f8583562300a.slice/crio-44e855cfca99863b909389e90d2691149eb8ac359b5b4be59cd6e922c2d821f1 WatchSource:0}: Error finding container 44e855cfca99863b909389e90d2691149eb8ac359b5b4be59cd6e922c2d821f1: Status 404 returned error can't find the container with id 44e855cfca99863b909389e90d2691149eb8ac359b5b4be59cd6e922c2d821f1 Mar 19 12:12:39.911849 master-0 kubenswrapper[29612]: I0319 12:12:39.911773 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-66b8ffb895-9fpsq" event={"ID":"957aff58-940f-442b-8682-f8583562300a","Type":"ContainerStarted","Data":"44e855cfca99863b909389e90d2691149eb8ac359b5b4be59cd6e922c2d821f1"} Mar 19 12:12:41.313336 master-0 kubenswrapper[29612]: I0319 12:12:41.310814 29612 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-5c867fd8bf-nt5cs"]
Mar 19 12:12:41.313336 master-0 kubenswrapper[29612]: I0319 12:12:41.311900 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.315099 master-0 kubenswrapper[29612]: I0319 12:12:41.315047 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 12:12:41.315693 master-0 kubenswrapper[29612]: I0319 12:12:41.315203 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 12:12:41.315693 master-0 kubenswrapper[29612]: I0319 12:12:41.315270 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 12:12:41.315693 master-0 kubenswrapper[29612]: I0319 12:12:41.315387 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 12:12:41.315693 master-0 kubenswrapper[29612]: I0319 12:12:41.315529 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-mln6s"
Mar 19 12:12:41.315921 master-0 kubenswrapper[29612]: I0319 12:12:41.315715 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 12:12:41.334795 master-0 kubenswrapper[29612]: I0319 12:12:41.334742 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c867fd8bf-nt5cs"]
Mar 19 12:12:41.479225 master-0 kubenswrapper[29612]: I0319 12:12:41.479134 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-console-config\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.479435 master-0 kubenswrapper[29612]: I0319 12:12:41.479242 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-service-ca\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.479435 master-0 kubenswrapper[29612]: I0319 12:12:41.479297 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.479435 master-0 kubenswrapper[29612]: I0319 12:12:41.479340 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-oauth-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.479549 master-0 kubenswrapper[29612]: I0319 12:12:41.479454 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctbzp\" (UniqueName: \"kubernetes.io/projected/7a14879e-b284-42e9-aea1-818ff991f2d8-kube-api-access-ctbzp\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.479549 master-0 kubenswrapper[29612]: I0319 12:12:41.479515 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-oauth-config\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.581550 master-0 kubenswrapper[29612]: I0319 12:12:41.581406 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-oauth-config\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.582321 master-0 kubenswrapper[29612]: I0319 12:12:41.581856 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-console-config\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.582321 master-0 kubenswrapper[29612]: I0319 12:12:41.581994 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-service-ca\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.582321 master-0 kubenswrapper[29612]: I0319 12:12:41.582100 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.582476 master-0 kubenswrapper[29612]: E0319 12:12:41.582400 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 19 12:12:41.582589 master-0 kubenswrapper[29612]: E0319 12:12:41.582549 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert podName:7a14879e-b284-42e9-aea1-818ff991f2d8 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:42.082513711 +0000 UTC m=+169.396682732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert") pod "console-5c867fd8bf-nt5cs" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8") : secret "console-serving-cert" not found
Mar 19 12:12:41.582740 master-0 kubenswrapper[29612]: I0319 12:12:41.582690 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-oauth-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.582801 master-0 kubenswrapper[29612]: I0319 12:12:41.582771 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctbzp\" (UniqueName: \"kubernetes.io/projected/7a14879e-b284-42e9-aea1-818ff991f2d8-kube-api-access-ctbzp\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.583278 master-0 kubenswrapper[29612]: I0319 12:12:41.583154 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-service-ca\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.583278 master-0 kubenswrapper[29612]: I0319 12:12:41.583230 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-console-config\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.587154 master-0 kubenswrapper[29612]: I0319 12:12:41.587096 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-oauth-config\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.587246 master-0 kubenswrapper[29612]: I0319 12:12:41.587203 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-oauth-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:41.601692 master-0 kubenswrapper[29612]: I0319 12:12:41.601657 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctbzp\" (UniqueName: \"kubernetes.io/projected/7a14879e-b284-42e9-aea1-818ff991f2d8-kube-api-access-ctbzp\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:42.089824 master-0 kubenswrapper[29612]: I0319 12:12:42.089773 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:42.090121 master-0 kubenswrapper[29612]: E0319 12:12:42.089943 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 19 12:12:42.090121 master-0 kubenswrapper[29612]: E0319 12:12:42.090006 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert podName:7a14879e-b284-42e9-aea1-818ff991f2d8 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:43.089989911 +0000 UTC m=+170.404158932 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert") pod "console-5c867fd8bf-nt5cs" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8") : secret "console-serving-cert" not found
Mar 19 12:12:43.103297 master-0 kubenswrapper[29612]: I0319 12:12:43.103248 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:43.104220 master-0 kubenswrapper[29612]: E0319 12:12:43.104202 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 19 12:12:43.104345 master-0 kubenswrapper[29612]: E0319 12:12:43.104331 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert podName:7a14879e-b284-42e9-aea1-818ff991f2d8 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:45.104310012 +0000 UTC m=+172.418479033 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert") pod "console-5c867fd8bf-nt5cs" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8") : secret "console-serving-cert" not found
Mar 19 12:12:43.904917 master-0 kubenswrapper[29612]: I0319 12:12:43.904865 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-9966f847b-gzmrl"]
Mar 19 12:12:43.906129 master-0 kubenswrapper[29612]: I0319 12:12:43.906106 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:43.915200 master-0 kubenswrapper[29612]: I0319 12:12:43.915166 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 19 12:12:43.928128 master-0 kubenswrapper[29612]: I0319 12:12:43.928078 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9966f847b-gzmrl"]
Mar 19 12:12:44.017854 master-0 kubenswrapper[29612]: I0319 12:12:44.017755 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-oauth-config\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.017854 master-0 kubenswrapper[29612]: I0319 12:12:44.017841 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssft2\" (UniqueName: \"kubernetes.io/projected/493d3830-b0a6-4a3a-8048-0a6aed134ebf-kube-api-access-ssft2\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.018216 master-0 kubenswrapper[29612]: I0319 12:12:44.017936 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-oauth-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.018216 master-0 kubenswrapper[29612]: I0319 12:12:44.018006 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-config\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.018216 master-0 kubenswrapper[29612]: I0319 12:12:44.018114 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.018216 master-0 kubenswrapper[29612]: I0319 12:12:44.018143 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-service-ca\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.018216 master-0 kubenswrapper[29612]: I0319 12:12:44.018175 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-trusted-ca-bundle\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.119838 master-0 kubenswrapper[29612]: I0319 12:12:44.119756 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.119838 master-0 kubenswrapper[29612]: I0319 12:12:44.119833 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-service-ca\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.120542 master-0 kubenswrapper[29612]: E0319 12:12:44.120021 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 19 12:12:44.120542 master-0 kubenswrapper[29612]: I0319 12:12:44.120065 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-trusted-ca-bundle\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.120542 master-0 kubenswrapper[29612]: E0319 12:12:44.120120 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert podName:493d3830-b0a6-4a3a-8048-0a6aed134ebf nodeName:}" failed. No retries permitted until 2026-03-19 12:12:44.620094564 +0000 UTC m=+171.934263655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert") pod "console-9966f847b-gzmrl" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf") : secret "console-serving-cert" not found
Mar 19 12:12:44.120542 master-0 kubenswrapper[29612]: I0319 12:12:44.120400 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-oauth-config\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.120542 master-0 kubenswrapper[29612]: I0319 12:12:44.120440 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssft2\" (UniqueName: \"kubernetes.io/projected/493d3830-b0a6-4a3a-8048-0a6aed134ebf-kube-api-access-ssft2\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.120978 master-0 kubenswrapper[29612]: I0319 12:12:44.120764 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-oauth-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.120978 master-0 kubenswrapper[29612]: I0319 12:12:44.120807 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-config\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.120978 master-0 kubenswrapper[29612]: I0319 12:12:44.120968 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-service-ca\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.122422 master-0 kubenswrapper[29612]: I0319 12:12:44.121592 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-trusted-ca-bundle\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.122422 master-0 kubenswrapper[29612]: I0319 12:12:44.121608 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-config\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.122422 master-0 kubenswrapper[29612]: I0319 12:12:44.121849 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-oauth-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.125916 master-0 kubenswrapper[29612]: I0319 12:12:44.125847 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-oauth-config\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.137810 master-0 kubenswrapper[29612]: I0319 12:12:44.137745 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssft2\" (UniqueName: \"kubernetes.io/projected/493d3830-b0a6-4a3a-8048-0a6aed134ebf-kube-api-access-ssft2\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.628409 master-0 kubenswrapper[29612]: I0319 12:12:44.628345 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:44.628689 master-0 kubenswrapper[29612]: E0319 12:12:44.628623 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 19 12:12:44.628743 master-0 kubenswrapper[29612]: E0319 12:12:44.628715 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert podName:493d3830-b0a6-4a3a-8048-0a6aed134ebf nodeName:}" failed. No retries permitted until 2026-03-19 12:12:45.628691946 +0000 UTC m=+172.942860967 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert") pod "console-9966f847b-gzmrl" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf") : secret "console-serving-cert" not found
Mar 19 12:12:45.137594 master-0 kubenswrapper[29612]: I0319 12:12:45.137481 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:45.138258 master-0 kubenswrapper[29612]: E0319 12:12:45.137973 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 19 12:12:45.138401 master-0 kubenswrapper[29612]: E0319 12:12:45.138347 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert podName:7a14879e-b284-42e9-aea1-818ff991f2d8 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:49.138318547 +0000 UTC m=+176.452487568 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert") pod "console-5c867fd8bf-nt5cs" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8") : secret "console-serving-cert" not found
Mar 19 12:12:45.646066 master-0 kubenswrapper[29612]: I0319 12:12:45.645996 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:45.646352 master-0 kubenswrapper[29612]: E0319 12:12:45.646174 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 19 12:12:45.646352 master-0 kubenswrapper[29612]: E0319 12:12:45.646244 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert podName:493d3830-b0a6-4a3a-8048-0a6aed134ebf nodeName:}" failed. No retries permitted until 2026-03-19 12:12:47.646223549 +0000 UTC m=+174.960392580 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert") pod "console-9966f847b-gzmrl" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf") : secret "console-serving-cert" not found
Mar 19 12:12:47.675463 master-0 kubenswrapper[29612]: I0319 12:12:47.675389 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:12:47.676058 master-0 kubenswrapper[29612]: E0319 12:12:47.675694 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 19 12:12:47.676058 master-0 kubenswrapper[29612]: E0319 12:12:47.675792 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert podName:493d3830-b0a6-4a3a-8048-0a6aed134ebf nodeName:}" failed. No retries permitted until 2026-03-19 12:12:51.675774877 +0000 UTC m=+178.989943898 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert") pod "console-9966f847b-gzmrl" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf") : secret "console-serving-cert" not found
Mar 19 12:12:49.197917 master-0 kubenswrapper[29612]: I0319 12:12:49.197855 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:12:49.198722 master-0 kubenswrapper[29612]: E0319 12:12:49.198033 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found
Mar 19 12:12:49.198722 master-0 kubenswrapper[29612]: E0319 12:12:49.198112 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert podName:7a14879e-b284-42e9-aea1-818ff991f2d8 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:57.198090663 +0000 UTC m=+184.512259734 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert") pod "console-5c867fd8bf-nt5cs" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8") : secret "console-serving-cert" not found
Mar 19 12:12:51.183811 master-0 kubenswrapper[29612]: I0319 12:12:51.182166 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8dc697b97-m8zhf"]
Mar 19 12:12:51.184645 master-0 kubenswrapper[29612]: I0319 12:12:51.184593 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.191029 master-0 kubenswrapper[29612]: I0319 12:12:51.190972 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 19 12:12:51.192328 master-0 kubenswrapper[29612]: I0319 12:12:51.192303 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 12:12:51.192449 master-0 kubenswrapper[29612]: I0319 12:12:51.192430 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 19 12:12:51.193734 master-0 kubenswrapper[29612]: I0319 12:12:51.192899 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 12:12:51.194135 master-0 kubenswrapper[29612]: I0319 12:12:51.194042 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 12:12:51.194240 master-0 kubenswrapper[29612]: I0319 12:12:51.194206 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 12:12:51.194477 master-0 kubenswrapper[29612]: I0319 12:12:51.194450 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 19 12:12:51.195327 master-0 kubenswrapper[29612]: I0319 12:12:51.195290 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 12:12:51.195555 master-0 kubenswrapper[29612]: I0319 12:12:51.195436 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-nbxfb"
Mar 19 12:12:51.195869 master-0 kubenswrapper[29612]: I0319 12:12:51.195646 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 12:12:51.197468 master-0 kubenswrapper[29612]: I0319 12:12:51.197421 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 19 12:12:51.199976 master-0 kubenswrapper[29612]: I0319 12:12:51.199924 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 12:12:51.200323 master-0 kubenswrapper[29612]: I0319 12:12:51.200296 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 12:12:51.208677 master-0 kubenswrapper[29612]: I0319 12:12:51.208622 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 12:12:51.211322 master-0 kubenswrapper[29612]: I0319 12:12:51.211275 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8dc697b97-m8zhf"]
Mar 19 12:12:51.330628 master-0 kubenswrapper[29612]: I0319 12:12:51.330566 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.330880 master-0 kubenswrapper[29612]: I0319 12:12:51.330800 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-error\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.330935 master-0 kubenswrapper[29612]: I0319 12:12:51.330912 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-router-certs\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.330984 master-0 kubenswrapper[29612]: I0319 12:12:51.330943 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-service-ca\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.331031 master-0 kubenswrapper[29612]: I0319 12:12:51.330985 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-policies\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.331031 master-0 kubenswrapper[29612]: I0319 12:12:51.331016 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.331128 master-0 kubenswrapper[29612]: I0319 12:12:51.331059 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-login\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.331128 master-0 kubenswrapper[29612]: I0319 12:12:51.331120 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.331212 master-0 kubenswrapper[29612]: I0319 12:12:51.331147 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjpnl\" (UniqueName: \"kubernetes.io/projected/4560e0cf-d401-4508-a8d8-3869ed30ed18-kube-api-access-zjpnl\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.331212 master-0 kubenswrapper[29612]: I0319 12:12:51.331186 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-dir\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.331303 master-0 kubenswrapper[29612]: I0319 12:12:51.331267 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.331372 master-0 kubenswrapper[29612]: I0319 12:12:51.331337 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.331426 master-0 kubenswrapper[29612]: I0319 12:12:51.331400 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.432852 master-0 kubenswrapper[29612]: I0319 12:12:51.432787 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.433102 master-0 kubenswrapper[29612]: I0319 12:12:51.433013 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.433102 master-0 kubenswrapper[29612]: I0319 12:12:51.433056 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.433413 master-0 kubenswrapper[29612]: I0319 12:12:51.433349 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-error\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.433552 master-0 kubenswrapper[29612]: I0319 12:12:51.433482 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-router-certs\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf"
Mar 19 12:12:51.433552 master-0 kubenswrapper[29612]: I0319 12:12:51.433514 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName:
\"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-service-ca\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.433755 master-0 kubenswrapper[29612]: I0319 12:12:51.433553 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-policies\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.433755 master-0 kubenswrapper[29612]: I0319 12:12:51.433585 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.433755 master-0 kubenswrapper[29612]: I0319 12:12:51.433618 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-login\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.433755 master-0 kubenswrapper[29612]: I0319 12:12:51.433679 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " 
pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.433755 master-0 kubenswrapper[29612]: I0319 12:12:51.433706 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjpnl\" (UniqueName: \"kubernetes.io/projected/4560e0cf-d401-4508-a8d8-3869ed30ed18-kube-api-access-zjpnl\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.433755 master-0 kubenswrapper[29612]: I0319 12:12:51.433740 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-dir\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.434047 master-0 kubenswrapper[29612]: I0319 12:12:51.433773 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.434047 master-0 kubenswrapper[29612]: E0319 12:12:51.433857 29612 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 19 12:12:51.434047 master-0 kubenswrapper[29612]: E0319 12:12:51.433926 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig podName:4560e0cf-d401-4508-a8d8-3869ed30ed18 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:51.933904862 +0000 UTC m=+179.248073883 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig") pod "oauth-openshift-8dc697b97-m8zhf" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18") : configmap "v4-0-config-system-cliconfig" not found Mar 19 12:12:51.434047 master-0 kubenswrapper[29612]: E0319 12:12:51.433968 29612 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: secret "v4-0-config-system-session" not found Mar 19 12:12:51.434047 master-0 kubenswrapper[29612]: E0319 12:12:51.434036 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session podName:4560e0cf-d401-4508-a8d8-3869ed30ed18 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:51.934018495 +0000 UTC m=+179.248187536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session") pod "oauth-openshift-8dc697b97-m8zhf" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18") : secret "v4-0-config-system-session" not found Mar 19 12:12:51.434265 master-0 kubenswrapper[29612]: I0319 12:12:51.433954 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-dir\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.435293 master-0 kubenswrapper[29612]: I0319 12:12:51.435130 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-policies\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") 
" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.435293 master-0 kubenswrapper[29612]: I0319 12:12:51.435253 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.435406 master-0 kubenswrapper[29612]: I0319 12:12:51.435283 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-service-ca\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.437911 master-0 kubenswrapper[29612]: I0319 12:12:51.436670 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-error\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.437911 master-0 kubenswrapper[29612]: I0319 12:12:51.437621 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.437911 master-0 kubenswrapper[29612]: I0319 12:12:51.437705 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-router-certs\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.438121 master-0 kubenswrapper[29612]: I0319 12:12:51.437946 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-login\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.440555 master-0 kubenswrapper[29612]: I0319 12:12:51.440480 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.443218 master-0 kubenswrapper[29612]: I0319 12:12:51.443137 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.459265 master-0 kubenswrapper[29612]: I0319 12:12:51.459194 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjpnl\" (UniqueName: 
\"kubernetes.io/projected/4560e0cf-d401-4508-a8d8-3869ed30ed18-kube-api-access-zjpnl\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.738049 master-0 kubenswrapper[29612]: I0319 12:12:51.737930 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl" Mar 19 12:12:51.738385 master-0 kubenswrapper[29612]: E0319 12:12:51.738320 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found Mar 19 12:12:51.738620 master-0 kubenswrapper[29612]: E0319 12:12:51.738544 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert podName:493d3830-b0a6-4a3a-8048-0a6aed134ebf nodeName:}" failed. No retries permitted until 2026-03-19 12:12:59.738515447 +0000 UTC m=+187.052684508 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert") pod "console-9966f847b-gzmrl" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf") : secret "console-serving-cert" not found Mar 19 12:12:51.941356 master-0 kubenswrapper[29612]: I0319 12:12:51.940848 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.941356 master-0 kubenswrapper[29612]: I0319 12:12:51.940925 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:51.941356 master-0 kubenswrapper[29612]: E0319 12:12:51.941029 29612 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 19 12:12:51.941356 master-0 kubenswrapper[29612]: E0319 12:12:51.941116 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig podName:4560e0cf-d401-4508-a8d8-3869ed30ed18 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:52.941095064 +0000 UTC m=+180.255264085 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig") pod "oauth-openshift-8dc697b97-m8zhf" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18") : configmap "v4-0-config-system-cliconfig" not found Mar 19 12:12:51.941356 master-0 kubenswrapper[29612]: E0319 12:12:51.941151 29612 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: secret "v4-0-config-system-session" not found Mar 19 12:12:51.941356 master-0 kubenswrapper[29612]: E0319 12:12:51.941213 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session podName:4560e0cf-d401-4508-a8d8-3869ed30ed18 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:52.941197276 +0000 UTC m=+180.255366367 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session") pod "oauth-openshift-8dc697b97-m8zhf" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18") : secret "v4-0-config-system-session" not found Mar 19 12:12:52.962377 master-0 kubenswrapper[29612]: I0319 12:12:52.962333 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:52.962770 master-0 kubenswrapper[29612]: I0319 12:12:52.962400 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session\") pod 
\"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:52.963718 master-0 kubenswrapper[29612]: E0319 12:12:52.963691 29612 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 19 12:12:52.963786 master-0 kubenswrapper[29612]: E0319 12:12:52.963752 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig podName:4560e0cf-d401-4508-a8d8-3869ed30ed18 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:54.963731932 +0000 UTC m=+182.277900953 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig") pod "oauth-openshift-8dc697b97-m8zhf" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18") : configmap "v4-0-config-system-cliconfig" not found Mar 19 12:12:52.964222 master-0 kubenswrapper[29612]: E0319 12:12:52.964201 29612 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: secret "v4-0-config-system-session" not found Mar 19 12:12:52.964291 master-0 kubenswrapper[29612]: E0319 12:12:52.964277 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session podName:4560e0cf-d401-4508-a8d8-3869ed30ed18 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:54.964230666 +0000 UTC m=+182.278399687 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session") pod "oauth-openshift-8dc697b97-m8zhf" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18") : secret "v4-0-config-system-session" not found Mar 19 12:12:54.992869 master-0 kubenswrapper[29612]: I0319 12:12:54.992799 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:54.992869 master-0 kubenswrapper[29612]: I0319 12:12:54.992879 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:54.993550 master-0 kubenswrapper[29612]: E0319 12:12:54.993079 29612 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 19 12:12:54.993550 master-0 kubenswrapper[29612]: E0319 12:12:54.993193 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig podName:4560e0cf-d401-4508-a8d8-3869ed30ed18 nodeName:}" failed. No retries permitted until 2026-03-19 12:12:58.993170866 +0000 UTC m=+186.307339947 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig") pod "oauth-openshift-8dc697b97-m8zhf" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18") : configmap "v4-0-config-system-cliconfig" not found Mar 19 12:12:54.997808 master-0 kubenswrapper[29612]: I0319 12:12:54.997758 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:57.235185 master-0 kubenswrapper[29612]: I0319 12:12:57.235101 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs" Mar 19 12:12:57.235797 master-0 kubenswrapper[29612]: E0319 12:12:57.235471 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found Mar 19 12:12:57.235797 master-0 kubenswrapper[29612]: E0319 12:12:57.235568 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert podName:7a14879e-b284-42e9-aea1-818ff991f2d8 nodeName:}" failed. No retries permitted until 2026-03-19 12:13:13.235548793 +0000 UTC m=+200.549717814 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert") pod "console-5c867fd8bf-nt5cs" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8") : secret "console-serving-cert" not found Mar 19 12:12:59.061448 master-0 kubenswrapper[29612]: I0319 12:12:59.061361 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:59.062231 master-0 kubenswrapper[29612]: I0319 12:12:59.062198 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8dc697b97-m8zhf\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:59.325286 master-0 kubenswrapper[29612]: I0319 12:12:59.325152 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-nbxfb" Mar 19 12:12:59.333856 master-0 kubenswrapper[29612]: I0319 12:12:59.333785 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:12:59.772053 master-0 kubenswrapper[29612]: I0319 12:12:59.771937 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl" Mar 19 12:12:59.772594 master-0 kubenswrapper[29612]: E0319 12:12:59.772165 29612 secret.go:189] Couldn't get secret openshift-console/console-serving-cert: secret "console-serving-cert" not found Mar 19 12:12:59.772594 master-0 kubenswrapper[29612]: E0319 12:12:59.772270 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert podName:493d3830-b0a6-4a3a-8048-0a6aed134ebf nodeName:}" failed. No retries permitted until 2026-03-19 12:13:15.772246691 +0000 UTC m=+203.086415712 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert") pod "console-9966f847b-gzmrl" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf") : secret "console-serving-cert" not found Mar 19 12:12:59.814919 master-0 kubenswrapper[29612]: I0319 12:12:59.814855 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8dc697b97-m8zhf"] Mar 19 12:13:00.086223 master-0 kubenswrapper[29612]: I0319 12:13:00.086056 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" event={"ID":"4560e0cf-d401-4508-a8d8-3869ed30ed18","Type":"ContainerStarted","Data":"91f80a13c9cca1e264585bceb12112c9de2785e238c83ec620b4cf692f9cd4a5"} Mar 19 12:13:01.208377 master-0 kubenswrapper[29612]: I0319 12:13:01.208311 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-8dc697b97-m8zhf"] Mar 19 12:13:09.421940 master-0 kubenswrapper[29612]: I0319 12:13:09.421847 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-59bc46f698-rsj74"] Mar 19 12:13:09.424297 master-0 kubenswrapper[29612]: I0319 12:13:09.424225 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.426209 master-0 kubenswrapper[29612]: I0319 12:13:09.426168 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 19 12:13:09.426486 master-0 kubenswrapper[29612]: I0319 12:13:09.426462 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 19 12:13:09.427919 master-0 kubenswrapper[29612]: I0319 12:13:09.427893 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-dkps1gqvgkpdc"
Mar 19 12:13:09.428214 master-0 kubenswrapper[29612]: I0319 12:13:09.428066 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Mar 19 12:13:09.428214 master-0 kubenswrapper[29612]: I0319 12:13:09.428197 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 19 12:13:09.429483 master-0 kubenswrapper[29612]: I0319 12:13:09.428329 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy"
Mar 19 12:13:09.440055 master-0 kubenswrapper[29612]: I0319 12:13:09.439966 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-59bc46f698-rsj74"]
Mar 19 12:13:09.564568 master-0 kubenswrapper[29612]: I0319 12:13:09.564500 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.564568 master-0 kubenswrapper[29612]: I0319 12:13:09.564549 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km6ff\" (UniqueName: \"kubernetes.io/projected/ba84400d-9694-4ea6-87f3-ea29910b8904-kube-api-access-km6ff\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.564568 master-0 kubenswrapper[29612]: I0319 12:13:09.564569 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.564966 master-0 kubenswrapper[29612]: I0319 12:13:09.564591 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-tls\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.564966 master-0 kubenswrapper[29612]: I0319 12:13:09.564613 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.564966 master-0 kubenswrapper[29612]: I0319 12:13:09.564637 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-grpc-tls\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.564966 master-0 kubenswrapper[29612]: I0319 12:13:09.564657 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba84400d-9694-4ea6-87f3-ea29910b8904-metrics-client-ca\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.564966 master-0 kubenswrapper[29612]: I0319 12:13:09.564795 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.667728 master-0 kubenswrapper[29612]: I0319 12:13:09.666529 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba84400d-9694-4ea6-87f3-ea29910b8904-metrics-client-ca\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.667728 master-0 kubenswrapper[29612]: I0319 12:13:09.666820 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.667728 master-0 kubenswrapper[29612]: I0319 12:13:09.667038 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.667728 master-0 kubenswrapper[29612]: I0319 12:13:09.667070 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km6ff\" (UniqueName: \"kubernetes.io/projected/ba84400d-9694-4ea6-87f3-ea29910b8904-kube-api-access-km6ff\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.667728 master-0 kubenswrapper[29612]: I0319 12:13:09.667094 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.667728 master-0 kubenswrapper[29612]: I0319 12:13:09.667128 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-tls\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.667728 master-0 kubenswrapper[29612]: I0319 12:13:09.667167 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.667728 master-0 kubenswrapper[29612]: I0319 12:13:09.667199 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-grpc-tls\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.667728 master-0 kubenswrapper[29612]: I0319 12:13:09.667702 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba84400d-9694-4ea6-87f3-ea29910b8904-metrics-client-ca\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.671382 master-0 kubenswrapper[29612]: I0319 12:13:09.671148 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.672486 master-0 kubenswrapper[29612]: I0319 12:13:09.671596 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-grpc-tls\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.676837 master-0 kubenswrapper[29612]: I0319 12:13:09.672873 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.676837 master-0 kubenswrapper[29612]: I0319 12:13:09.673043 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-tls\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.676837 master-0 kubenswrapper[29612]: I0319 12:13:09.676474 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.683244 master-0 kubenswrapper[29612]: I0319 12:13:09.683189 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba84400d-9694-4ea6-87f3-ea29910b8904-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.706169 master-0 kubenswrapper[29612]: I0319 12:13:09.706102 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km6ff\" (UniqueName: \"kubernetes.io/projected/ba84400d-9694-4ea6-87f3-ea29910b8904-kube-api-access-km6ff\") pod \"thanos-querier-59bc46f698-rsj74\" (UID: \"ba84400d-9694-4ea6-87f3-ea29910b8904\") " pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:09.766953 master-0 kubenswrapper[29612]: I0319 12:13:09.766900 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74"
Mar 19 12:13:12.212218 master-0 kubenswrapper[29612]: I0319 12:13:12.209422 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"]
Mar 19 12:13:12.212218 master-0 kubenswrapper[29612]: I0319 12:13:12.210773 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.218818 master-0 kubenswrapper[29612]: I0319 12:13:12.217534 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-e7dkr757t87sf"
Mar 19 12:13:12.235869 master-0 kubenswrapper[29612]: I0319 12:13:12.235823 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"]
Mar 19 12:13:12.249791 master-0 kubenswrapper[29612]: I0319 12:13:12.249733 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-dc85487cf-gsmlh"]
Mar 19 12:13:12.250010 master-0 kubenswrapper[29612]: I0319 12:13:12.249978 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" podUID="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" containerName="metrics-server" containerID="cri-o://5b15f71769cb166bbe1854fa275ab6231d488a761919449830f9713c4bfd28ed" gracePeriod=170
Mar 19 12:13:12.320382 master-0 kubenswrapper[29612]: I0319 12:13:12.320300 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/82f22257-bebd-4b1f-a92f-4c9d805367c2-secret-metrics-server-tls\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.320613 master-0 kubenswrapper[29612]: I0319 12:13:12.320509 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/82f22257-bebd-4b1f-a92f-4c9d805367c2-audit-log\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.320727 master-0 kubenswrapper[29612]: I0319 12:13:12.320638 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/82f22257-bebd-4b1f-a92f-4c9d805367c2-metrics-server-audit-profiles\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.320727 master-0 kubenswrapper[29612]: I0319 12:13:12.320701 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vklx\" (UniqueName: \"kubernetes.io/projected/82f22257-bebd-4b1f-a92f-4c9d805367c2-kube-api-access-8vklx\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.320823 master-0 kubenswrapper[29612]: I0319 12:13:12.320728 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/82f22257-bebd-4b1f-a92f-4c9d805367c2-secret-metrics-client-certs\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.320823 master-0 kubenswrapper[29612]: I0319 12:13:12.320780 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82f22257-bebd-4b1f-a92f-4c9d805367c2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.320918 master-0 kubenswrapper[29612]: I0319 12:13:12.320858 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f22257-bebd-4b1f-a92f-4c9d805367c2-client-ca-bundle\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.411621 master-0 kubenswrapper[29612]: I0319 12:13:12.410870 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"]
Mar 19 12:13:12.412122 master-0 kubenswrapper[29612]: I0319 12:13:12.412088 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.413610 master-0 kubenswrapper[29612]: I0319 12:13:12.413565 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 19 12:13:12.414221 master-0 kubenswrapper[29612]: I0319 12:13:12.414197 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 19 12:13:12.414303 master-0 kubenswrapper[29612]: I0319 12:13:12.414203 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 19 12:13:12.414840 master-0 kubenswrapper[29612]: I0319 12:13:12.414766 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 19 12:13:12.415089 master-0 kubenswrapper[29612]: I0319 12:13:12.414942 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 19 12:13:12.421287 master-0 kubenswrapper[29612]: I0319 12:13:12.421089 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 19 12:13:12.422071 master-0 kubenswrapper[29612]: I0319 12:13:12.421880 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f22257-bebd-4b1f-a92f-4c9d805367c2-client-ca-bundle\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.422071 master-0 kubenswrapper[29612]: I0319 12:13:12.421922 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/82f22257-bebd-4b1f-a92f-4c9d805367c2-secret-metrics-server-tls\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.422071 master-0 kubenswrapper[29612]: I0319 12:13:12.421966 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/82f22257-bebd-4b1f-a92f-4c9d805367c2-audit-log\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.422071 master-0 kubenswrapper[29612]: I0319 12:13:12.422007 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/82f22257-bebd-4b1f-a92f-4c9d805367c2-metrics-server-audit-profiles\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.422477 master-0 kubenswrapper[29612]: I0319 12:13:12.422447 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/82f22257-bebd-4b1f-a92f-4c9d805367c2-audit-log\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.422542 master-0 kubenswrapper[29612]: I0319 12:13:12.422502 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vklx\" (UniqueName: \"kubernetes.io/projected/82f22257-bebd-4b1f-a92f-4c9d805367c2-kube-api-access-8vklx\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.422585 master-0 kubenswrapper[29612]: I0319 12:13:12.422551 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/82f22257-bebd-4b1f-a92f-4c9d805367c2-secret-metrics-client-certs\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.422617 master-0 kubenswrapper[29612]: I0319 12:13:12.422587 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82f22257-bebd-4b1f-a92f-4c9d805367c2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.429266 master-0 kubenswrapper[29612]: I0319 12:13:12.429230 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82f22257-bebd-4b1f-a92f-4c9d805367c2-client-ca-bundle\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.430177 master-0 kubenswrapper[29612]: I0319 12:13:12.430126 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"]
Mar 19 12:13:12.431743 master-0 kubenswrapper[29612]: I0319 12:13:12.431610 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/82f22257-bebd-4b1f-a92f-4c9d805367c2-secret-metrics-server-tls\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.437008 master-0 kubenswrapper[29612]: I0319 12:13:12.436921 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82f22257-bebd-4b1f-a92f-4c9d805367c2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.439542 master-0 kubenswrapper[29612]: I0319 12:13:12.439499 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/82f22257-bebd-4b1f-a92f-4c9d805367c2-metrics-server-audit-profiles\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.440097 master-0 kubenswrapper[29612]: I0319 12:13:12.440065 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/82f22257-bebd-4b1f-a92f-4c9d805367c2-secret-metrics-client-certs\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.444934 master-0 kubenswrapper[29612]: I0319 12:13:12.444899 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vklx\" (UniqueName: \"kubernetes.io/projected/82f22257-bebd-4b1f-a92f-4c9d805367c2-kube-api-access-8vklx\") pod \"metrics-server-6ddfc7c8d9-hftjw\" (UID: \"82f22257-bebd-4b1f-a92f-4c9d805367c2\") " pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.524127 master-0 kubenswrapper[29612]: I0319 12:13:12.523987 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gqjj\" (UniqueName: \"kubernetes.io/projected/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-kube-api-access-7gqjj\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.524127 master-0 kubenswrapper[29612]: I0319 12:13:12.524061 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.524127 master-0 kubenswrapper[29612]: I0319 12:13:12.524092 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.524127 master-0 kubenswrapper[29612]: I0319 12:13:12.524116 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-telemeter-client-tls\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.524468 master-0 kubenswrapper[29612]: I0319 12:13:12.524147 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-metrics-client-ca\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.524468 master-0 kubenswrapper[29612]: I0319 12:13:12.524165 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-serving-certs-ca-bundle\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.524468 master-0 kubenswrapper[29612]: I0319 12:13:12.524201 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-secret-telemeter-client\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.524468 master-0 kubenswrapper[29612]: I0319 12:13:12.524237 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-federate-client-tls\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.548559 master-0 kubenswrapper[29612]: I0319 12:13:12.543559 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"
Mar 19 12:13:12.625442 master-0 kubenswrapper[29612]: I0319 12:13:12.625370 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-secret-telemeter-client\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.625711 master-0 kubenswrapper[29612]: I0319 12:13:12.625473 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-federate-client-tls\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.625711 master-0 kubenswrapper[29612]: I0319 12:13:12.625516 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gqjj\" (UniqueName: \"kubernetes.io/projected/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-kube-api-access-7gqjj\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.625711 master-0 kubenswrapper[29612]: I0319 12:13:12.625576 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.625711 master-0 kubenswrapper[29612]: I0319 12:13:12.625611 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.625711 master-0 kubenswrapper[29612]: I0319 12:13:12.625664 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-telemeter-client-tls\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.625711 master-0 kubenswrapper[29612]: I0319 12:13:12.625693 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-metrics-client-ca\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.625711 master-0 kubenswrapper[29612]: I0319 12:13:12.625717 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-serving-certs-ca-bundle\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.627064 master-0 kubenswrapper[29612]: I0319 12:13:12.626863 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-serving-certs-ca-bundle\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.627403 master-0 kubenswrapper[29612]: I0319 12:13:12.627372 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-metrics-client-ca\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.627892 master-0 kubenswrapper[29612]: I0319 12:13:12.627853 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.629966 master-0 kubenswrapper[29612]: I0319 12:13:12.629927 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-federate-client-tls\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.631219 master-0 kubenswrapper[29612]: I0319 12:13:12.631185 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-secret-telemeter-client\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.632168 master-0 kubenswrapper[29612]: I0319 12:13:12.632128 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.633117 master-0 kubenswrapper[29612]: I0319 12:13:12.633094 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-telemeter-client-tls\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.642778 master-0 kubenswrapper[29612]: I0319 12:13:12.642727 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gqjj\" (UniqueName: \"kubernetes.io/projected/3b939bd4-2c55-4819-a6ef-1a79f91d49a7-kube-api-access-7gqjj\") pod \"telemeter-client-5b685db9f8-w7wzn\" (UID: \"3b939bd4-2c55-4819-a6ef-1a79f91d49a7\") " pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:12.789670 master-0 kubenswrapper[29612]: I0319 12:13:12.789544 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"
Mar 19 12:13:13.337608 master-0 kubenswrapper[29612]: I0319 12:13:13.337549 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:13:13.342420 master-0 kubenswrapper[29612]: I0319 12:13:13.342376 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") pod \"console-5c867fd8bf-nt5cs\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:13:13.474731 master-0 kubenswrapper[29612]: I0319 12:13:13.474679 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-mln6s"
Mar 19 12:13:13.483806 master-0 kubenswrapper[29612]: I0319 12:13:13.483733 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c867fd8bf-nt5cs"
Mar 19 12:13:15.785503 master-0 kubenswrapper[29612]: I0319 12:13:15.785433 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:13:15.790101 master-0 kubenswrapper[29612]: I0319 12:13:15.790062 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") pod \"console-9966f847b-gzmrl\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " pod="openshift-console/console-9966f847b-gzmrl"
Mar 19 12:13:16.026675 master-0 kubenswrapper[29612]: I0319 12:13:16.026603 29612 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-9966f847b-gzmrl" Mar 19 12:13:16.879210 master-0 kubenswrapper[29612]: I0319 12:13:16.879025 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-59bc46f698-rsj74"] Mar 19 12:13:16.883201 master-0 kubenswrapper[29612]: W0319 12:13:16.883104 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba84400d_9694_4ea6_87f3_ea29910b8904.slice/crio-67c53a2b3e17bbe87db295855f49d5775bf4d0e92e1481055d52ab3139977d02 WatchSource:0}: Error finding container 67c53a2b3e17bbe87db295855f49d5775bf4d0e92e1481055d52ab3139977d02: Status 404 returned error can't find the container with id 67c53a2b3e17bbe87db295855f49d5775bf4d0e92e1481055d52ab3139977d02 Mar 19 12:13:17.007094 master-0 kubenswrapper[29612]: I0319 12:13:17.007052 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw"] Mar 19 12:13:17.016058 master-0 kubenswrapper[29612]: I0319 12:13:17.016015 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9966f847b-gzmrl"] Mar 19 12:13:17.023118 master-0 kubenswrapper[29612]: I0319 12:13:17.023058 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c867fd8bf-nt5cs"] Mar 19 12:13:17.148796 master-0 kubenswrapper[29612]: I0319 12:13:17.148736 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5b685db9f8-w7wzn"] Mar 19 12:13:17.151983 master-0 kubenswrapper[29612]: W0319 12:13:17.151240 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b939bd4_2c55_4819_a6ef_1a79f91d49a7.slice/crio-20b61b990083828e1ac90c5610693022183ec19cb56c83c61264b644609912d9 WatchSource:0}: Error finding container 20b61b990083828e1ac90c5610693022183ec19cb56c83c61264b644609912d9: 
Status 404 returned error can't find the container with id 20b61b990083828e1ac90c5610693022183ec19cb56c83c61264b644609912d9 Mar 19 12:13:17.211440 master-0 kubenswrapper[29612]: I0319 12:13:17.211261 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw" event={"ID":"82f22257-bebd-4b1f-a92f-4c9d805367c2","Type":"ContainerStarted","Data":"e593499d8aec68e9dc85796cf5b9a805ac70a683270b3ad9a3b0b551399a2db2"} Mar 19 12:13:17.211440 master-0 kubenswrapper[29612]: I0319 12:13:17.211311 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw" event={"ID":"82f22257-bebd-4b1f-a92f-4c9d805367c2","Type":"ContainerStarted","Data":"8d0de8c4b6f892e12cd981b543d1e17fb2c0c5426f014b24c76daba9a4b8edbf"} Mar 19 12:13:17.212993 master-0 kubenswrapper[29612]: I0319 12:13:17.212950 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9966f847b-gzmrl" event={"ID":"493d3830-b0a6-4a3a-8048-0a6aed134ebf","Type":"ContainerStarted","Data":"eca6da7b88ba11ba6db8a5731c1c758b7cb8dab5164aa34c66e3439a2b03b88c"} Mar 19 12:13:17.213803 master-0 kubenswrapper[29612]: I0319 12:13:17.213639 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c867fd8bf-nt5cs" event={"ID":"7a14879e-b284-42e9-aea1-818ff991f2d8","Type":"ContainerStarted","Data":"c71e8c9d0e96d387f0effdeef0c1eaf1190bfb1f3bf31f89ea702dd75edbb251"} Mar 19 12:13:17.215101 master-0 kubenswrapper[29612]: I0319 12:13:17.215051 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74" event={"ID":"ba84400d-9694-4ea6-87f3-ea29910b8904","Type":"ContainerStarted","Data":"67c53a2b3e17bbe87db295855f49d5775bf4d0e92e1481055d52ab3139977d02"} Mar 19 12:13:17.216480 master-0 kubenswrapper[29612]: I0319 12:13:17.216438 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn" event={"ID":"3b939bd4-2c55-4819-a6ef-1a79f91d49a7","Type":"ContainerStarted","Data":"20b61b990083828e1ac90c5610693022183ec19cb56c83c61264b644609912d9"} Mar 19 12:13:17.219081 master-0 kubenswrapper[29612]: I0319 12:13:17.218780 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-66b8ffb895-9fpsq" event={"ID":"957aff58-940f-442b-8682-f8583562300a","Type":"ContainerStarted","Data":"9db3e2d42e725cbe414daa4b1be9dcbcb5e2e625fd4005603364431e56e78898"} Mar 19 12:13:17.221117 master-0 kubenswrapper[29612]: I0319 12:13:17.221090 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-66b8ffb895-9fpsq" Mar 19 12:13:17.221253 master-0 kubenswrapper[29612]: I0319 12:13:17.221172 29612 patch_prober.go:28] interesting pod/downloads-66b8ffb895-9fpsq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.97:8080/\": dial tcp 10.128.0.97:8080: connect: connection refused" start-of-body= Mar 19 12:13:17.221253 master-0 kubenswrapper[29612]: I0319 12:13:17.221201 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-9fpsq" podUID="957aff58-940f-442b-8682-f8583562300a" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.97:8080/\": dial tcp 10.128.0.97:8080: connect: connection refused" Mar 19 12:13:17.224916 master-0 kubenswrapper[29612]: I0319 12:13:17.224849 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" event={"ID":"4560e0cf-d401-4508-a8d8-3869ed30ed18","Type":"ContainerStarted","Data":"1971660b280bd0eb25c9f689cee9cc750f6518614ee8ed2aed577ed6c0ed4b85"} Mar 19 12:13:17.225188 master-0 kubenswrapper[29612]: I0319 12:13:17.225166 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:13:17.254759 master-0 kubenswrapper[29612]: I0319 12:13:17.254670 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw" podStartSLOduration=5.254626673 podStartE2EDuration="5.254626673s" podCreationTimestamp="2026-03-19 12:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:13:17.24504091 +0000 UTC m=+204.559209931" watchObservedRunningTime="2026-03-19 12:13:17.254626673 +0000 UTC m=+204.568795704" Mar 19 12:13:17.273839 master-0 kubenswrapper[29612]: I0319 12:13:17.273760 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-66b8ffb895-9fpsq" podStartSLOduration=1.431807988 podStartE2EDuration="39.273736048s" podCreationTimestamp="2026-03-19 12:12:38 +0000 UTC" firstStartedPulling="2026-03-19 12:12:38.98225417 +0000 UTC m=+166.296423191" lastFinishedPulling="2026-03-19 12:13:16.82418223 +0000 UTC m=+204.138351251" observedRunningTime="2026-03-19 12:13:17.265250646 +0000 UTC m=+204.579419667" watchObservedRunningTime="2026-03-19 12:13:17.273736048 +0000 UTC m=+204.587905089" Mar 19 12:13:17.279580 master-0 kubenswrapper[29612]: I0319 12:13:17.279478 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:13:17.289578 master-0 kubenswrapper[29612]: I0319 12:13:17.289509 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" podStartSLOduration=9.643775951 podStartE2EDuration="26.289492117s" podCreationTimestamp="2026-03-19 12:12:51 +0000 UTC" firstStartedPulling="2026-03-19 12:12:59.812995503 +0000 UTC m=+187.127164524" lastFinishedPulling="2026-03-19 12:13:16.458711669 +0000 UTC 
m=+203.772880690" observedRunningTime="2026-03-19 12:13:17.28676976 +0000 UTC m=+204.600938801" watchObservedRunningTime="2026-03-19 12:13:17.289492117 +0000 UTC m=+204.603661138" Mar 19 12:13:18.247145 master-0 kubenswrapper[29612]: I0319 12:13:18.246333 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b685db9f8-w7wzn_3b939bd4-2c55-4819-a6ef-1a79f91d49a7/telemeter-client/0.log" Mar 19 12:13:18.247145 master-0 kubenswrapper[29612]: I0319 12:13:18.246405 29612 generic.go:334] "Generic (PLEG): container finished" podID="3b939bd4-2c55-4819-a6ef-1a79f91d49a7" containerID="08bfebfc5cf2af09d2c5041da935d6cc072e4fd96164854f369a3c92e7536800" exitCode=1 Mar 19 12:13:18.247145 master-0 kubenswrapper[29612]: I0319 12:13:18.247346 29612 patch_prober.go:28] interesting pod/downloads-66b8ffb895-9fpsq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.97:8080/\": dial tcp 10.128.0.97:8080: connect: connection refused" start-of-body= Mar 19 12:13:18.248569 master-0 kubenswrapper[29612]: I0319 12:13:18.247407 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-9fpsq" podUID="957aff58-940f-442b-8682-f8583562300a" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.97:8080/\": dial tcp 10.128.0.97:8080: connect: connection refused" Mar 19 12:13:18.248569 master-0 kubenswrapper[29612]: I0319 12:13:18.247348 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn" event={"ID":"3b939bd4-2c55-4819-a6ef-1a79f91d49a7","Type":"ContainerStarted","Data":"dc1eead9be7158648a81628727625838c50bec49e4e5234e9da23e4ffd7c5efb"} Mar 19 12:13:18.248569 master-0 kubenswrapper[29612]: I0319 12:13:18.247961 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn" 
event={"ID":"3b939bd4-2c55-4819-a6ef-1a79f91d49a7","Type":"ContainerStarted","Data":"67f0934be3e4e4a2d5168f8519c04420f9cc7167e0361f83a63256dc7dc6c120"} Mar 19 12:13:18.248569 master-0 kubenswrapper[29612]: I0319 12:13:18.247979 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn" event={"ID":"3b939bd4-2c55-4819-a6ef-1a79f91d49a7","Type":"ContainerDied","Data":"08bfebfc5cf2af09d2c5041da935d6cc072e4fd96164854f369a3c92e7536800"} Mar 19 12:13:18.251188 master-0 kubenswrapper[29612]: I0319 12:13:18.251010 29612 scope.go:117] "RemoveContainer" containerID="08bfebfc5cf2af09d2c5041da935d6cc072e4fd96164854f369a3c92e7536800" Mar 19 12:13:18.568721 master-0 kubenswrapper[29612]: I0319 12:13:18.568446 29612 patch_prober.go:28] interesting pod/downloads-66b8ffb895-9fpsq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.128.0.97:8080/\": dial tcp 10.128.0.97:8080: connect: connection refused" start-of-body= Mar 19 12:13:18.568721 master-0 kubenswrapper[29612]: I0319 12:13:18.568518 29612 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-66b8ffb895-9fpsq" podUID="957aff58-940f-442b-8682-f8583562300a" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.97:8080/\": dial tcp 10.128.0.97:8080: connect: connection refused" Mar 19 12:13:18.569026 master-0 kubenswrapper[29612]: I0319 12:13:18.568754 29612 patch_prober.go:28] interesting pod/downloads-66b8ffb895-9fpsq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.97:8080/\": dial tcp 10.128.0.97:8080: connect: connection refused" start-of-body= Mar 19 12:13:18.569026 master-0 kubenswrapper[29612]: I0319 12:13:18.568853 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-9fpsq" podUID="957aff58-940f-442b-8682-f8583562300a" 
containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.97:8080/\": dial tcp 10.128.0.97:8080: connect: connection refused" Mar 19 12:13:19.270585 master-0 kubenswrapper[29612]: I0319 12:13:19.270529 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b685db9f8-w7wzn_3b939bd4-2c55-4819-a6ef-1a79f91d49a7/telemeter-client/0.log" Mar 19 12:13:19.271494 master-0 kubenswrapper[29612]: I0319 12:13:19.271463 29612 patch_prober.go:28] interesting pod/downloads-66b8ffb895-9fpsq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.97:8080/\": dial tcp 10.128.0.97:8080: connect: connection refused" start-of-body= Mar 19 12:13:19.271573 master-0 kubenswrapper[29612]: I0319 12:13:19.271509 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-66b8ffb895-9fpsq" podUID="957aff58-940f-442b-8682-f8583562300a" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.97:8080/\": dial tcp 10.128.0.97:8080: connect: connection refused" Mar 19 12:13:19.271631 master-0 kubenswrapper[29612]: I0319 12:13:19.271579 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn" event={"ID":"3b939bd4-2c55-4819-a6ef-1a79f91d49a7","Type":"ContainerStarted","Data":"428d6d7d95bed80e9283da6ff729941f2ee9b7a5398fd1877670023da3a806e6"} Mar 19 12:13:20.289085 master-0 kubenswrapper[29612]: I0319 12:13:20.289031 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b685db9f8-w7wzn_3b939bd4-2c55-4819-a6ef-1a79f91d49a7/telemeter-client/1.log" Mar 19 12:13:20.290727 master-0 kubenswrapper[29612]: I0319 12:13:20.290621 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b685db9f8-w7wzn_3b939bd4-2c55-4819-a6ef-1a79f91d49a7/telemeter-client/0.log" Mar 
19 12:13:20.290727 master-0 kubenswrapper[29612]: I0319 12:13:20.290692 29612 generic.go:334] "Generic (PLEG): container finished" podID="3b939bd4-2c55-4819-a6ef-1a79f91d49a7" containerID="428d6d7d95bed80e9283da6ff729941f2ee9b7a5398fd1877670023da3a806e6" exitCode=1 Mar 19 12:13:20.290870 master-0 kubenswrapper[29612]: I0319 12:13:20.290725 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn" event={"ID":"3b939bd4-2c55-4819-a6ef-1a79f91d49a7","Type":"ContainerDied","Data":"428d6d7d95bed80e9283da6ff729941f2ee9b7a5398fd1877670023da3a806e6"} Mar 19 12:13:20.290870 master-0 kubenswrapper[29612]: I0319 12:13:20.290764 29612 scope.go:117] "RemoveContainer" containerID="08bfebfc5cf2af09d2c5041da935d6cc072e4fd96164854f369a3c92e7536800" Mar 19 12:13:20.291275 master-0 kubenswrapper[29612]: I0319 12:13:20.291241 29612 scope.go:117] "RemoveContainer" containerID="428d6d7d95bed80e9283da6ff729941f2ee9b7a5398fd1877670023da3a806e6" Mar 19 12:13:20.291533 master-0 kubenswrapper[29612]: E0319 12:13:20.291501 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"telemeter-client\" with CrashLoopBackOff: \"back-off 10s restarting failed container=telemeter-client pod=telemeter-client-5b685db9f8-w7wzn_openshift-monitoring(3b939bd4-2c55-4819-a6ef-1a79f91d49a7)\"" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn" podUID="3b939bd4-2c55-4819-a6ef-1a79f91d49a7" Mar 19 12:13:21.296384 master-0 kubenswrapper[29612]: I0319 12:13:21.296318 29612 scope.go:117] "RemoveContainer" containerID="428d6d7d95bed80e9283da6ff729941f2ee9b7a5398fd1877670023da3a806e6" Mar 19 12:13:21.297239 master-0 kubenswrapper[29612]: E0319 12:13:21.296612 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"telemeter-client\" with CrashLoopBackOff: \"back-off 10s restarting failed container=telemeter-client 
pod=telemeter-client-5b685db9f8-w7wzn_openshift-monitoring(3b939bd4-2c55-4819-a6ef-1a79f91d49a7)\"" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn" podUID="3b939bd4-2c55-4819-a6ef-1a79f91d49a7" Mar 19 12:13:27.346800 master-0 kubenswrapper[29612]: I0319 12:13:27.346730 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b685db9f8-w7wzn_3b939bd4-2c55-4819-a6ef-1a79f91d49a7/telemeter-client/1.log" Mar 19 12:13:28.586511 master-0 kubenswrapper[29612]: I0319 12:13:28.586418 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-66b8ffb895-9fpsq" Mar 19 12:13:31.362754 master-0 kubenswrapper[29612]: I0319 12:13:31.362697 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c867fd8bf-nt5cs"] Mar 19 12:13:31.414209 master-0 kubenswrapper[29612]: I0319 12:13:31.414164 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74" event={"ID":"ba84400d-9694-4ea6-87f3-ea29910b8904","Type":"ContainerStarted","Data":"4f6c85acc6d56ff38ac0424a27b87165b5a0df8789fa929db7dc2676955faefb"} Mar 19 12:13:31.527928 master-0 kubenswrapper[29612]: I0319 12:13:31.527872 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-49r4b"] Mar 19 12:13:31.531563 master-0 kubenswrapper[29612]: I0319 12:13:31.531161 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-49r4b" Mar 19 12:13:31.550730 master-0 kubenswrapper[29612]: I0319 12:13:31.550298 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-jj7tw" Mar 19 12:13:31.550730 master-0 kubenswrapper[29612]: I0319 12:13:31.550463 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 12:13:31.553756 master-0 kubenswrapper[29612]: I0319 12:13:31.553246 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-85cf74699c-dx6hc"] Mar 19 12:13:31.554611 master-0 kubenswrapper[29612]: I0319 12:13:31.554094 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.620549 master-0 kubenswrapper[29612]: I0319 12:13:31.618092 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85cf74699c-dx6hc"] Mar 19 12:13:31.679190 master-0 kubenswrapper[29612]: I0319 12:13:31.679129 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzhbp\" (UniqueName: \"kubernetes.io/projected/f5e63326-e038-4f5d-b621-ae10091133be-kube-api-access-dzhbp\") pod \"node-ca-49r4b\" (UID: \"f5e63326-e038-4f5d-b621-ae10091133be\") " pod="openshift-image-registry/node-ca-49r4b" Mar 19 12:13:31.679388 master-0 kubenswrapper[29612]: I0319 12:13:31.679216 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-trusted-ca-bundle\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.679388 master-0 kubenswrapper[29612]: I0319 12:13:31.679249 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hft2\" (UniqueName: \"kubernetes.io/projected/dbc85eb5-feb1-46f8-b04c-e419c6300b18-kube-api-access-6hft2\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.679388 master-0 kubenswrapper[29612]: I0319 12:13:31.679288 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-service-ca\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.679388 master-0 kubenswrapper[29612]: I0319 12:13:31.679324 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-config\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.679388 master-0 kubenswrapper[29612]: I0319 12:13:31.679353 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-oauth-config\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.679545 master-0 kubenswrapper[29612]: I0319 12:13:31.679393 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5e63326-e038-4f5d-b621-ae10091133be-host\") pod \"node-ca-49r4b\" (UID: \"f5e63326-e038-4f5d-b621-ae10091133be\") " pod="openshift-image-registry/node-ca-49r4b" Mar 19 
12:13:31.679545 master-0 kubenswrapper[29612]: I0319 12:13:31.679422 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5e63326-e038-4f5d-b621-ae10091133be-serviceca\") pod \"node-ca-49r4b\" (UID: \"f5e63326-e038-4f5d-b621-ae10091133be\") " pod="openshift-image-registry/node-ca-49r4b" Mar 19 12:13:31.679545 master-0 kubenswrapper[29612]: I0319 12:13:31.679457 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-serving-cert\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.679545 master-0 kubenswrapper[29612]: I0319 12:13:31.679492 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-oauth-serving-cert\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.780763 master-0 kubenswrapper[29612]: I0319 12:13:31.780396 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5e63326-e038-4f5d-b621-ae10091133be-host\") pod \"node-ca-49r4b\" (UID: \"f5e63326-e038-4f5d-b621-ae10091133be\") " pod="openshift-image-registry/node-ca-49r4b" Mar 19 12:13:31.780763 master-0 kubenswrapper[29612]: I0319 12:13:31.780459 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5e63326-e038-4f5d-b621-ae10091133be-serviceca\") pod \"node-ca-49r4b\" (UID: \"f5e63326-e038-4f5d-b621-ae10091133be\") " pod="openshift-image-registry/node-ca-49r4b" 
Mar 19 12:13:31.780763 master-0 kubenswrapper[29612]: I0319 12:13:31.780572 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f5e63326-e038-4f5d-b621-ae10091133be-host\") pod \"node-ca-49r4b\" (UID: \"f5e63326-e038-4f5d-b621-ae10091133be\") " pod="openshift-image-registry/node-ca-49r4b" Mar 19 12:13:31.780763 master-0 kubenswrapper[29612]: I0319 12:13:31.780695 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-serving-cert\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.781721 master-0 kubenswrapper[29612]: I0319 12:13:31.781450 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f5e63326-e038-4f5d-b621-ae10091133be-serviceca\") pod \"node-ca-49r4b\" (UID: \"f5e63326-e038-4f5d-b621-ae10091133be\") " pod="openshift-image-registry/node-ca-49r4b" Mar 19 12:13:31.781721 master-0 kubenswrapper[29612]: I0319 12:13:31.781510 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-oauth-serving-cert\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.781721 master-0 kubenswrapper[29612]: I0319 12:13:31.781567 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzhbp\" (UniqueName: \"kubernetes.io/projected/f5e63326-e038-4f5d-b621-ae10091133be-kube-api-access-dzhbp\") pod \"node-ca-49r4b\" (UID: \"f5e63326-e038-4f5d-b621-ae10091133be\") " pod="openshift-image-registry/node-ca-49r4b" Mar 19 12:13:31.781721 master-0 
kubenswrapper[29612]: I0319 12:13:31.781678 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-trusted-ca-bundle\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.781721 master-0 kubenswrapper[29612]: I0319 12:13:31.781720 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hft2\" (UniqueName: \"kubernetes.io/projected/dbc85eb5-feb1-46f8-b04c-e419c6300b18-kube-api-access-6hft2\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.782031 master-0 kubenswrapper[29612]: I0319 12:13:31.781773 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-service-ca\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.782031 master-0 kubenswrapper[29612]: I0319 12:13:31.781830 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-config\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.782031 master-0 kubenswrapper[29612]: I0319 12:13:31.781874 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-oauth-config\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " 
pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.783243 master-0 kubenswrapper[29612]: I0319 12:13:31.783202 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-config\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.783937 master-0 kubenswrapper[29612]: I0319 12:13:31.783660 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-service-ca\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.783937 master-0 kubenswrapper[29612]: I0319 12:13:31.783733 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-trusted-ca-bundle\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.784355 master-0 kubenswrapper[29612]: I0319 12:13:31.784038 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-oauth-serving-cert\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.786589 master-0 kubenswrapper[29612]: I0319 12:13:31.786552 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-oauth-config\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " 
pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.788526 master-0 kubenswrapper[29612]: I0319 12:13:31.786766 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-serving-cert\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.804215 master-0 kubenswrapper[29612]: I0319 12:13:31.804158 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 12:13:31.808873 master-0 kubenswrapper[29612]: I0319 12:13:31.808816 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.816224 master-0 kubenswrapper[29612]: I0319 12:13:31.811929 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 12:13:31.816224 master-0 kubenswrapper[29612]: I0319 12:13:31.812142 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 12:13:31.816224 master-0 kubenswrapper[29612]: I0319 12:13:31.812890 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 19 12:13:31.824062 master-0 kubenswrapper[29612]: I0319 12:13:31.824015 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 19 12:13:31.824600 master-0 kubenswrapper[29612]: I0319 12:13:31.824575 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 12:13:31.826173 master-0 kubenswrapper[29612]: I0319 12:13:31.825171 29612 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 12:13:31.826173 master-0 kubenswrapper[29612]: I0319 12:13:31.825322 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 19 12:13:31.829964 master-0 kubenswrapper[29612]: I0319 12:13:31.829620 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzhbp\" (UniqueName: \"kubernetes.io/projected/f5e63326-e038-4f5d-b621-ae10091133be-kube-api-access-dzhbp\") pod \"node-ca-49r4b\" (UID: \"f5e63326-e038-4f5d-b621-ae10091133be\") " pod="openshift-image-registry/node-ca-49r4b" Mar 19 12:13:31.837990 master-0 kubenswrapper[29612]: I0319 12:13:31.834567 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hft2\" (UniqueName: \"kubernetes.io/projected/dbc85eb5-feb1-46f8-b04c-e419c6300b18-kube-api-access-6hft2\") pod \"console-85cf74699c-dx6hc\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.837990 master-0 kubenswrapper[29612]: I0319 12:13:31.836529 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 19 12:13:31.839687 master-0 kubenswrapper[29612]: I0319 12:13:31.839656 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 12:13:31.871853 master-0 kubenswrapper[29612]: I0319 12:13:31.871788 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-49r4b" Mar 19 12:13:31.883943 master-0 kubenswrapper[29612]: I0319 12:13:31.883888 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.884192 master-0 kubenswrapper[29612]: I0319 12:13:31.883981 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1d59b018-775c-44e4-a5f0-b11fac9fae2c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.884192 master-0 kubenswrapper[29612]: I0319 12:13:31.884007 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d59b018-775c-44e4-a5f0-b11fac9fae2c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.884192 master-0 kubenswrapper[29612]: I0319 12:13:31.884052 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.884192 master-0 kubenswrapper[29612]: I0319 12:13:31.884071 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d59b018-775c-44e4-a5f0-b11fac9fae2c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.884192 master-0 kubenswrapper[29612]: I0319 12:13:31.884096 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d59b018-775c-44e4-a5f0-b11fac9fae2c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.884192 master-0 kubenswrapper[29612]: I0319 12:13:31.884125 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zfpn\" (UniqueName: \"kubernetes.io/projected/1d59b018-775c-44e4-a5f0-b11fac9fae2c-kube-api-access-4zfpn\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.884192 master-0 kubenswrapper[29612]: I0319 12:13:31.884158 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.884192 master-0 kubenswrapper[29612]: I0319 12:13:31.884188 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-config-volume\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 
19 12:13:31.884876 master-0 kubenswrapper[29612]: I0319 12:13:31.884220 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d59b018-775c-44e4-a5f0-b11fac9fae2c-config-out\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.884876 master-0 kubenswrapper[29612]: I0319 12:13:31.884246 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.884876 master-0 kubenswrapper[29612]: I0319 12:13:31.884271 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-web-config\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.888834 master-0 kubenswrapper[29612]: I0319 12:13:31.888812 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:31.943650 master-0 kubenswrapper[29612]: I0319 12:13:31.943567 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 12:13:31.946905 master-0 kubenswrapper[29612]: I0319 12:13:31.946858 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.949934 master-0 kubenswrapper[29612]: I0319 12:13:31.949847 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-3m82ns9hegv5s" Mar 19 12:13:31.950234 master-0 kubenswrapper[29612]: I0319 12:13:31.950217 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 19 12:13:31.950359 master-0 kubenswrapper[29612]: I0319 12:13:31.950341 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 19 12:13:31.950937 master-0 kubenswrapper[29612]: I0319 12:13:31.950457 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 19 12:13:31.952872 master-0 kubenswrapper[29612]: I0319 12:13:31.951224 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 19 12:13:31.952872 master-0 kubenswrapper[29612]: I0319 12:13:31.951444 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 19 12:13:31.952872 master-0 kubenswrapper[29612]: I0319 12:13:31.951563 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 19 12:13:31.952872 master-0 kubenswrapper[29612]: I0319 12:13:31.951671 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 19 12:13:31.952872 master-0 kubenswrapper[29612]: I0319 12:13:31.951767 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 19 12:13:31.952872 master-0 kubenswrapper[29612]: I0319 12:13:31.952409 29612 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 19 12:13:31.964067 master-0 kubenswrapper[29612]: I0319 12:13:31.963875 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 19 12:13:31.970981 master-0 kubenswrapper[29612]: I0319 12:13:31.965291 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 19 12:13:31.970981 master-0 kubenswrapper[29612]: I0319 12:13:31.968159 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.984954 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06bc4932-418b-41e4-9a3c-00c8d184de32-config-out\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.984992 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d59b018-775c-44e4-a5f0-b11fac9fae2c-config-out\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985015 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/06bc4932-418b-41e4-9a3c-00c8d184de32-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985037 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985055 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-web-config\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985193 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985211 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/06bc4932-418b-41e4-9a3c-00c8d184de32-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985228 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985257 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985304 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985327 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985800 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1d59b018-775c-44e4-a5f0-b11fac9fae2c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985828 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d59b018-775c-44e4-a5f0-b11fac9fae2c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985874 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985899 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985920 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d59b018-775c-44e4-a5f0-b11fac9fae2c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985941 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985960 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.985980 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d59b018-775c-44e4-a5f0-b11fac9fae2c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.986005 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq42j\" (UniqueName: \"kubernetes.io/projected/06bc4932-418b-41e4-9a3c-00c8d184de32-kube-api-access-nq42j\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.986025 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zfpn\" (UniqueName: \"kubernetes.io/projected/1d59b018-775c-44e4-a5f0-b11fac9fae2c-kube-api-access-4zfpn\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.986094 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 
kubenswrapper[29612]: I0319 12:13:31.986159 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-config\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.986188 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.989178 master-0 kubenswrapper[29612]: I0319 12:13:31.986205 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.996491 master-0 kubenswrapper[29612]: I0319 12:13:31.992436 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-web-config\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.996491 master-0 kubenswrapper[29612]: I0319 12:13:31.992508 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" 
(UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.996491 master-0 kubenswrapper[29612]: I0319 12:13:31.992543 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-config-volume\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.996491 master-0 kubenswrapper[29612]: I0319 12:13:31.992560 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.996491 master-0 kubenswrapper[29612]: I0319 12:13:31.992580 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:31.996491 master-0 kubenswrapper[29612]: I0319 12:13:31.991502 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d59b018-775c-44e4-a5f0-b11fac9fae2c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.996491 master-0 kubenswrapper[29612]: I0319 12:13:31.990855 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/1d59b018-775c-44e4-a5f0-b11fac9fae2c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.996491 master-0 kubenswrapper[29612]: I0319 12:13:31.994385 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d59b018-775c-44e4-a5f0-b11fac9fae2c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:31.999876 master-0 kubenswrapper[29612]: I0319 12:13:31.998919 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:32.000415 master-0 kubenswrapper[29612]: I0319 12:13:32.000181 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-config-volume\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:32.001579 master-0 kubenswrapper[29612]: I0319 12:13:32.000553 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:32.002098 master-0 kubenswrapper[29612]: I0319 12:13:32.002050 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:32.003305 master-0 kubenswrapper[29612]: I0319 12:13:32.003090 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1d59b018-775c-44e4-a5f0-b11fac9fae2c-config-out\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:32.003305 master-0 kubenswrapper[29612]: I0319 12:13:32.003239 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-web-config\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:32.003854 master-0 kubenswrapper[29612]: I0319 12:13:32.003781 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1d59b018-775c-44e4-a5f0-b11fac9fae2c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:32.003918 master-0 kubenswrapper[29612]: I0319 12:13:32.003850 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1d59b018-775c-44e4-a5f0-b11fac9fae2c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:32.075893 master-0 kubenswrapper[29612]: I0319 12:13:32.072152 29612 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4zfpn\" (UniqueName: \"kubernetes.io/projected/1d59b018-775c-44e4-a5f0-b11fac9fae2c-kube-api-access-4zfpn\") pod \"alertmanager-main-0\" (UID: \"1d59b018-775c-44e4-a5f0-b11fac9fae2c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:32.093883 master-0 kubenswrapper[29612]: I0319 12:13:32.093824 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.093883 master-0 kubenswrapper[29612]: I0319 12:13:32.093869 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/06bc4932-418b-41e4-9a3c-00c8d184de32-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094188 master-0 kubenswrapper[29612]: I0319 12:13:32.093897 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094188 master-0 kubenswrapper[29612]: I0319 12:13:32.093922 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094393 master-0 kubenswrapper[29612]: I0319 12:13:32.094355 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094453 master-0 kubenswrapper[29612]: I0319 12:13:32.094432 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094506 master-0 kubenswrapper[29612]: I0319 12:13:32.094479 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094551 master-0 kubenswrapper[29612]: I0319 12:13:32.094505 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094551 master-0 kubenswrapper[29612]: I0319 12:13:32.094537 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq42j\" (UniqueName: \"kubernetes.io/projected/06bc4932-418b-41e4-9a3c-00c8d184de32-kube-api-access-nq42j\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 
12:13:32.094662 master-0 kubenswrapper[29612]: I0319 12:13:32.094576 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094662 master-0 kubenswrapper[29612]: I0319 12:13:32.094608 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-config\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094757 master-0 kubenswrapper[29612]: I0319 12:13:32.094664 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094757 master-0 kubenswrapper[29612]: I0319 12:13:32.094690 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-web-config\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094757 master-0 kubenswrapper[29612]: I0319 12:13:32.094734 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094878 master-0 kubenswrapper[29612]: I0319 12:13:32.094761 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094878 master-0 kubenswrapper[29612]: I0319 12:13:32.094784 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094878 master-0 kubenswrapper[29612]: I0319 12:13:32.094810 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06bc4932-418b-41e4-9a3c-00c8d184de32-config-out\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.094878 master-0 kubenswrapper[29612]: I0319 12:13:32.094836 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/06bc4932-418b-41e4-9a3c-00c8d184de32-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.095315 master-0 kubenswrapper[29612]: I0319 12:13:32.095272 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/06bc4932-418b-41e4-9a3c-00c8d184de32-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.097032 master-0 kubenswrapper[29612]: I0319 12:13:32.096977 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.097284 master-0 kubenswrapper[29612]: I0319 12:13:32.097249 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.103445 master-0 kubenswrapper[29612]: I0319 12:13:32.098458 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.107757 master-0 kubenswrapper[29612]: I0319 12:13:32.107339 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.108185 master-0 kubenswrapper[29612]: I0319 12:13:32.107759 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-web-config\") 
pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.108185 master-0 kubenswrapper[29612]: I0319 12:13:32.107847 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.108185 master-0 kubenswrapper[29612]: I0319 12:13:32.107931 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.120514 master-0 kubenswrapper[29612]: I0319 12:13:32.108358 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.120514 master-0 kubenswrapper[29612]: I0319 12:13:32.109083 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.120514 master-0 kubenswrapper[29612]: I0319 12:13:32.109624 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/06bc4932-418b-41e4-9a3c-00c8d184de32-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.120514 master-0 kubenswrapper[29612]: I0319 12:13:32.109752 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.120514 master-0 kubenswrapper[29612]: I0319 12:13:32.109916 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-config\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.120514 master-0 kubenswrapper[29612]: I0319 12:13:32.110175 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/06bc4932-418b-41e4-9a3c-00c8d184de32-config-out\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.120514 master-0 kubenswrapper[29612]: I0319 12:13:32.110607 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.120514 master-0 kubenswrapper[29612]: I0319 12:13:32.116721 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/06bc4932-418b-41e4-9a3c-00c8d184de32-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.136376 master-0 kubenswrapper[29612]: I0319 12:13:32.135053 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/06bc4932-418b-41e4-9a3c-00c8d184de32-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.153722 master-0 kubenswrapper[29612]: I0319 12:13:32.153480 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq42j\" (UniqueName: \"kubernetes.io/projected/06bc4932-418b-41e4-9a3c-00c8d184de32-kube-api-access-nq42j\") pod \"prometheus-k8s-0\" (UID: \"06bc4932-418b-41e4-9a3c-00c8d184de32\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.170202 master-0 kubenswrapper[29612]: I0319 12:13:32.170147 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 12:13:32.394618 master-0 kubenswrapper[29612]: I0319 12:13:32.394545 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:32.431659 master-0 kubenswrapper[29612]: I0319 12:13:32.431185 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85cf74699c-dx6hc"] Mar 19 12:13:32.436918 master-0 kubenswrapper[29612]: I0319 12:13:32.436732 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9966f847b-gzmrl_493d3830-b0a6-4a3a-8048-0a6aed134ebf/console/0.log" Mar 19 12:13:32.436918 master-0 kubenswrapper[29612]: I0319 12:13:32.436778 29612 generic.go:334] "Generic (PLEG): container finished" podID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" containerID="cc1debbf2eb1964f475500d179ac1133f88154be5dc3f3a5984ba2d33ce83143" exitCode=255 Mar 19 12:13:32.436918 master-0 kubenswrapper[29612]: I0319 12:13:32.436852 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9966f847b-gzmrl" event={"ID":"493d3830-b0a6-4a3a-8048-0a6aed134ebf","Type":"ContainerDied","Data":"cc1debbf2eb1964f475500d179ac1133f88154be5dc3f3a5984ba2d33ce83143"} Mar 19 12:13:32.440655 master-0 kubenswrapper[29612]: I0319 12:13:32.437261 29612 scope.go:117] "RemoveContainer" containerID="cc1debbf2eb1964f475500d179ac1133f88154be5dc3f3a5984ba2d33ce83143" Mar 19 12:13:32.440655 master-0 kubenswrapper[29612]: I0319 12:13:32.439079 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c867fd8bf-nt5cs_7a14879e-b284-42e9-aea1-818ff991f2d8/console/0.log" Mar 19 12:13:32.440655 master-0 kubenswrapper[29612]: I0319 12:13:32.439115 29612 generic.go:334] "Generic (PLEG): container finished" podID="7a14879e-b284-42e9-aea1-818ff991f2d8" containerID="ab0e6b929fba63aab45a3639a360b9b9fde7f20526c6965e527636f8add26ddb" exitCode=255 Mar 19 12:13:32.440655 master-0 kubenswrapper[29612]: I0319 12:13:32.439158 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c867fd8bf-nt5cs" 
event={"ID":"7a14879e-b284-42e9-aea1-818ff991f2d8","Type":"ContainerDied","Data":"ab0e6b929fba63aab45a3639a360b9b9fde7f20526c6965e527636f8add26ddb"} Mar 19 12:13:32.444860 master-0 kubenswrapper[29612]: W0319 12:13:32.443795 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbc85eb5_feb1_46f8_b04c_e419c6300b18.slice/crio-1e9ac01b466865efb385598ff516c93cd1e57e480992d03db6af5290d63f6658 WatchSource:0}: Error finding container 1e9ac01b466865efb385598ff516c93cd1e57e480992d03db6af5290d63f6658: Status 404 returned error can't find the container with id 1e9ac01b466865efb385598ff516c93cd1e57e480992d03db6af5290d63f6658 Mar 19 12:13:32.445691 master-0 kubenswrapper[29612]: I0319 12:13:32.445600 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74" event={"ID":"ba84400d-9694-4ea6-87f3-ea29910b8904","Type":"ContainerStarted","Data":"578f0a219e6c6c2dc3bc70c4e84a8aa1df5747d11944d9e1373ab27d2f5ab6c9"} Mar 19 12:13:32.445746 master-0 kubenswrapper[29612]: I0319 12:13:32.445697 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74" event={"ID":"ba84400d-9694-4ea6-87f3-ea29910b8904","Type":"ContainerStarted","Data":"9e79200a5c0212d80797d7470155d582625016c053e3d7ecbd003c26afe7abf4"} Mar 19 12:13:32.447918 master-0 kubenswrapper[29612]: I0319 12:13:32.447885 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-49r4b" event={"ID":"f5e63326-e038-4f5d-b621-ae10091133be","Type":"ContainerStarted","Data":"b9036994e050e540fa1626db1ab4deea7ff8e274407dd105653af04b0bb79107"} Mar 19 12:13:32.547969 master-0 kubenswrapper[29612]: I0319 12:13:32.547305 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw" Mar 19 12:13:32.547969 master-0 kubenswrapper[29612]: I0319 
12:13:32.547452 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw" Mar 19 12:13:32.614771 master-0 kubenswrapper[29612]: I0319 12:13:32.610171 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 12:13:32.632336 master-0 kubenswrapper[29612]: W0319 12:13:32.631430 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d59b018_775c_44e4_a5f0_b11fac9fae2c.slice/crio-dbdc2873391faf963a1e5e6d59f1565dd1f8dfda0c3a7f6c60117009bd3c4ab9 WatchSource:0}: Error finding container dbdc2873391faf963a1e5e6d59f1565dd1f8dfda0c3a7f6c60117009bd3c4ab9: Status 404 returned error can't find the container with id dbdc2873391faf963a1e5e6d59f1565dd1f8dfda0c3a7f6c60117009bd3c4ab9 Mar 19 12:13:32.817554 master-0 kubenswrapper[29612]: I0319 12:13:32.817166 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c867fd8bf-nt5cs_7a14879e-b284-42e9-aea1-818ff991f2d8/console/0.log" Mar 19 12:13:32.817554 master-0 kubenswrapper[29612]: I0319 12:13:32.817249 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c867fd8bf-nt5cs" Mar 19 12:13:32.914021 master-0 kubenswrapper[29612]: I0319 12:13:32.913970 29612 scope.go:117] "RemoveContainer" containerID="428d6d7d95bed80e9283da6ff729941f2ee9b7a5398fd1877670023da3a806e6" Mar 19 12:13:32.915710 master-0 kubenswrapper[29612]: I0319 12:13:32.914548 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctbzp\" (UniqueName: \"kubernetes.io/projected/7a14879e-b284-42e9-aea1-818ff991f2d8-kube-api-access-ctbzp\") pod \"7a14879e-b284-42e9-aea1-818ff991f2d8\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " Mar 19 12:13:32.915710 master-0 kubenswrapper[29612]: I0319 12:13:32.914581 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") pod \"7a14879e-b284-42e9-aea1-818ff991f2d8\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " Mar 19 12:13:32.915710 master-0 kubenswrapper[29612]: I0319 12:13:32.914788 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-oauth-serving-cert\") pod \"7a14879e-b284-42e9-aea1-818ff991f2d8\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " Mar 19 12:13:32.915710 master-0 kubenswrapper[29612]: I0319 12:13:32.914820 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-console-config\") pod \"7a14879e-b284-42e9-aea1-818ff991f2d8\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " Mar 19 12:13:32.915710 master-0 kubenswrapper[29612]: I0319 12:13:32.914840 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-service-ca\") pod \"7a14879e-b284-42e9-aea1-818ff991f2d8\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " Mar 19 12:13:32.915710 master-0 kubenswrapper[29612]: I0319 12:13:32.914881 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-oauth-config\") pod \"7a14879e-b284-42e9-aea1-818ff991f2d8\" (UID: \"7a14879e-b284-42e9-aea1-818ff991f2d8\") " Mar 19 12:13:32.915710 master-0 kubenswrapper[29612]: I0319 12:13:32.915314 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7a14879e-b284-42e9-aea1-818ff991f2d8" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:13:32.918586 master-0 kubenswrapper[29612]: I0319 12:13:32.918257 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7a14879e-b284-42e9-aea1-818ff991f2d8" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:13:32.918586 master-0 kubenswrapper[29612]: I0319 12:13:32.918502 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-service-ca" (OuterVolumeSpecName: "service-ca") pod "7a14879e-b284-42e9-aea1-818ff991f2d8" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:13:32.918712 master-0 kubenswrapper[29612]: I0319 12:13:32.918660 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7a14879e-b284-42e9-aea1-818ff991f2d8" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:13:32.922684 master-0 kubenswrapper[29612]: I0319 12:13:32.919043 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-console-config" (OuterVolumeSpecName: "console-config") pod "7a14879e-b284-42e9-aea1-818ff991f2d8" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:13:32.922684 master-0 kubenswrapper[29612]: I0319 12:13:32.919490 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a14879e-b284-42e9-aea1-818ff991f2d8-kube-api-access-ctbzp" (OuterVolumeSpecName: "kube-api-access-ctbzp") pod "7a14879e-b284-42e9-aea1-818ff991f2d8" (UID: "7a14879e-b284-42e9-aea1-818ff991f2d8"). InnerVolumeSpecName "kube-api-access-ctbzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:13:32.994061 master-0 kubenswrapper[29612]: I0319 12:13:32.993971 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 12:13:32.996961 master-0 kubenswrapper[29612]: W0319 12:13:32.996709 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06bc4932_418b_41e4_9a3c_00c8d184de32.slice/crio-030125b4116df4e3f7455926062c1c0df09a825013102a3311f0a5b4d0c4461b WatchSource:0}: Error finding container 030125b4116df4e3f7455926062c1c0df09a825013102a3311f0a5b4d0c4461b: Status 404 returned error can't find the container with id 030125b4116df4e3f7455926062c1c0df09a825013102a3311f0a5b4d0c4461b Mar 19 12:13:33.017136 master-0 kubenswrapper[29612]: I0319 12:13:33.017088 29612 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:33.017136 master-0 kubenswrapper[29612]: I0319 12:13:33.017132 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctbzp\" (UniqueName: \"kubernetes.io/projected/7a14879e-b284-42e9-aea1-818ff991f2d8-kube-api-access-ctbzp\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:33.017136 master-0 kubenswrapper[29612]: I0319 12:13:33.017142 29612 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a14879e-b284-42e9-aea1-818ff991f2d8-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:33.017136 master-0 kubenswrapper[29612]: I0319 12:13:33.017152 29612 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:33.017411 
master-0 kubenswrapper[29612]: I0319 12:13:33.017164 29612 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:33.017411 master-0 kubenswrapper[29612]: I0319 12:13:33.017173 29612 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a14879e-b284-42e9-aea1-818ff991f2d8-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:33.455822 master-0 kubenswrapper[29612]: I0319 12:13:33.455754 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85cf74699c-dx6hc" event={"ID":"dbc85eb5-feb1-46f8-b04c-e419c6300b18","Type":"ContainerStarted","Data":"9842570da6fc9209f93eeff83a6ad10196416f4ac3db591886b8075cef88bfb8"} Mar 19 12:13:33.455822 master-0 kubenswrapper[29612]: I0319 12:13:33.455819 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85cf74699c-dx6hc" event={"ID":"dbc85eb5-feb1-46f8-b04c-e419c6300b18","Type":"ContainerStarted","Data":"1e9ac01b466865efb385598ff516c93cd1e57e480992d03db6af5290d63f6658"} Mar 19 12:13:33.459963 master-0 kubenswrapper[29612]: I0319 12:13:33.459828 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"06bc4932-418b-41e4-9a3c-00c8d184de32","Type":"ContainerStarted","Data":"e1afabeebd32e7f30fc6c31140ced21aff13386109f965bf4968cd984a6c29b4"} Mar 19 12:13:33.459963 master-0 kubenswrapper[29612]: I0319 12:13:33.459908 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"06bc4932-418b-41e4-9a3c-00c8d184de32","Type":"ContainerStarted","Data":"030125b4116df4e3f7455926062c1c0df09a825013102a3311f0a5b4d0c4461b"} Mar 19 12:13:33.462565 master-0 kubenswrapper[29612]: I0319 12:13:33.462526 29612 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-9966f847b-gzmrl_493d3830-b0a6-4a3a-8048-0a6aed134ebf/console/0.log" Mar 19 12:13:33.462758 master-0 kubenswrapper[29612]: I0319 12:13:33.462611 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9966f847b-gzmrl" event={"ID":"493d3830-b0a6-4a3a-8048-0a6aed134ebf","Type":"ContainerStarted","Data":"106e6e717e0a07a215d277ed863ea494c396b3dcc8b25dc533922967305fec0a"} Mar 19 12:13:33.469705 master-0 kubenswrapper[29612]: I0319 12:13:33.469665 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c867fd8bf-nt5cs_7a14879e-b284-42e9-aea1-818ff991f2d8/console/0.log" Mar 19 12:13:33.469894 master-0 kubenswrapper[29612]: I0319 12:13:33.469818 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c867fd8bf-nt5cs" Mar 19 12:13:33.470421 master-0 kubenswrapper[29612]: I0319 12:13:33.470387 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c867fd8bf-nt5cs" event={"ID":"7a14879e-b284-42e9-aea1-818ff991f2d8","Type":"ContainerDied","Data":"c71e8c9d0e96d387f0effdeef0c1eaf1190bfb1f3bf31f89ea702dd75edbb251"} Mar 19 12:13:33.470493 master-0 kubenswrapper[29612]: I0319 12:13:33.470433 29612 scope.go:117] "RemoveContainer" containerID="ab0e6b929fba63aab45a3639a360b9b9fde7f20526c6965e527636f8add26ddb" Mar 19 12:13:33.475513 master-0 kubenswrapper[29612]: I0319 12:13:33.475449 29612 generic.go:334] "Generic (PLEG): container finished" podID="1d59b018-775c-44e4-a5f0-b11fac9fae2c" containerID="1bdc76e899b07db340f5df0ca9da845a1793de7356db1ac5bfeb59e65f74e552" exitCode=0 Mar 19 12:13:33.475651 master-0 kubenswrapper[29612]: I0319 12:13:33.475580 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"1d59b018-775c-44e4-a5f0-b11fac9fae2c","Type":"ContainerDied","Data":"1bdc76e899b07db340f5df0ca9da845a1793de7356db1ac5bfeb59e65f74e552"} Mar 19 12:13:33.475651 master-0 kubenswrapper[29612]: I0319 12:13:33.475612 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1d59b018-775c-44e4-a5f0-b11fac9fae2c","Type":"ContainerStarted","Data":"dbdc2873391faf963a1e5e6d59f1565dd1f8dfda0c3a7f6c60117009bd3c4ab9"} Mar 19 12:13:33.480881 master-0 kubenswrapper[29612]: I0319 12:13:33.480386 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5b685db9f8-w7wzn_3b939bd4-2c55-4819-a6ef-1a79f91d49a7/telemeter-client/1.log" Mar 19 12:13:33.483684 master-0 kubenswrapper[29612]: I0319 12:13:33.481397 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn" event={"ID":"3b939bd4-2c55-4819-a6ef-1a79f91d49a7","Type":"ContainerStarted","Data":"e775e91f0f9841669d2b07130ee2164e93ccaef98e7d71e295a26f1a2f83c5e2"} Mar 19 12:13:33.531484 master-0 kubenswrapper[29612]: I0319 12:13:33.531369 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85cf74699c-dx6hc" podStartSLOduration=2.531352959 podStartE2EDuration="2.531352959s" podCreationTimestamp="2026-03-19 12:13:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:13:33.527835489 +0000 UTC m=+220.842004520" watchObservedRunningTime="2026-03-19 12:13:33.531352959 +0000 UTC m=+220.845521980" Mar 19 12:13:33.566183 master-0 kubenswrapper[29612]: I0319 12:13:33.566098 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c867fd8bf-nt5cs"] Mar 19 12:13:33.577748 master-0 kubenswrapper[29612]: I0319 12:13:33.577592 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-5c867fd8bf-nt5cs"] Mar 19 12:13:33.599144 master-0 kubenswrapper[29612]: I0319 12:13:33.599059 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5b685db9f8-w7wzn" podStartSLOduration=21.599034098 podStartE2EDuration="21.599034098s" podCreationTimestamp="2026-03-19 12:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:13:33.592216404 +0000 UTC m=+220.906385425" watchObservedRunningTime="2026-03-19 12:13:33.599034098 +0000 UTC m=+220.913203119" Mar 19 12:13:33.699043 master-0 kubenswrapper[29612]: I0319 12:13:33.698961 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9966f847b-gzmrl" podStartSLOduration=36.363311868 podStartE2EDuration="50.698940687s" podCreationTimestamp="2026-03-19 12:12:43 +0000 UTC" firstStartedPulling="2026-03-19 12:13:16.993741115 +0000 UTC m=+204.307910136" lastFinishedPulling="2026-03-19 12:13:31.329369934 +0000 UTC m=+218.643538955" observedRunningTime="2026-03-19 12:13:33.692862483 +0000 UTC m=+221.007031504" watchObservedRunningTime="2026-03-19 12:13:33.698940687 +0000 UTC m=+221.013109698" Mar 19 12:13:34.493724 master-0 kubenswrapper[29612]: I0319 12:13:34.493661 29612 generic.go:334] "Generic (PLEG): container finished" podID="06bc4932-418b-41e4-9a3c-00c8d184de32" containerID="e1afabeebd32e7f30fc6c31140ced21aff13386109f965bf4968cd984a6c29b4" exitCode=0 Mar 19 12:13:34.495593 master-0 kubenswrapper[29612]: I0319 12:13:34.493733 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"06bc4932-418b-41e4-9a3c-00c8d184de32","Type":"ContainerDied","Data":"e1afabeebd32e7f30fc6c31140ced21aff13386109f965bf4968cd984a6c29b4"} Mar 19 12:13:34.499735 master-0 kubenswrapper[29612]: I0319 12:13:34.499679 29612 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74" event={"ID":"ba84400d-9694-4ea6-87f3-ea29910b8904","Type":"ContainerStarted","Data":"260f8373bc1682c6f30321f92f47af8bdf91e601087fcb0250771141a4af3bad"} Mar 19 12:13:34.499805 master-0 kubenswrapper[29612]: I0319 12:13:34.499739 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74" event={"ID":"ba84400d-9694-4ea6-87f3-ea29910b8904","Type":"ContainerStarted","Data":"adee25d3ff1211160f2cb668129b3f363668915cf731271b5fbc11f32718559e"} Mar 19 12:13:34.915426 master-0 kubenswrapper[29612]: I0319 12:13:34.915366 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a14879e-b284-42e9-aea1-818ff991f2d8" path="/var/lib/kubelet/pods/7a14879e-b284-42e9-aea1-818ff991f2d8/volumes" Mar 19 12:13:36.027431 master-0 kubenswrapper[29612]: I0319 12:13:36.027291 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9966f847b-gzmrl" Mar 19 12:13:36.027431 master-0 kubenswrapper[29612]: I0319 12:13:36.027367 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-9966f847b-gzmrl" Mar 19 12:13:36.030745 master-0 kubenswrapper[29612]: I0319 12:13:36.030670 29612 patch_prober.go:28] interesting pod/console-9966f847b-gzmrl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.99:8443/health\": dial tcp 10.128.0.99:8443: connect: connection refused" start-of-body= Mar 19 12:13:36.030894 master-0 kubenswrapper[29612]: I0319 12:13:36.030748 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-9966f847b-gzmrl" podUID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" containerName="console" probeResult="failure" output="Get \"https://10.128.0.99:8443/health\": dial tcp 10.128.0.99:8443: connect: connection refused" Mar 19 12:13:37.523265 master-0 
kubenswrapper[29612]: I0319 12:13:37.523189 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74" event={"ID":"ba84400d-9694-4ea6-87f3-ea29910b8904","Type":"ContainerStarted","Data":"40dd12361b7df3ace0aa0bef21f78ae1e4498559b22cb98b5befe34f4ad0de76"} Mar 19 12:13:38.529075 master-0 kubenswrapper[29612]: I0319 12:13:38.529027 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74" Mar 19 12:13:38.536854 master-0 kubenswrapper[29612]: I0319 12:13:38.536813 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74" Mar 19 12:13:40.157850 master-0 kubenswrapper[29612]: I0319 12:13:40.154187 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-59bc46f698-rsj74" podStartSLOduration=14.713889754 podStartE2EDuration="31.154161444s" podCreationTimestamp="2026-03-19 12:13:09 +0000 UTC" firstStartedPulling="2026-03-19 12:13:16.886451155 +0000 UTC m=+204.200620176" lastFinishedPulling="2026-03-19 12:13:33.326722845 +0000 UTC m=+220.640891866" observedRunningTime="2026-03-19 12:13:40.148690898 +0000 UTC m=+227.462859949" watchObservedRunningTime="2026-03-19 12:13:40.154161444 +0000 UTC m=+227.468330475" Mar 19 12:13:40.368102 master-0 kubenswrapper[29612]: I0319 12:13:40.366400 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9966f847b-gzmrl"] Mar 19 12:13:40.466548 master-0 kubenswrapper[29612]: I0319 12:13:40.466402 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-bfc49dbf9-mppbq"] Mar 19 12:13:40.466856 master-0 kubenswrapper[29612]: E0319 12:13:40.466818 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a14879e-b284-42e9-aea1-818ff991f2d8" containerName="console" Mar 19 12:13:40.466856 master-0 
kubenswrapper[29612]: I0319 12:13:40.466844 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a14879e-b284-42e9-aea1-818ff991f2d8" containerName="console" Mar 19 12:13:40.467427 master-0 kubenswrapper[29612]: I0319 12:13:40.467083 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a14879e-b284-42e9-aea1-818ff991f2d8" containerName="console" Mar 19 12:13:40.467751 master-0 kubenswrapper[29612]: I0319 12:13:40.467687 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.489347 master-0 kubenswrapper[29612]: I0319 12:13:40.489263 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bfc49dbf9-mppbq"] Mar 19 12:13:40.563379 master-0 kubenswrapper[29612]: I0319 12:13:40.563292 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-oauth-config\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.564170 master-0 kubenswrapper[29612]: I0319 12:13:40.563718 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-oauth-serving-cert\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.564170 master-0 kubenswrapper[29612]: I0319 12:13:40.563913 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-trusted-ca-bundle\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " 
pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.564170 master-0 kubenswrapper[29612]: I0319 12:13:40.564050 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-serving-cert\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.564170 master-0 kubenswrapper[29612]: I0319 12:13:40.564125 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-service-ca\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.564465 master-0 kubenswrapper[29612]: I0319 12:13:40.564239 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw9p9\" (UniqueName: \"kubernetes.io/projected/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-kube-api-access-kw9p9\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.564465 master-0 kubenswrapper[29612]: I0319 12:13:40.564381 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-config\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.669403 master-0 kubenswrapper[29612]: I0319 12:13:40.666516 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-serving-cert\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.669403 master-0 kubenswrapper[29612]: I0319 12:13:40.666597 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-service-ca\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.669403 master-0 kubenswrapper[29612]: I0319 12:13:40.666627 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw9p9\" (UniqueName: \"kubernetes.io/projected/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-kube-api-access-kw9p9\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.669403 master-0 kubenswrapper[29612]: I0319 12:13:40.666763 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-config\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.669403 master-0 kubenswrapper[29612]: I0319 12:13:40.666854 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-oauth-config\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.669403 master-0 kubenswrapper[29612]: I0319 12:13:40.666922 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-oauth-serving-cert\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.669403 master-0 kubenswrapper[29612]: I0319 12:13:40.667091 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-trusted-ca-bundle\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.669403 master-0 kubenswrapper[29612]: I0319 12:13:40.668034 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-service-ca\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.669403 master-0 kubenswrapper[29612]: I0319 12:13:40.668976 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-config\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.670302 master-0 kubenswrapper[29612]: I0319 12:13:40.670065 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-oauth-serving-cert\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.671542 master-0 kubenswrapper[29612]: I0319 12:13:40.671488 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-trusted-ca-bundle\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.671857 master-0 kubenswrapper[29612]: I0319 12:13:40.671832 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-oauth-config\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.672374 master-0 kubenswrapper[29612]: I0319 12:13:40.672286 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-serving-cert\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.698367 master-0 kubenswrapper[29612]: I0319 12:13:40.698053 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw9p9\" (UniqueName: \"kubernetes.io/projected/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-kube-api-access-kw9p9\") pod \"console-bfc49dbf9-mppbq\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") " pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:40.813409 master-0 kubenswrapper[29612]: I0319 12:13:40.812889 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:41.265546 master-0 kubenswrapper[29612]: I0319 12:13:41.265441 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bfc49dbf9-mppbq"] Mar 19 12:13:41.555098 master-0 kubenswrapper[29612]: I0319 12:13:41.554966 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfc49dbf9-mppbq" event={"ID":"9059b576-a287-4e13-9ae3-bb6ff70bc5b9","Type":"ContainerStarted","Data":"970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40"} Mar 19 12:13:41.555098 master-0 kubenswrapper[29612]: I0319 12:13:41.555029 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfc49dbf9-mppbq" event={"ID":"9059b576-a287-4e13-9ae3-bb6ff70bc5b9","Type":"ContainerStarted","Data":"8cd7f53bd5210c62b82fd304e867fc2e55c645afc3b1f06b30acfcf2ae75aacd"} Mar 19 12:13:41.557743 master-0 kubenswrapper[29612]: I0319 12:13:41.557687 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-49r4b" event={"ID":"f5e63326-e038-4f5d-b621-ae10091133be","Type":"ContainerStarted","Data":"22e3fffb3089d1a9ff496a6559d72588d701dbdd328747dc8cd2b37e71bff473"} Mar 19 12:13:41.585032 master-0 kubenswrapper[29612]: I0319 12:13:41.584956 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bfc49dbf9-mppbq" podStartSLOduration=1.584907238 podStartE2EDuration="1.584907238s" podCreationTimestamp="2026-03-19 12:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:13:41.582826118 +0000 UTC m=+228.896995139" watchObservedRunningTime="2026-03-19 12:13:41.584907238 +0000 UTC m=+228.899076259" Mar 19 12:13:41.602372 master-0 kubenswrapper[29612]: I0319 12:13:41.602298 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-49r4b" podStartSLOduration=2.437008848 podStartE2EDuration="10.602282073s" podCreationTimestamp="2026-03-19 12:13:31 +0000 UTC" firstStartedPulling="2026-03-19 12:13:31.984192875 +0000 UTC m=+219.298361896" lastFinishedPulling="2026-03-19 12:13:40.1494661 +0000 UTC m=+227.463635121" observedRunningTime="2026-03-19 12:13:41.599159994 +0000 UTC m=+228.913329015" watchObservedRunningTime="2026-03-19 12:13:41.602282073 +0000 UTC m=+228.916451084" Mar 19 12:13:41.890670 master-0 kubenswrapper[29612]: I0319 12:13:41.890491 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:41.890670 master-0 kubenswrapper[29612]: I0319 12:13:41.890552 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:13:41.893743 master-0 kubenswrapper[29612]: I0319 12:13:41.893672 29612 patch_prober.go:28] interesting pod/console-85cf74699c-dx6hc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" start-of-body= Mar 19 12:13:41.893909 master-0 kubenswrapper[29612]: I0319 12:13:41.893761 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-85cf74699c-dx6hc" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Mar 19 12:13:42.252479 master-0 kubenswrapper[29612]: I0319 12:13:42.249687 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" podUID="4560e0cf-d401-4508-a8d8-3869ed30ed18" containerName="oauth-openshift" containerID="cri-o://1971660b280bd0eb25c9f689cee9cc750f6518614ee8ed2aed577ed6c0ed4b85" gracePeriod=15 
Mar 19 12:13:42.567430 master-0 kubenswrapper[29612]: I0319 12:13:42.567242 29612 generic.go:334] "Generic (PLEG): container finished" podID="4560e0cf-d401-4508-a8d8-3869ed30ed18" containerID="1971660b280bd0eb25c9f689cee9cc750f6518614ee8ed2aed577ed6c0ed4b85" exitCode=0 Mar 19 12:13:42.567430 master-0 kubenswrapper[29612]: I0319 12:13:42.567297 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" event={"ID":"4560e0cf-d401-4508-a8d8-3869ed30ed18","Type":"ContainerDied","Data":"1971660b280bd0eb25c9f689cee9cc750f6518614ee8ed2aed577ed6c0ed4b85"} Mar 19 12:13:45.354408 master-0 kubenswrapper[29612]: I0319 12:13:45.354154 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:13:45.454049 master-0 kubenswrapper[29612]: I0319 12:13:45.453951 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-serving-cert\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454049 master-0 kubenswrapper[29612]: I0319 12:13:45.454048 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-policies\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454377 master-0 kubenswrapper[29612]: I0319 12:13:45.454114 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-login\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") 
" Mar 19 12:13:45.454377 master-0 kubenswrapper[29612]: I0319 12:13:45.454152 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454377 master-0 kubenswrapper[29612]: I0319 12:13:45.454179 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-router-certs\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454377 master-0 kubenswrapper[29612]: I0319 12:13:45.454199 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-service-ca\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454377 master-0 kubenswrapper[29612]: I0319 12:13:45.454215 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454377 master-0 kubenswrapper[29612]: I0319 12:13:45.454234 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-dir\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454377 master-0 kubenswrapper[29612]: 
I0319 12:13:45.454261 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-provider-selection\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454377 master-0 kubenswrapper[29612]: I0319 12:13:45.454341 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjpnl\" (UniqueName: \"kubernetes.io/projected/4560e0cf-d401-4508-a8d8-3869ed30ed18-kube-api-access-zjpnl\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454377 master-0 kubenswrapper[29612]: I0319 12:13:45.454374 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-error\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454828 master-0 kubenswrapper[29612]: I0319 12:13:45.454414 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-trusted-ca-bundle\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 12:13:45.454828 master-0 kubenswrapper[29612]: I0319 12:13:45.454452 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-ocp-branding-template\") pod \"4560e0cf-d401-4508-a8d8-3869ed30ed18\" (UID: \"4560e0cf-d401-4508-a8d8-3869ed30ed18\") " Mar 19 
12:13:45.455023 master-0 kubenswrapper[29612]: I0319 12:13:45.454966 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:13:45.455089 master-0 kubenswrapper[29612]: I0319 12:13:45.454972 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:13:45.455132 master-0 kubenswrapper[29612]: I0319 12:13:45.455069 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:13:45.455623 master-0 kubenswrapper[29612]: I0319 12:13:45.455590 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:13:45.455900 master-0 kubenswrapper[29612]: I0319 12:13:45.455837 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:13:45.457588 master-0 kubenswrapper[29612]: I0319 12:13:45.457551 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:13:45.457696 master-0 kubenswrapper[29612]: I0319 12:13:45.457595 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:13:45.458058 master-0 kubenswrapper[29612]: I0319 12:13:45.458018 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:13:45.458216 master-0 kubenswrapper[29612]: I0319 12:13:45.458146 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:13:45.458216 master-0 kubenswrapper[29612]: I0319 12:13:45.458189 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:13:45.458737 master-0 kubenswrapper[29612]: I0319 12:13:45.458690 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:13:45.458966 master-0 kubenswrapper[29612]: I0319 12:13:45.458914 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:13:45.459021 master-0 kubenswrapper[29612]: I0319 12:13:45.458987 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4560e0cf-d401-4508-a8d8-3869ed30ed18-kube-api-access-zjpnl" (OuterVolumeSpecName: "kube-api-access-zjpnl") pod "4560e0cf-d401-4508-a8d8-3869ed30ed18" (UID: "4560e0cf-d401-4508-a8d8-3869ed30ed18"). InnerVolumeSpecName "kube-api-access-zjpnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:13:45.556180 master-0 kubenswrapper[29612]: I0319 12:13:45.556034 29612 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556180 master-0 kubenswrapper[29612]: I0319 12:13:45.556086 29612 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556180 master-0 kubenswrapper[29612]: I0319 12:13:45.556105 29612 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-serving-cert\") on node \"master-0\" 
DevicePath \"\"" Mar 19 12:13:45.556180 master-0 kubenswrapper[29612]: I0319 12:13:45.556116 29612 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556180 master-0 kubenswrapper[29612]: I0319 12:13:45.556128 29612 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556180 master-0 kubenswrapper[29612]: I0319 12:13:45.556140 29612 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556180 master-0 kubenswrapper[29612]: I0319 12:13:45.556157 29612 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556180 master-0 kubenswrapper[29612]: I0319 12:13:45.556175 29612 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556180 master-0 kubenswrapper[29612]: I0319 12:13:45.556188 29612 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556631 master-0 kubenswrapper[29612]: I0319 12:13:45.556201 29612 reconciler_common.go:293] "Volume 
detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4560e0cf-d401-4508-a8d8-3869ed30ed18-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556631 master-0 kubenswrapper[29612]: I0319 12:13:45.556212 29612 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556631 master-0 kubenswrapper[29612]: I0319 12:13:45.556221 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjpnl\" (UniqueName: \"kubernetes.io/projected/4560e0cf-d401-4508-a8d8-3869ed30ed18-kube-api-access-zjpnl\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.556631 master-0 kubenswrapper[29612]: I0319 12:13:45.556233 29612 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4560e0cf-d401-4508-a8d8-3869ed30ed18-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 19 12:13:45.589627 master-0 kubenswrapper[29612]: I0319 12:13:45.589569 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" event={"ID":"4560e0cf-d401-4508-a8d8-3869ed30ed18","Type":"ContainerDied","Data":"91f80a13c9cca1e264585bceb12112c9de2785e238c83ec620b4cf692f9cd4a5"} Mar 19 12:13:45.589627 master-0 kubenswrapper[29612]: I0319 12:13:45.589593 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-8dc697b97-m8zhf" Mar 19 12:13:45.589627 master-0 kubenswrapper[29612]: I0319 12:13:45.589652 29612 scope.go:117] "RemoveContainer" containerID="1971660b280bd0eb25c9f689cee9cc750f6518614ee8ed2aed577ed6c0ed4b85" Mar 19 12:13:45.592162 master-0 kubenswrapper[29612]: I0319 12:13:45.592114 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1d59b018-775c-44e4-a5f0-b11fac9fae2c","Type":"ContainerStarted","Data":"83e6130e3872356e35cddc9754d2201ae28c5d3e27aa04636dacd679b8361596"} Mar 19 12:13:46.604966 master-0 kubenswrapper[29612]: I0319 12:13:46.604910 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1d59b018-775c-44e4-a5f0-b11fac9fae2c","Type":"ContainerStarted","Data":"486208f2a2382a4327ea201f4044f0ca574ab3d6dd44920aaa0e67ae162bbcf0"} Mar 19 12:13:47.142037 master-0 kubenswrapper[29612]: I0319 12:13:47.141707 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-54d6758664-htjsj"] Mar 19 12:13:47.142192 master-0 kubenswrapper[29612]: E0319 12:13:47.142152 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4560e0cf-d401-4508-a8d8-3869ed30ed18" containerName="oauth-openshift" Mar 19 12:13:47.142192 master-0 kubenswrapper[29612]: I0319 12:13:47.142174 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="4560e0cf-d401-4508-a8d8-3869ed30ed18" containerName="oauth-openshift" Mar 19 12:13:47.142499 master-0 kubenswrapper[29612]: I0319 12:13:47.142473 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="4560e0cf-d401-4508-a8d8-3869ed30ed18" containerName="oauth-openshift" Mar 19 12:13:47.144711 master-0 kubenswrapper[29612]: I0319 12:13:47.143095 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.147744 master-0 kubenswrapper[29612]: I0319 12:13:47.146862 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 12:13:47.153808 master-0 kubenswrapper[29612]: I0319 12:13:47.152957 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 12:13:47.153808 master-0 kubenswrapper[29612]: I0319 12:13:47.153364 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 12:13:47.159700 master-0 kubenswrapper[29612]: I0319 12:13:47.159471 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 12:13:47.159700 master-0 kubenswrapper[29612]: I0319 12:13:47.159525 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-nbxfb" Mar 19 12:13:47.159973 master-0 kubenswrapper[29612]: I0319 12:13:47.159843 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 12:13:47.159973 master-0 kubenswrapper[29612]: I0319 12:13:47.159950 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 12:13:47.162050 master-0 kubenswrapper[29612]: I0319 12:13:47.160332 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 12:13:47.162050 master-0 kubenswrapper[29612]: I0319 12:13:47.160722 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 12:13:47.165064 master-0 kubenswrapper[29612]: I0319 12:13:47.164924 29612 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 12:13:47.168225 master-0 kubenswrapper[29612]: I0319 12:13:47.168177 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 12:13:47.168366 master-0 kubenswrapper[29612]: I0319 12:13:47.168198 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 12:13:47.170951 master-0 kubenswrapper[29612]: I0319 12:13:47.170413 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 12:13:47.173795 master-0 kubenswrapper[29612]: I0319 12:13:47.173709 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 12:13:47.179090 master-0 kubenswrapper[29612]: I0319 12:13:47.179044 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brv2j\" (UniqueName: \"kubernetes.io/projected/d18704d6-f988-4694-9cfd-a0945d8d246b-kube-api-access-brv2j\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179176 master-0 kubenswrapper[29612]: I0319 12:13:47.179102 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179176 master-0 kubenswrapper[29612]: I0319 12:13:47.179130 29612 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-user-template-login\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179176 master-0 kubenswrapper[29612]: I0319 12:13:47.179154 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-audit-policies\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179376 master-0 kubenswrapper[29612]: I0319 12:13:47.179208 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d18704d6-f988-4694-9cfd-a0945d8d246b-audit-dir\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179376 master-0 kubenswrapper[29612]: I0319 12:13:47.179226 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-session\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179376 master-0 kubenswrapper[29612]: I0319 12:13:47.179248 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179376 master-0 kubenswrapper[29612]: I0319 12:13:47.179271 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179376 master-0 kubenswrapper[29612]: I0319 12:13:47.179300 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179703 master-0 kubenswrapper[29612]: I0319 12:13:47.179428 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-router-certs\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179703 master-0 kubenswrapper[29612]: I0319 12:13:47.179501 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179703 master-0 kubenswrapper[29612]: I0319 12:13:47.179535 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-service-ca\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.179703 master-0 kubenswrapper[29612]: I0319 12:13:47.179685 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-user-template-error\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.281663 master-0 kubenswrapper[29612]: I0319 12:13:47.281565 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.281953 master-0 kubenswrapper[29612]: I0319 12:13:47.281671 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.281953 master-0 kubenswrapper[29612]: I0319 12:13:47.281734 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.281953 master-0 kubenswrapper[29612]: I0319 12:13:47.281774 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-service-ca\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.281953 master-0 kubenswrapper[29612]: I0319 12:13:47.281837 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-user-template-error\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.281953 master-0 kubenswrapper[29612]: I0319 12:13:47.281876 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brv2j\" (UniqueName: \"kubernetes.io/projected/d18704d6-f988-4694-9cfd-a0945d8d246b-kube-api-access-brv2j\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.282170 master-0 
kubenswrapper[29612]: I0319 12:13:47.282019 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.282170 master-0 kubenswrapper[29612]: I0319 12:13:47.282052 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-user-template-login\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.282170 master-0 kubenswrapper[29612]: I0319 12:13:47.282087 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-audit-policies\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.282296 master-0 kubenswrapper[29612]: I0319 12:13:47.282116 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d18704d6-f988-4694-9cfd-a0945d8d246b-audit-dir\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.282296 master-0 kubenswrapper[29612]: I0319 12:13:47.282232 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-session\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.282296 master-0 kubenswrapper[29612]: I0319 12:13:47.282264 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.282412 master-0 kubenswrapper[29612]: I0319 12:13:47.282295 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.283684 master-0 kubenswrapper[29612]: I0319 12:13:47.282719 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-service-ca\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.283684 master-0 kubenswrapper[29612]: I0319 12:13:47.283119 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d18704d6-f988-4694-9cfd-a0945d8d246b-audit-dir\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " 
pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.283684 master-0 kubenswrapper[29612]: I0319 12:13:47.283124 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-audit-policies\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.283684 master-0 kubenswrapper[29612]: I0319 12:13:47.283463 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.284210 master-0 kubenswrapper[29612]: I0319 12:13:47.284165 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.285990 master-0 kubenswrapper[29612]: I0319 12:13:47.285942 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-user-template-error\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.286058 master-0 kubenswrapper[29612]: I0319 12:13:47.286028 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.286214 master-0 kubenswrapper[29612]: I0319 12:13:47.286179 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.286518 master-0 kubenswrapper[29612]: I0319 12:13:47.286477 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-router-certs\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.286572 master-0 kubenswrapper[29612]: I0319 12:13:47.286517 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-session\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.287905 master-0 kubenswrapper[29612]: I0319 12:13:47.287883 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.288272 master-0 kubenswrapper[29612]: I0319 12:13:47.288230 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d18704d6-f988-4694-9cfd-a0945d8d246b-v4-0-config-user-template-login\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.553811 master-0 kubenswrapper[29612]: I0319 12:13:47.553431 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54d6758664-htjsj"] Mar 19 12:13:47.577671 master-0 kubenswrapper[29612]: I0319 12:13:47.569041 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brv2j\" (UniqueName: \"kubernetes.io/projected/d18704d6-f988-4694-9cfd-a0945d8d246b-kube-api-access-brv2j\") pod \"oauth-openshift-54d6758664-htjsj\" (UID: \"d18704d6-f988-4694-9cfd-a0945d8d246b\") " pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:47.577671 master-0 kubenswrapper[29612]: I0319 12:13:47.569814 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-8dc697b97-m8zhf"] Mar 19 12:13:47.577671 master-0 kubenswrapper[29612]: I0319 12:13:47.577042 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-8dc697b97-m8zhf"] Mar 19 12:13:47.638779 master-0 kubenswrapper[29612]: I0319 12:13:47.637367 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1d59b018-775c-44e4-a5f0-b11fac9fae2c","Type":"ContainerStarted","Data":"c608e44ace26ca592773c8ddf3b09e35ea02614ffd45d6d0f0d916b8e733410d"} Mar 19 12:13:47.781040 master-0 
kubenswrapper[29612]: I0319 12:13:47.780920 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:48.303821 master-0 kubenswrapper[29612]: W0319 12:13:48.303403 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18704d6_f988_4694_9cfd_a0945d8d246b.slice/crio-8dc49cc349e0a7672494299924e1454fc12b51fb2c0d08c4cb40c2bf0781f050 WatchSource:0}: Error finding container 8dc49cc349e0a7672494299924e1454fc12b51fb2c0d08c4cb40c2bf0781f050: Status 404 returned error can't find the container with id 8dc49cc349e0a7672494299924e1454fc12b51fb2c0d08c4cb40c2bf0781f050 Mar 19 12:13:48.305086 master-0 kubenswrapper[29612]: I0319 12:13:48.305026 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-54d6758664-htjsj"] Mar 19 12:13:48.649393 master-0 kubenswrapper[29612]: I0319 12:13:48.649344 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"06bc4932-418b-41e4-9a3c-00c8d184de32","Type":"ContainerStarted","Data":"6551de6ff8b3a46a855da63b1a51b077fed22ad27b721986e45d4589d2e898f9"} Mar 19 12:13:48.649393 master-0 kubenswrapper[29612]: I0319 12:13:48.649399 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"06bc4932-418b-41e4-9a3c-00c8d184de32","Type":"ContainerStarted","Data":"7dddcbb074cf799a720a381399c57360ac3b6bd650c9eacdc4bbebce25f5e235"} Mar 19 12:13:48.650082 master-0 kubenswrapper[29612]: I0319 12:13:48.649410 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"06bc4932-418b-41e4-9a3c-00c8d184de32","Type":"ContainerStarted","Data":"3f01a9cd869a5102f9f9f60bbce1db7d284029e24b476840255d44473a14f440"} Mar 19 12:13:48.650082 master-0 kubenswrapper[29612]: I0319 12:13:48.649419 
29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"06bc4932-418b-41e4-9a3c-00c8d184de32","Type":"ContainerStarted","Data":"c14714f3b1c11c472587648ffb69c6d1e27549357049d808b49dc03073d19e20"} Mar 19 12:13:48.650082 master-0 kubenswrapper[29612]: I0319 12:13:48.649428 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"06bc4932-418b-41e4-9a3c-00c8d184de32","Type":"ContainerStarted","Data":"c92e0c12ded08b073512e9828807414cfb3c9d56f5805b1f2176c51811654f80"} Mar 19 12:13:48.650082 master-0 kubenswrapper[29612]: I0319 12:13:48.649438 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"06bc4932-418b-41e4-9a3c-00c8d184de32","Type":"ContainerStarted","Data":"0196ec979abd8fbb354c6ccef8666c91ed1053c80f63f40cfe27e29965c4ffce"} Mar 19 12:13:48.653874 master-0 kubenswrapper[29612]: I0319 12:13:48.653788 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1d59b018-775c-44e4-a5f0-b11fac9fae2c","Type":"ContainerStarted","Data":"748742753fbf2513f8c336674237be95130a3fa4eec3acb813dc694fc216a6d7"} Mar 19 12:13:48.653874 master-0 kubenswrapper[29612]: I0319 12:13:48.653851 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1d59b018-775c-44e4-a5f0-b11fac9fae2c","Type":"ContainerStarted","Data":"842e7be6f39b6cd156344a930727b03367873100a31011ff01f2b1573f878e28"} Mar 19 12:13:48.653874 master-0 kubenswrapper[29612]: I0319 12:13:48.653864 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1d59b018-775c-44e4-a5f0-b11fac9fae2c","Type":"ContainerStarted","Data":"1cbf138272fbbed2a98cc41bfc7acecf297dc4c992d2bb9eef20b9c9518e43b6"} Mar 19 12:13:48.655488 master-0 kubenswrapper[29612]: I0319 12:13:48.655458 29612 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" event={"ID":"d18704d6-f988-4694-9cfd-a0945d8d246b","Type":"ContainerStarted","Data":"774dfe1b6ce60f6b622f3286f854d06006bbee729bbf4bbea4fe62f9f512895a"} Mar 19 12:13:48.655569 master-0 kubenswrapper[29612]: I0319 12:13:48.655489 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" event={"ID":"d18704d6-f988-4694-9cfd-a0945d8d246b","Type":"ContainerStarted","Data":"8dc49cc349e0a7672494299924e1454fc12b51fb2c0d08c4cb40c2bf0781f050"} Mar 19 12:13:48.655829 master-0 kubenswrapper[29612]: I0319 12:13:48.655790 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:48.712350 master-0 kubenswrapper[29612]: I0319 12:13:48.712252 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" podStartSLOduration=39.712226097 podStartE2EDuration="39.712226097s" podCreationTimestamp="2026-03-19 12:13:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:13:48.702345315 +0000 UTC m=+236.016514346" watchObservedRunningTime="2026-03-19 12:13:48.712226097 +0000 UTC m=+236.026395118" Mar 19 12:13:48.712719 master-0 kubenswrapper[29612]: I0319 12:13:48.712683 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.623875631 podStartE2EDuration="17.71267525s" podCreationTimestamp="2026-03-19 12:13:31 +0000 UTC" firstStartedPulling="2026-03-19 12:13:34.495608362 +0000 UTC m=+221.809777383" lastFinishedPulling="2026-03-19 12:13:47.584407981 +0000 UTC m=+234.898577002" observedRunningTime="2026-03-19 12:13:48.681520471 +0000 UTC m=+235.995689492" watchObservedRunningTime="2026-03-19 
12:13:48.71267525 +0000 UTC m=+236.026844271" Mar 19 12:13:48.842815 master-0 kubenswrapper[29612]: I0319 12:13:48.842564 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=6.322698251 podStartE2EDuration="17.842484221s" podCreationTimestamp="2026-03-19 12:13:31 +0000 UTC" firstStartedPulling="2026-03-19 12:13:33.484339889 +0000 UTC m=+220.798508910" lastFinishedPulling="2026-03-19 12:13:45.004125859 +0000 UTC m=+232.318294880" observedRunningTime="2026-03-19 12:13:48.8315596 +0000 UTC m=+236.145728631" watchObservedRunningTime="2026-03-19 12:13:48.842484221 +0000 UTC m=+236.156653262" Mar 19 12:13:48.916081 master-0 kubenswrapper[29612]: I0319 12:13:48.916014 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4560e0cf-d401-4508-a8d8-3869ed30ed18" path="/var/lib/kubelet/pods/4560e0cf-d401-4508-a8d8-3869ed30ed18/volumes" Mar 19 12:13:49.457693 master-0 kubenswrapper[29612]: I0319 12:13:49.457609 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-54d6758664-htjsj" Mar 19 12:13:50.814467 master-0 kubenswrapper[29612]: I0319 12:13:50.814402 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:50.815477 master-0 kubenswrapper[29612]: I0319 12:13:50.815441 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:13:50.818324 master-0 kubenswrapper[29612]: I0319 12:13:50.818270 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body= Mar 19 12:13:50.818405 master-0 kubenswrapper[29612]: I0319 12:13:50.818371 29612 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" Mar 19 12:13:51.394424 master-0 kubenswrapper[29612]: I0319 12:13:51.394350 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 12:13:51.395325 master-0 kubenswrapper[29612]: I0319 12:13:51.395294 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.401167 master-0 kubenswrapper[29612]: I0319 12:13:51.399733 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 12:13:51.401167 master-0 kubenswrapper[29612]: I0319 12:13:51.399971 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-hzxzv" Mar 19 12:13:51.469224 master-0 kubenswrapper[29612]: I0319 12:13:51.469076 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 12:13:51.571473 master-0 kubenswrapper[29612]: I0319 12:13:51.571414 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-var-lock\") pod \"installer-5-master-0\" (UID: \"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.571473 master-0 kubenswrapper[29612]: I0319 12:13:51.571471 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22b3b134-7beb-4f24-ac6a-d40271fac38a-kube-api-access\") pod \"installer-5-master-0\" (UID: 
\"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.571962 master-0 kubenswrapper[29612]: I0319 12:13:51.571871 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.673503 master-0 kubenswrapper[29612]: I0319 12:13:51.673421 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.673770 master-0 kubenswrapper[29612]: I0319 12:13:51.673518 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.673770 master-0 kubenswrapper[29612]: I0319 12:13:51.673563 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-var-lock\") pod \"installer-5-master-0\" (UID: \"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.673770 master-0 kubenswrapper[29612]: I0319 12:13:51.673616 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22b3b134-7beb-4f24-ac6a-d40271fac38a-kube-api-access\") pod \"installer-5-master-0\" (UID: 
\"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.673770 master-0 kubenswrapper[29612]: I0319 12:13:51.673658 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-var-lock\") pod \"installer-5-master-0\" (UID: \"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.690694 master-0 kubenswrapper[29612]: I0319 12:13:51.690600 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22b3b134-7beb-4f24-ac6a-d40271fac38a-kube-api-access\") pod \"installer-5-master-0\" (UID: \"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.720841 master-0 kubenswrapper[29612]: I0319 12:13:51.720714 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:13:51.891218 master-0 kubenswrapper[29612]: I0319 12:13:51.891099 29612 patch_prober.go:28] interesting pod/console-85cf74699c-dx6hc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" start-of-body= Mar 19 12:13:51.891845 master-0 kubenswrapper[29612]: I0319 12:13:51.891227 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-85cf74699c-dx6hc" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Mar 19 12:13:52.142960 master-0 kubenswrapper[29612]: I0319 12:13:52.142889 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 12:13:52.151139 master-0 kubenswrapper[29612]: W0319 12:13:52.151067 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod22b3b134_7beb_4f24_ac6a_d40271fac38a.slice/crio-5c7cdb6b0bf69b9ee856ccd7dcf649d9acd902d4d5d04a0835632bef1fe3e67a WatchSource:0}: Error finding container 5c7cdb6b0bf69b9ee856ccd7dcf649d9acd902d4d5d04a0835632bef1fe3e67a: Status 404 returned error can't find the container with id 5c7cdb6b0bf69b9ee856ccd7dcf649d9acd902d4d5d04a0835632bef1fe3e67a Mar 19 12:13:52.395757 master-0 kubenswrapper[29612]: I0319 12:13:52.395699 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:13:52.551951 master-0 kubenswrapper[29612]: I0319 12:13:52.551885 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw" Mar 19 12:13:52.556382 master-0 kubenswrapper[29612]: I0319 12:13:52.556333 29612 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6ddfc7c8d9-hftjw" Mar 19 12:13:52.696851 master-0 kubenswrapper[29612]: I0319 12:13:52.696662 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"22b3b134-7beb-4f24-ac6a-d40271fac38a","Type":"ContainerStarted","Data":"cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19"} Mar 19 12:13:52.696851 master-0 kubenswrapper[29612]: I0319 12:13:52.696711 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"22b3b134-7beb-4f24-ac6a-d40271fac38a","Type":"ContainerStarted","Data":"5c7cdb6b0bf69b9ee856ccd7dcf649d9acd902d4d5d04a0835632bef1fe3e67a"} Mar 19 12:13:52.726911 master-0 kubenswrapper[29612]: I0319 12:13:52.719424 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=1.719406054 podStartE2EDuration="1.719406054s" podCreationTimestamp="2026-03-19 12:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:13:52.716978014 +0000 UTC m=+240.031147035" watchObservedRunningTime="2026-03-19 12:13:52.719406054 +0000 UTC m=+240.033575075" Mar 19 12:14:00.814787 master-0 kubenswrapper[29612]: I0319 12:14:00.814696 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body= Mar 19 12:14:00.814787 master-0 kubenswrapper[29612]: I0319 12:14:00.814787 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get 
\"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" Mar 19 12:14:01.891384 master-0 kubenswrapper[29612]: I0319 12:14:01.891285 29612 patch_prober.go:28] interesting pod/console-85cf74699c-dx6hc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" start-of-body= Mar 19 12:14:01.891981 master-0 kubenswrapper[29612]: I0319 12:14:01.891397 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-85cf74699c-dx6hc" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Mar 19 12:14:05.429824 master-0 kubenswrapper[29612]: I0319 12:14:05.429720 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-9966f847b-gzmrl" podUID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" containerName="console" containerID="cri-o://106e6e717e0a07a215d277ed863ea494c396b3dcc8b25dc533922967305fec0a" gracePeriod=15 Mar 19 12:14:05.793204 master-0 kubenswrapper[29612]: I0319 12:14:05.793020 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9966f847b-gzmrl_493d3830-b0a6-4a3a-8048-0a6aed134ebf/console/1.log" Mar 19 12:14:05.793468 master-0 kubenswrapper[29612]: I0319 12:14:05.793448 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9966f847b-gzmrl_493d3830-b0a6-4a3a-8048-0a6aed134ebf/console/0.log" Mar 19 12:14:05.793518 master-0 kubenswrapper[29612]: I0319 12:14:05.793480 29612 generic.go:334] "Generic (PLEG): container finished" podID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" containerID="106e6e717e0a07a215d277ed863ea494c396b3dcc8b25dc533922967305fec0a" exitCode=2 Mar 19 12:14:05.793563 master-0 kubenswrapper[29612]: I0319 
12:14:05.793514 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9966f847b-gzmrl" event={"ID":"493d3830-b0a6-4a3a-8048-0a6aed134ebf","Type":"ContainerDied","Data":"106e6e717e0a07a215d277ed863ea494c396b3dcc8b25dc533922967305fec0a"} Mar 19 12:14:05.793563 master-0 kubenswrapper[29612]: I0319 12:14:05.793551 29612 scope.go:117] "RemoveContainer" containerID="cc1debbf2eb1964f475500d179ac1133f88154be5dc3f3a5984ba2d33ce83143" Mar 19 12:14:05.867033 master-0 kubenswrapper[29612]: I0319 12:14:05.867001 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9966f847b-gzmrl_493d3830-b0a6-4a3a-8048-0a6aed134ebf/console/1.log" Mar 19 12:14:05.867323 master-0 kubenswrapper[29612]: I0319 12:14:05.867310 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9966f847b-gzmrl" Mar 19 12:14:06.002037 master-0 kubenswrapper[29612]: I0319 12:14:06.001931 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-oauth-config\") pod \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " Mar 19 12:14:06.002037 master-0 kubenswrapper[29612]: I0319 12:14:06.002038 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-trusted-ca-bundle\") pod \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " Mar 19 12:14:06.002374 master-0 kubenswrapper[29612]: I0319 12:14:06.002108 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-oauth-serving-cert\") pod \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\" (UID: 
\"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " Mar 19 12:14:06.002374 master-0 kubenswrapper[29612]: I0319 12:14:06.002168 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-config\") pod \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " Mar 19 12:14:06.002374 master-0 kubenswrapper[29612]: I0319 12:14:06.002277 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-service-ca\") pod \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " Mar 19 12:14:06.002374 master-0 kubenswrapper[29612]: I0319 12:14:06.002313 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssft2\" (UniqueName: \"kubernetes.io/projected/493d3830-b0a6-4a3a-8048-0a6aed134ebf-kube-api-access-ssft2\") pod \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " Mar 19 12:14:06.002374 master-0 kubenswrapper[29612]: I0319 12:14:06.002354 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") pod \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\" (UID: \"493d3830-b0a6-4a3a-8048-0a6aed134ebf\") " Mar 19 12:14:06.003174 master-0 kubenswrapper[29612]: I0319 12:14:06.003106 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-config" (OuterVolumeSpecName: "console-config") pod "493d3830-b0a6-4a3a-8048-0a6aed134ebf" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:06.003253 master-0 kubenswrapper[29612]: I0319 12:14:06.003137 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "493d3830-b0a6-4a3a-8048-0a6aed134ebf" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:06.003430 master-0 kubenswrapper[29612]: I0319 12:14:06.003343 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-service-ca" (OuterVolumeSpecName: "service-ca") pod "493d3830-b0a6-4a3a-8048-0a6aed134ebf" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:06.004404 master-0 kubenswrapper[29612]: I0319 12:14:06.004351 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "493d3830-b0a6-4a3a-8048-0a6aed134ebf" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:14:06.005237 master-0 kubenswrapper[29612]: I0319 12:14:06.005125 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "493d3830-b0a6-4a3a-8048-0a6aed134ebf" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:14:06.006513 master-0 kubenswrapper[29612]: I0319 12:14:06.006455 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "493d3830-b0a6-4a3a-8048-0a6aed134ebf" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:14:06.006729 master-0 kubenswrapper[29612]: I0319 12:14:06.006659 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/493d3830-b0a6-4a3a-8048-0a6aed134ebf-kube-api-access-ssft2" (OuterVolumeSpecName: "kube-api-access-ssft2") pod "493d3830-b0a6-4a3a-8048-0a6aed134ebf" (UID: "493d3830-b0a6-4a3a-8048-0a6aed134ebf"). InnerVolumeSpecName "kube-api-access-ssft2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:14:06.105093 master-0 kubenswrapper[29612]: I0319 12:14:06.104964 29612 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:06.105093 master-0 kubenswrapper[29612]: I0319 12:14:06.105005 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssft2\" (UniqueName: \"kubernetes.io/projected/493d3830-b0a6-4a3a-8048-0a6aed134ebf-kube-api-access-ssft2\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:06.105093 master-0 kubenswrapper[29612]: I0319 12:14:06.105017 29612 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:06.105093 master-0 kubenswrapper[29612]: I0319 12:14:06.105026 29612 reconciler_common.go:293] "Volume detached 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:06.105093 master-0 kubenswrapper[29612]: I0319 12:14:06.105036 29612 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:06.105093 master-0 kubenswrapper[29612]: I0319 12:14:06.105044 29612 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:06.105093 master-0 kubenswrapper[29612]: I0319 12:14:06.105052 29612 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/493d3830-b0a6-4a3a-8048-0a6aed134ebf-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:06.801036 master-0 kubenswrapper[29612]: I0319 12:14:06.800979 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9966f847b-gzmrl_493d3830-b0a6-4a3a-8048-0a6aed134ebf/console/1.log" Mar 19 12:14:06.801621 master-0 kubenswrapper[29612]: I0319 12:14:06.801052 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9966f847b-gzmrl" event={"ID":"493d3830-b0a6-4a3a-8048-0a6aed134ebf","Type":"ContainerDied","Data":"eca6da7b88ba11ba6db8a5731c1c758b7cb8dab5164aa34c66e3439a2b03b88c"} Mar 19 12:14:06.801621 master-0 kubenswrapper[29612]: I0319 12:14:06.801087 29612 scope.go:117] "RemoveContainer" containerID="106e6e717e0a07a215d277ed863ea494c396b3dcc8b25dc533922967305fec0a" Mar 19 12:14:06.801621 master-0 kubenswrapper[29612]: I0319 12:14:06.801142 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9966f847b-gzmrl" Mar 19 12:14:07.040152 master-0 kubenswrapper[29612]: I0319 12:14:07.040045 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9966f847b-gzmrl"] Mar 19 12:14:07.089051 master-0 kubenswrapper[29612]: I0319 12:14:07.088832 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 12:14:07.089392 master-0 kubenswrapper[29612]: I0319 12:14:07.089121 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-5-master-0" podUID="22b3b134-7beb-4f24-ac6a-d40271fac38a" containerName="installer" containerID="cri-o://cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19" gracePeriod=30 Mar 19 12:14:07.094468 master-0 kubenswrapper[29612]: I0319 12:14:07.094397 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9966f847b-gzmrl"] Mar 19 12:14:08.916020 master-0 kubenswrapper[29612]: I0319 12:14:08.915978 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" path="/var/lib/kubelet/pods/493d3830-b0a6-4a3a-8048-0a6aed134ebf/volumes" Mar 19 12:14:10.384674 master-0 kubenswrapper[29612]: I0319 12:14:10.384592 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Mar 19 12:14:10.385246 master-0 kubenswrapper[29612]: E0319 12:14:10.384947 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" containerName="console" Mar 19 12:14:10.385246 master-0 kubenswrapper[29612]: I0319 12:14:10.384963 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" containerName="console" Mar 19 12:14:10.385246 master-0 kubenswrapper[29612]: I0319 12:14:10.385169 29612 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" containerName="console" Mar 19 12:14:10.385799 master-0 kubenswrapper[29612]: I0319 12:14:10.385764 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.459622 master-0 kubenswrapper[29612]: I0319 12:14:10.459546 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Mar 19 12:14:10.499620 master-0 kubenswrapper[29612]: I0319 12:14:10.499557 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.499897 master-0 kubenswrapper[29612]: I0319 12:14:10.499678 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kube-api-access\") pod \"installer-6-master-0\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.499897 master-0 kubenswrapper[29612]: I0319 12:14:10.499749 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-var-lock\") pod \"installer-6-master-0\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.609415 master-0 kubenswrapper[29612]: I0319 12:14:10.609366 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kube-api-access\") pod 
\"installer-6-master-0\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.609750 master-0 kubenswrapper[29612]: I0319 12:14:10.609733 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-var-lock\") pod \"installer-6-master-0\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.609871 master-0 kubenswrapper[29612]: I0319 12:14:10.609818 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-var-lock\") pod \"installer-6-master-0\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.609944 master-0 kubenswrapper[29612]: I0319 12:14:10.609928 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.610031 master-0 kubenswrapper[29612]: I0319 12:14:10.609977 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.625593 master-0 kubenswrapper[29612]: I0319 12:14:10.625539 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kube-api-access\") pod \"installer-6-master-0\" (UID: 
\"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") " pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.705972 master-0 kubenswrapper[29612]: I0319 12:14:10.705925 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Mar 19 12:14:10.814494 master-0 kubenswrapper[29612]: I0319 12:14:10.814414 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body= Mar 19 12:14:10.814719 master-0 kubenswrapper[29612]: I0319 12:14:10.814495 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" Mar 19 12:14:11.124937 master-0 kubenswrapper[29612]: W0319 12:14:11.124881 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfe27a308_dd0c_4e7a_9d49_a4dc112a80bf.slice/crio-6352962b2067d5a27d217a00db2161ee9112db151d6c37436b61f823eafcec75 WatchSource:0}: Error finding container 6352962b2067d5a27d217a00db2161ee9112db151d6c37436b61f823eafcec75: Status 404 returned error can't find the container with id 6352962b2067d5a27d217a00db2161ee9112db151d6c37436b61f823eafcec75 Mar 19 12:14:11.126567 master-0 kubenswrapper[29612]: I0319 12:14:11.126411 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Mar 19 12:14:11.851453 master-0 kubenswrapper[29612]: I0319 12:14:11.851106 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" 
event={"ID":"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf","Type":"ContainerStarted","Data":"12da8788d56cade55dc76c00e6a0e20cad284f435458069e857cf2bf02b0af99"} Mar 19 12:14:11.851453 master-0 kubenswrapper[29612]: I0319 12:14:11.851186 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf","Type":"ContainerStarted","Data":"6352962b2067d5a27d217a00db2161ee9112db151d6c37436b61f823eafcec75"} Mar 19 12:14:11.874973 master-0 kubenswrapper[29612]: I0319 12:14:11.874698 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-6-master-0" podStartSLOduration=1.874669605 podStartE2EDuration="1.874669605s" podCreationTimestamp="2026-03-19 12:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:14:11.87026279 +0000 UTC m=+259.184431831" watchObservedRunningTime="2026-03-19 12:14:11.874669605 +0000 UTC m=+259.188838666" Mar 19 12:14:11.891240 master-0 kubenswrapper[29612]: I0319 12:14:11.891162 29612 patch_prober.go:28] interesting pod/console-85cf74699c-dx6hc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" start-of-body= Mar 19 12:14:11.891539 master-0 kubenswrapper[29612]: I0319 12:14:11.891242 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-85cf74699c-dx6hc" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Mar 19 12:14:20.815078 master-0 kubenswrapper[29612]: I0319 12:14:20.815001 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body= Mar 19 12:14:20.815912 master-0 kubenswrapper[29612]: I0319 12:14:20.815098 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" Mar 19 12:14:21.891564 master-0 kubenswrapper[29612]: I0319 12:14:21.891505 29612 patch_prober.go:28] interesting pod/console-85cf74699c-dx6hc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" start-of-body= Mar 19 12:14:21.892105 master-0 kubenswrapper[29612]: I0319 12:14:21.891571 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-85cf74699c-dx6hc" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Mar 19 12:14:23.902923 master-0 kubenswrapper[29612]: I0319 12:14:23.902851 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_22b3b134-7beb-4f24-ac6a-d40271fac38a/installer/0.log" Mar 19 12:14:23.902923 master-0 kubenswrapper[29612]: I0319 12:14:23.902931 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:14:23.962707 master-0 kubenswrapper[29612]: I0319 12:14:23.958305 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_22b3b134-7beb-4f24-ac6a-d40271fac38a/installer/0.log" Mar 19 12:14:23.962707 master-0 kubenswrapper[29612]: I0319 12:14:23.958363 29612 generic.go:334] "Generic (PLEG): container finished" podID="22b3b134-7beb-4f24-ac6a-d40271fac38a" containerID="cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19" exitCode=1 Mar 19 12:14:23.962707 master-0 kubenswrapper[29612]: I0319 12:14:23.958394 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"22b3b134-7beb-4f24-ac6a-d40271fac38a","Type":"ContainerDied","Data":"cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19"} Mar 19 12:14:23.962707 master-0 kubenswrapper[29612]: I0319 12:14:23.958421 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"22b3b134-7beb-4f24-ac6a-d40271fac38a","Type":"ContainerDied","Data":"5c7cdb6b0bf69b9ee856ccd7dcf649d9acd902d4d5d04a0835632bef1fe3e67a"} Mar 19 12:14:23.962707 master-0 kubenswrapper[29612]: I0319 12:14:23.958438 29612 scope.go:117] "RemoveContainer" containerID="cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19" Mar 19 12:14:23.962707 master-0 kubenswrapper[29612]: I0319 12:14:23.958447 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 19 12:14:23.975560 master-0 kubenswrapper[29612]: I0319 12:14:23.975529 29612 scope.go:117] "RemoveContainer" containerID="cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19" Mar 19 12:14:23.976220 master-0 kubenswrapper[29612]: E0319 12:14:23.976177 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19\": container with ID starting with cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19 not found: ID does not exist" containerID="cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19" Mar 19 12:14:23.976324 master-0 kubenswrapper[29612]: I0319 12:14:23.976222 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19"} err="failed to get container status \"cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19\": rpc error: code = NotFound desc = could not find container \"cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19\": container with ID starting with cf7ffe21bf23d6a2116e9b5d003aa89df2193cc75cac17dbf47b98d6f2846a19 not found: ID does not exist" Mar 19 12:14:24.075304 master-0 kubenswrapper[29612]: I0319 12:14:24.075060 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-kubelet-dir\") pod \"22b3b134-7beb-4f24-ac6a-d40271fac38a\" (UID: \"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " Mar 19 12:14:24.075545 master-0 kubenswrapper[29612]: I0319 12:14:24.075234 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"22b3b134-7beb-4f24-ac6a-d40271fac38a" (UID: "22b3b134-7beb-4f24-ac6a-d40271fac38a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:14:24.075545 master-0 kubenswrapper[29612]: I0319 12:14:24.075413 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22b3b134-7beb-4f24-ac6a-d40271fac38a-kube-api-access\") pod \"22b3b134-7beb-4f24-ac6a-d40271fac38a\" (UID: \"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " Mar 19 12:14:24.075545 master-0 kubenswrapper[29612]: I0319 12:14:24.075440 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-var-lock\") pod \"22b3b134-7beb-4f24-ac6a-d40271fac38a\" (UID: \"22b3b134-7beb-4f24-ac6a-d40271fac38a\") " Mar 19 12:14:24.075705 master-0 kubenswrapper[29612]: I0319 12:14:24.075617 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-var-lock" (OuterVolumeSpecName: "var-lock") pod "22b3b134-7beb-4f24-ac6a-d40271fac38a" (UID: "22b3b134-7beb-4f24-ac6a-d40271fac38a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:14:24.076542 master-0 kubenswrapper[29612]: I0319 12:14:24.076479 29612 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:24.076542 master-0 kubenswrapper[29612]: I0319 12:14:24.076500 29612 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/22b3b134-7beb-4f24-ac6a-d40271fac38a-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:24.079038 master-0 kubenswrapper[29612]: I0319 12:14:24.078989 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b3b134-7beb-4f24-ac6a-d40271fac38a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "22b3b134-7beb-4f24-ac6a-d40271fac38a" (UID: "22b3b134-7beb-4f24-ac6a-d40271fac38a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:14:24.178033 master-0 kubenswrapper[29612]: I0319 12:14:24.177967 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22b3b134-7beb-4f24-ac6a-d40271fac38a-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:14:24.307663 master-0 kubenswrapper[29612]: I0319 12:14:24.307580 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 12:14:24.313861 master-0 kubenswrapper[29612]: I0319 12:14:24.313801 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 19 12:14:24.914354 master-0 kubenswrapper[29612]: I0319 12:14:24.914297 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b3b134-7beb-4f24-ac6a-d40271fac38a" path="/var/lib/kubelet/pods/22b3b134-7beb-4f24-ac6a-d40271fac38a/volumes" Mar 19 12:14:30.813881 master-0 kubenswrapper[29612]: I0319 12:14:30.813818 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body= Mar 19 12:14:30.814648 master-0 kubenswrapper[29612]: I0319 12:14:30.813882 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" Mar 19 12:14:31.891389 master-0 kubenswrapper[29612]: I0319 12:14:31.891319 29612 patch_prober.go:28] interesting pod/console-85cf74699c-dx6hc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: 
connect: connection refused" start-of-body= Mar 19 12:14:31.891389 master-0 kubenswrapper[29612]: I0319 12:14:31.891378 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-85cf74699c-dx6hc" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Mar 19 12:14:32.396206 master-0 kubenswrapper[29612]: I0319 12:14:32.396124 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:14:32.426149 master-0 kubenswrapper[29612]: I0319 12:14:32.425721 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:14:33.070443 master-0 kubenswrapper[29612]: I0319 12:14:33.070365 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 12:14:39.043602 master-0 kubenswrapper[29612]: I0319 12:14:39.043542 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85cf74699c-dx6hc"] Mar 19 12:14:39.101816 master-0 kubenswrapper[29612]: I0319 12:14:39.101760 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-66759f68dd-6zgw8"] Mar 19 12:14:39.102114 master-0 kubenswrapper[29612]: E0319 12:14:39.102086 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" containerName="console" Mar 19 12:14:39.102114 master-0 kubenswrapper[29612]: I0319 12:14:39.102106 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" containerName="console" Mar 19 12:14:39.102225 master-0 kubenswrapper[29612]: E0319 12:14:39.102126 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b3b134-7beb-4f24-ac6a-d40271fac38a" containerName="installer" Mar 
19 12:14:39.102225 master-0 kubenswrapper[29612]: I0319 12:14:39.102133 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b3b134-7beb-4f24-ac6a-d40271fac38a" containerName="installer" Mar 19 12:14:39.102330 master-0 kubenswrapper[29612]: I0319 12:14:39.102253 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="493d3830-b0a6-4a3a-8048-0a6aed134ebf" containerName="console" Mar 19 12:14:39.102330 master-0 kubenswrapper[29612]: I0319 12:14:39.102301 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b3b134-7beb-4f24-ac6a-d40271fac38a" containerName="installer" Mar 19 12:14:39.102796 master-0 kubenswrapper[29612]: I0319 12:14:39.102769 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.121589 master-0 kubenswrapper[29612]: I0319 12:14:39.121527 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66759f68dd-6zgw8"] Mar 19 12:14:39.211865 master-0 kubenswrapper[29612]: I0319 12:14:39.211748 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-oauth-serving-cert\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.211865 master-0 kubenswrapper[29612]: I0319 12:14:39.211828 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-serving-cert\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.211865 master-0 kubenswrapper[29612]: I0319 12:14:39.211876 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-service-ca\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.212540 master-0 kubenswrapper[29612]: I0319 12:14:39.211933 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-oauth-config\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.212540 master-0 kubenswrapper[29612]: I0319 12:14:39.212284 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg7mt\" (UniqueName: \"kubernetes.io/projected/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-kube-api-access-vg7mt\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.212540 master-0 kubenswrapper[29612]: I0319 12:14:39.212338 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-config\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.212540 master-0 kubenswrapper[29612]: I0319 12:14:39.212368 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-trusted-ca-bundle\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " 
pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.313682 master-0 kubenswrapper[29612]: I0319 12:14:39.313480 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-service-ca\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.313682 master-0 kubenswrapper[29612]: I0319 12:14:39.313554 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-oauth-config\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.313682 master-0 kubenswrapper[29612]: I0319 12:14:39.313588 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg7mt\" (UniqueName: \"kubernetes.io/projected/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-kube-api-access-vg7mt\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.313682 master-0 kubenswrapper[29612]: I0319 12:14:39.313617 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-config\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.313682 master-0 kubenswrapper[29612]: I0319 12:14:39.313665 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-trusted-ca-bundle\") pod \"console-66759f68dd-6zgw8\" (UID: 
\"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.314206 master-0 kubenswrapper[29612]: I0319 12:14:39.313710 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-oauth-serving-cert\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.314257 master-0 kubenswrapper[29612]: I0319 12:14:39.314204 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-serving-cert\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.314843 master-0 kubenswrapper[29612]: I0319 12:14:39.314751 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-service-ca\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.315156 master-0 kubenswrapper[29612]: I0319 12:14:39.314852 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-config\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.315316 master-0 kubenswrapper[29612]: I0319 12:14:39.315230 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-oauth-serving-cert\") pod 
\"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.315711 master-0 kubenswrapper[29612]: I0319 12:14:39.315657 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-trusted-ca-bundle\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.322708 master-0 kubenswrapper[29612]: I0319 12:14:39.317961 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-serving-cert\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.323341 master-0 kubenswrapper[29612]: I0319 12:14:39.323323 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-oauth-config\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.331199 master-0 kubenswrapper[29612]: I0319 12:14:39.331143 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg7mt\" (UniqueName: \"kubernetes.io/projected/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-kube-api-access-vg7mt\") pod \"console-66759f68dd-6zgw8\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.422267 master-0 kubenswrapper[29612]: I0319 12:14:39.421726 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:39.851615 master-0 kubenswrapper[29612]: I0319 12:14:39.851492 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66759f68dd-6zgw8"] Mar 19 12:14:39.851864 master-0 kubenswrapper[29612]: W0319 12:14:39.851779 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50bfdf1d_36b9_47a9_a65f_1a5bc1a89dcf.slice/crio-b6ba7390d02fd46b2c6cf5b9563e7080f84f40575dd156c3fc6fc240152e723f WatchSource:0}: Error finding container b6ba7390d02fd46b2c6cf5b9563e7080f84f40575dd156c3fc6fc240152e723f: Status 404 returned error can't find the container with id b6ba7390d02fd46b2c6cf5b9563e7080f84f40575dd156c3fc6fc240152e723f Mar 19 12:14:40.088563 master-0 kubenswrapper[29612]: I0319 12:14:40.088411 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66759f68dd-6zgw8" event={"ID":"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf","Type":"ContainerStarted","Data":"bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78"} Mar 19 12:14:40.088563 master-0 kubenswrapper[29612]: I0319 12:14:40.088465 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66759f68dd-6zgw8" event={"ID":"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf","Type":"ContainerStarted","Data":"b6ba7390d02fd46b2c6cf5b9563e7080f84f40575dd156c3fc6fc240152e723f"} Mar 19 12:14:40.109524 master-0 kubenswrapper[29612]: I0319 12:14:40.109345 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66759f68dd-6zgw8" podStartSLOduration=1.109325265 podStartE2EDuration="1.109325265s" podCreationTimestamp="2026-03-19 12:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:14:40.105540589 +0000 UTC m=+287.419709610" 
watchObservedRunningTime="2026-03-19 12:14:40.109325265 +0000 UTC m=+287.423494286" Mar 19 12:14:40.814346 master-0 kubenswrapper[29612]: I0319 12:14:40.814250 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body= Mar 19 12:14:40.814346 master-0 kubenswrapper[29612]: I0319 12:14:40.814339 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" Mar 19 12:14:49.422022 master-0 kubenswrapper[29612]: I0319 12:14:49.421963 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:49.422022 master-0 kubenswrapper[29612]: I0319 12:14:49.422033 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:14:49.423591 master-0 kubenswrapper[29612]: I0319 12:14:49.423276 29612 patch_prober.go:28] interesting pod/console-66759f68dd-6zgw8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused" start-of-body= Mar 19 12:14:49.423591 master-0 kubenswrapper[29612]: I0319 12:14:49.423330 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66759f68dd-6zgw8" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" probeResult="failure" output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused" Mar 19 12:14:50.814668 master-0 kubenswrapper[29612]: I0319 12:14:50.814563 
29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body= Mar 19 12:14:50.814668 master-0 kubenswrapper[29612]: I0319 12:14:50.814656 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" Mar 19 12:14:52.907018 master-0 kubenswrapper[29612]: I0319 12:14:52.906953 29612 kubelet.go:1505] "Image garbage collection succeeded" Mar 19 12:14:59.423101 master-0 kubenswrapper[29612]: I0319 12:14:59.423037 29612 patch_prober.go:28] interesting pod/console-66759f68dd-6zgw8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused" start-of-body= Mar 19 12:14:59.423710 master-0 kubenswrapper[29612]: I0319 12:14:59.423097 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66759f68dd-6zgw8" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" probeResult="failure" output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused" Mar 19 12:15:00.814937 master-0 kubenswrapper[29612]: I0319 12:15:00.814853 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body= Mar 19 12:15:00.814937 master-0 kubenswrapper[29612]: I0319 12:15:00.814921 29612 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" Mar 19 12:15:04.085753 master-0 kubenswrapper[29612]: I0319 12:15:04.085586 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-85cf74699c-dx6hc" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerName="console" containerID="cri-o://9842570da6fc9209f93eeff83a6ad10196416f4ac3db591886b8075cef88bfb8" gracePeriod=15 Mar 19 12:15:04.316615 master-0 kubenswrapper[29612]: I0319 12:15:04.316534 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85cf74699c-dx6hc_dbc85eb5-feb1-46f8-b04c-e419c6300b18/console/0.log" Mar 19 12:15:04.316615 master-0 kubenswrapper[29612]: I0319 12:15:04.316605 29612 generic.go:334] "Generic (PLEG): container finished" podID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerID="9842570da6fc9209f93eeff83a6ad10196416f4ac3db591886b8075cef88bfb8" exitCode=2 Mar 19 12:15:04.317053 master-0 kubenswrapper[29612]: I0319 12:15:04.316665 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85cf74699c-dx6hc" event={"ID":"dbc85eb5-feb1-46f8-b04c-e419c6300b18","Type":"ContainerDied","Data":"9842570da6fc9209f93eeff83a6ad10196416f4ac3db591886b8075cef88bfb8"} Mar 19 12:15:04.535786 master-0 kubenswrapper[29612]: I0319 12:15:04.535730 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85cf74699c-dx6hc_dbc85eb5-feb1-46f8-b04c-e419c6300b18/console/0.log" Mar 19 12:15:04.536022 master-0 kubenswrapper[29612]: I0319 12:15:04.535816 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85cf74699c-dx6hc" Mar 19 12:15:04.714687 master-0 kubenswrapper[29612]: I0319 12:15:04.714472 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-trusted-ca-bundle\") pod \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " Mar 19 12:15:04.714935 master-0 kubenswrapper[29612]: I0319 12:15:04.714723 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-service-ca\") pod \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " Mar 19 12:15:04.714935 master-0 kubenswrapper[29612]: I0319 12:15:04.714856 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-config\") pod \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " Mar 19 12:15:04.714935 master-0 kubenswrapper[29612]: I0319 12:15:04.714899 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hft2\" (UniqueName: \"kubernetes.io/projected/dbc85eb5-feb1-46f8-b04c-e419c6300b18-kube-api-access-6hft2\") pod \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " Mar 19 12:15:04.715034 master-0 kubenswrapper[29612]: I0319 12:15:04.714996 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-oauth-config\") pod \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " Mar 19 12:15:04.715161 master-0 kubenswrapper[29612]: I0319 
12:15:04.715094 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dbc85eb5-feb1-46f8-b04c-e419c6300b18" (UID: "dbc85eb5-feb1-46f8-b04c-e419c6300b18"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:15:04.715204 master-0 kubenswrapper[29612]: I0319 12:15:04.715127 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-serving-cert\") pod \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " Mar 19 12:15:04.715280 master-0 kubenswrapper[29612]: I0319 12:15:04.715254 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-oauth-serving-cert\") pod \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\" (UID: \"dbc85eb5-feb1-46f8-b04c-e419c6300b18\") " Mar 19 12:15:04.716245 master-0 kubenswrapper[29612]: I0319 12:15:04.716179 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-service-ca" (OuterVolumeSpecName: "service-ca") pod "dbc85eb5-feb1-46f8-b04c-e419c6300b18" (UID: "dbc85eb5-feb1-46f8-b04c-e419c6300b18"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:15:04.716326 master-0 kubenswrapper[29612]: I0319 12:15:04.716286 29612 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:04.716365 master-0 kubenswrapper[29612]: I0319 12:15:04.716311 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-config" (OuterVolumeSpecName: "console-config") pod "dbc85eb5-feb1-46f8-b04c-e419c6300b18" (UID: "dbc85eb5-feb1-46f8-b04c-e419c6300b18"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:15:04.718148 master-0 kubenswrapper[29612]: I0319 12:15:04.718076 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dbc85eb5-feb1-46f8-b04c-e419c6300b18" (UID: "dbc85eb5-feb1-46f8-b04c-e419c6300b18"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:15:04.718807 master-0 kubenswrapper[29612]: I0319 12:15:04.718744 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc85eb5-feb1-46f8-b04c-e419c6300b18-kube-api-access-6hft2" (OuterVolumeSpecName: "kube-api-access-6hft2") pod "dbc85eb5-feb1-46f8-b04c-e419c6300b18" (UID: "dbc85eb5-feb1-46f8-b04c-e419c6300b18"). InnerVolumeSpecName "kube-api-access-6hft2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:15:04.719360 master-0 kubenswrapper[29612]: I0319 12:15:04.719317 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dbc85eb5-feb1-46f8-b04c-e419c6300b18" (UID: "dbc85eb5-feb1-46f8-b04c-e419c6300b18"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:15:04.720367 master-0 kubenswrapper[29612]: I0319 12:15:04.720309 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dbc85eb5-feb1-46f8-b04c-e419c6300b18" (UID: "dbc85eb5-feb1-46f8-b04c-e419c6300b18"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:15:04.817747 master-0 kubenswrapper[29612]: I0319 12:15:04.817687 29612 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:04.817747 master-0 kubenswrapper[29612]: I0319 12:15:04.817764 29612 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:04.818075 master-0 kubenswrapper[29612]: I0319 12:15:04.817783 29612 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:04.818075 master-0 kubenswrapper[29612]: I0319 12:15:04.817796 29612 reconciler_common.go:293] "Volume detached for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:04.818075 master-0 kubenswrapper[29612]: I0319 12:15:04.817808 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hft2\" (UniqueName: \"kubernetes.io/projected/dbc85eb5-feb1-46f8-b04c-e419c6300b18-kube-api-access-6hft2\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:04.818075 master-0 kubenswrapper[29612]: I0319 12:15:04.817818 29612 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dbc85eb5-feb1-46f8-b04c-e419c6300b18-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:05.329781 master-0 kubenswrapper[29612]: I0319 12:15:05.329671 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85cf74699c-dx6hc_dbc85eb5-feb1-46f8-b04c-e419c6300b18/console/0.log"
Mar 19 12:15:05.330602 master-0 kubenswrapper[29612]: I0319 12:15:05.329803 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85cf74699c-dx6hc" event={"ID":"dbc85eb5-feb1-46f8-b04c-e419c6300b18","Type":"ContainerDied","Data":"1e9ac01b466865efb385598ff516c93cd1e57e480992d03db6af5290d63f6658"}
Mar 19 12:15:05.330602 master-0 kubenswrapper[29612]: I0319 12:15:05.329875 29612 scope.go:117] "RemoveContainer" containerID="9842570da6fc9209f93eeff83a6ad10196416f4ac3db591886b8075cef88bfb8"
Mar 19 12:15:05.330602 master-0 kubenswrapper[29612]: I0319 12:15:05.330194 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85cf74699c-dx6hc"
Mar 19 12:15:05.388430 master-0 kubenswrapper[29612]: I0319 12:15:05.388176 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85cf74699c-dx6hc"]
Mar 19 12:15:05.394339 master-0 kubenswrapper[29612]: I0319 12:15:05.394246 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85cf74699c-dx6hc"]
Mar 19 12:15:06.916791 master-0 kubenswrapper[29612]: I0319 12:15:06.916691 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" path="/var/lib/kubelet/pods/dbc85eb5-feb1-46f8-b04c-e419c6300b18/volumes"
Mar 19 12:15:09.155844 master-0 kubenswrapper[29612]: I0319 12:15:09.155772 29612 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:15:09.156362 master-0 kubenswrapper[29612]: E0319 12:15:09.156136 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerName="console"
Mar 19 12:15:09.156362 master-0 kubenswrapper[29612]: I0319 12:15:09.156152 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerName="console"
Mar 19 12:15:09.156436 master-0 kubenswrapper[29612]: I0319 12:15:09.156403 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc85eb5-feb1-46f8-b04c-e419c6300b18" containerName="console"
Mar 19 12:15:09.156962 master-0 kubenswrapper[29612]: I0319 12:15:09.156935 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.157146 master-0 kubenswrapper[29612]: I0319 12:15:09.157121 29612 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 12:15:09.157351 master-0 kubenswrapper[29612]: I0319 12:15:09.157321 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver" containerID="cri-o://b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834" gracePeriod=15
Mar 19 12:15:09.157351 master-0 kubenswrapper[29612]: I0319 12:15:09.157336 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3" gracePeriod=15
Mar 19 12:15:09.157458 master-0 kubenswrapper[29612]: I0319 12:15:09.157400 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e" gracePeriod=15
Mar 19 12:15:09.157458 master-0 kubenswrapper[29612]: I0319 12:15:09.157407 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8" gracePeriod=15
Mar 19 12:15:09.157618 master-0 kubenswrapper[29612]: I0319 12:15:09.157536 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded" gracePeriod=15
Mar 19 12:15:09.157700 master-0 kubenswrapper[29612]: I0319 12:15:09.157595 29612 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: E0319 12:15:09.157877 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: I0319 12:15:09.157895 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: E0319 12:15:09.157915 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: I0319 12:15:09.157927 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: E0319 12:15:09.157941 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: I0319 12:15:09.157948 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: E0319 12:15:09.157964 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: I0319 12:15:09.157974 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: E0319 12:15:09.157987 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: I0319 12:15:09.157994 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: E0319 12:15:09.158014 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="setup"
Mar 19 12:15:09.158166 master-0 kubenswrapper[29612]: I0319 12:15:09.158021 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="setup"
Mar 19 12:15:09.158673 master-0 kubenswrapper[29612]: I0319 12:15:09.158179 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-insecure-readyz"
Mar 19 12:15:09.158673 master-0 kubenswrapper[29612]: I0319 12:15:09.158196 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-check-endpoints"
Mar 19 12:15:09.158673 master-0 kubenswrapper[29612]: I0319 12:15:09.158219 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-syncer"
Mar 19 12:15:09.158673 master-0 kubenswrapper[29612]: I0319 12:15:09.158228 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver-cert-regeneration-controller"
Mar 19 12:15:09.158673 master-0 kubenswrapper[29612]: I0319 12:15:09.158240 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5ce05b3d592e63f1f92202d52b9635" containerName="kube-apiserver"
Mar 19 12:15:09.198518 master-0 kubenswrapper[29612]: I0319 12:15:09.196214 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.198518 master-0 kubenswrapper[29612]: I0319 12:15:09.196311 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:09.198518 master-0 kubenswrapper[29612]: I0319 12:15:09.196348 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:09.198518 master-0 kubenswrapper[29612]: I0319 12:15:09.196401 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.198518 master-0 kubenswrapper[29612]: I0319 12:15:09.196443 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.198518 master-0 kubenswrapper[29612]: I0319 12:15:09.196477 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.198518 master-0 kubenswrapper[29612]: I0319 12:15:09.196497 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:09.198518 master-0 kubenswrapper[29612]: I0319 12:15:09.196535 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.271100 master-0 kubenswrapper[29612]: E0319 12:15:09.271015 29612 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.297999 master-0 kubenswrapper[29612]: I0319 12:15:09.297934 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:09.298114 master-0 kubenswrapper[29612]: I0319 12:15:09.298011 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:09.298114 master-0 kubenswrapper[29612]: I0319 12:15:09.298089 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:09.298209 master-0 kubenswrapper[29612]: I0319 12:15:09.298124 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.298209 master-0 kubenswrapper[29612]: I0319 12:15:09.298153 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.298296 master-0 kubenswrapper[29612]: I0319 12:15:09.298236 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:09.298296 master-0 kubenswrapper[29612]: I0319 12:15:09.298264 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.298375 master-0 kubenswrapper[29612]: I0319 12:15:09.298309 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.298419 master-0 kubenswrapper[29612]: I0319 12:15:09.298373 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.298795 master-0 kubenswrapper[29612]: I0319 12:15:09.298495 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.298795 master-0 kubenswrapper[29612]: I0319 12:15:09.298598 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.298795 master-0 kubenswrapper[29612]: I0319 12:15:09.298669 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:09.298795 master-0 kubenswrapper[29612]: I0319 12:15:09.298703 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.298795 master-0 kubenswrapper[29612]: I0319 12:15:09.298770 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.299039 master-0 kubenswrapper[29612]: I0319 12:15:09.298812 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.299039 master-0 kubenswrapper[29612]: I0319 12:15:09.298841 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/274c4bebf95a655851b2cf276fe43ef7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"274c4bebf95a655851b2cf276fe43ef7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:09.365134 master-0 kubenswrapper[29612]: I0319 12:15:09.365023 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log"
Mar 19 12:15:09.366562 master-0 kubenswrapper[29612]: I0319 12:15:09.366494 29612 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3" exitCode=0
Mar 19 12:15:09.366562 master-0 kubenswrapper[29612]: I0319 12:15:09.366554 29612 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded" exitCode=0
Mar 19 12:15:09.366690 master-0 kubenswrapper[29612]: I0319 12:15:09.366566 29612 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8" exitCode=0
Mar 19 12:15:09.366690 master-0 kubenswrapper[29612]: I0319 12:15:09.366590 29612 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e" exitCode=2
Mar 19 12:15:09.369682 master-0 kubenswrapper[29612]: I0319 12:15:09.369000 29612 generic.go:334] "Generic (PLEG): container finished" podID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" containerID="12da8788d56cade55dc76c00e6a0e20cad284f435458069e857cf2bf02b0af99" exitCode=0
Mar 19 12:15:09.369682 master-0 kubenswrapper[29612]: I0319 12:15:09.369069 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf","Type":"ContainerDied","Data":"12da8788d56cade55dc76c00e6a0e20cad284f435458069e857cf2bf02b0af99"}
Mar 19 12:15:09.370393 master-0 kubenswrapper[29612]: I0319 12:15:09.370330 29612 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:09.370945 master-0 kubenswrapper[29612]: I0319 12:15:09.370880 29612 status_manager.go:851] "Failed to get status for pod" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:09.423323 master-0 kubenswrapper[29612]: I0319 12:15:09.423163 29612 patch_prober.go:28] interesting pod/console-66759f68dd-6zgw8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused" start-of-body=
Mar 19 12:15:09.423323 master-0 kubenswrapper[29612]: I0319 12:15:09.423236 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66759f68dd-6zgw8" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" probeResult="failure" output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused"
Mar 19 12:15:09.424349 master-0 kubenswrapper[29612]: E0319 12:15:09.424232 29612 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/events/console-66759f68dd-6zgw8.189e3d1b01273e32\": dial tcp 192.168.32.10:6443: connect: connection refused" event=<
Mar 19 12:15:09.424349 master-0 kubenswrapper[29612]: &Event{ObjectMeta:{console-66759f68dd-6zgw8.189e3d1b01273e32 openshift-console 18556 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-66759f68dd-6zgw8,UID:50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf,APIVersion:v1,ResourceVersion:18486,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe error: Get "https://10.128.0.111:8443/health": dial tcp 10.128.0.111:8443: connect: connection refused
Mar 19 12:15:09.424349 master-0 kubenswrapper[29612]: body:
Mar 19 12:15:09.424349 master-0 kubenswrapper[29612]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:14:49 +0000 UTC,LastTimestamp:2026-03-19 12:15:09.423218163 +0000 UTC m=+316.737387194,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}
Mar 19 12:15:09.424349 master-0 kubenswrapper[29612]: >
Mar 19 12:15:09.572489 master-0 kubenswrapper[29612]: I0319 12:15:09.572395 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:09.613620 master-0 kubenswrapper[29612]: W0319 12:15:09.613540 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebbfbf2b56df0323ba118d68bfdad8b9.slice/crio-176267edf9179feb463172bc642fb3688f4782d6cafb74446f2196a3f2a218c0 WatchSource:0}: Error finding container 176267edf9179feb463172bc642fb3688f4782d6cafb74446f2196a3f2a218c0: Status 404 returned error can't find the container with id 176267edf9179feb463172bc642fb3688f4782d6cafb74446f2196a3f2a218c0
Mar 19 12:15:10.275809 master-0 kubenswrapper[29612]: E0319 12:15:10.275667 29612 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:10.277469 master-0 kubenswrapper[29612]: E0319 12:15:10.277401 29612 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:10.277990 master-0 kubenswrapper[29612]: E0319 12:15:10.277924 29612 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:10.278372 master-0 kubenswrapper[29612]: E0319 12:15:10.278336 29612 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:10.278958 master-0 kubenswrapper[29612]: E0319 12:15:10.278909 29612 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:10.278958 master-0 kubenswrapper[29612]: I0319 12:15:10.278940 29612 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 19 12:15:10.279364 master-0 kubenswrapper[29612]: E0319 12:15:10.279315 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 19 12:15:10.379810 master-0 kubenswrapper[29612]: I0319 12:15:10.379719 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"ebbfbf2b56df0323ba118d68bfdad8b9","Type":"ContainerStarted","Data":"21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d"}
Mar 19 12:15:10.379810 master-0 kubenswrapper[29612]: I0319 12:15:10.379780 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"ebbfbf2b56df0323ba118d68bfdad8b9","Type":"ContainerStarted","Data":"176267edf9179feb463172bc642fb3688f4782d6cafb74446f2196a3f2a218c0"}
Mar 19 12:15:10.381368 master-0 kubenswrapper[29612]: E0319 12:15:10.380788 29612 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:10.381368 master-0 kubenswrapper[29612]: I0319 12:15:10.380818 29612 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:10.381783 master-0 kubenswrapper[29612]: I0319 12:15:10.381725 29612 status_manager.go:851] "Failed to get status for pod" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:10.481376 master-0 kubenswrapper[29612]: E0319 12:15:10.481198 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 19 12:15:10.703305 master-0 kubenswrapper[29612]: I0319 12:15:10.703229 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 12:15:10.704289 master-0 kubenswrapper[29612]: I0319 12:15:10.704228 29612 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:10.705135 master-0 kubenswrapper[29612]: I0319 12:15:10.705044 29612 status_manager.go:851] "Failed to get status for pod" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:10.814811 master-0 kubenswrapper[29612]: I0319 12:15:10.814717 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body=
Mar 19 12:15:10.814811 master-0 kubenswrapper[29612]: I0319 12:15:10.814810 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused"
Mar 19 12:15:10.828656 master-0 kubenswrapper[29612]: I0319 12:15:10.828525 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-var-lock\") pod \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") "
Mar 19 12:15:10.828997 master-0 kubenswrapper[29612]: I0319 12:15:10.828686 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kube-api-access\") pod \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") "
Mar 19 12:15:10.828997 master-0 kubenswrapper[29612]: I0319 12:15:10.828698 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-var-lock" (OuterVolumeSpecName: "var-lock") pod "fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" (UID: "fe27a308-dd0c-4e7a-9d49-a4dc112a80bf"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:15:10.828997 master-0 kubenswrapper[29612]: I0319 12:15:10.828924 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kubelet-dir\") pod \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\" (UID: \"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf\") "
Mar 19 12:15:10.829335 master-0 kubenswrapper[29612]: I0319 12:15:10.829154 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" (UID: "fe27a308-dd0c-4e7a-9d49-a4dc112a80bf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:15:10.829826 master-0 kubenswrapper[29612]: I0319 12:15:10.829734 29612 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:10.829826 master-0 kubenswrapper[29612]: I0319 12:15:10.829768 29612 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:10.831670 master-0 kubenswrapper[29612]: I0319 12:15:10.831567 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" (UID: "fe27a308-dd0c-4e7a-9d49-a4dc112a80bf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:15:10.883582 master-0 kubenswrapper[29612]: E0319 12:15:10.883490 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 19 12:15:10.931770 master-0 kubenswrapper[29612]: I0319 12:15:10.931628 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fe27a308-dd0c-4e7a-9d49-a4dc112a80bf-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:11.417172 master-0 kubenswrapper[29612]: I0319 12:15:11.416389 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"fe27a308-dd0c-4e7a-9d49-a4dc112a80bf","Type":"ContainerDied","Data":"6352962b2067d5a27d217a00db2161ee9112db151d6c37436b61f823eafcec75"}
Mar 19 12:15:11.417172 master-0 kubenswrapper[29612]: I0319 12:15:11.416436 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6352962b2067d5a27d217a00db2161ee9112db151d6c37436b61f823eafcec75"
Mar 19 12:15:11.417172 master-0 kubenswrapper[29612]: I0319 12:15:11.416489 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0"
Mar 19 12:15:11.447875 master-0 kubenswrapper[29612]: I0319 12:15:11.447807 29612 status_manager.go:851] "Failed to get status for pod" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:11.685240 master-0 kubenswrapper[29612]: E0319 12:15:11.685133 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 19 12:15:11.765091 master-0 kubenswrapper[29612]: I0319 12:15:11.765010 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log"
Mar 19 12:15:11.766686 master-0 kubenswrapper[29612]: I0319 12:15:11.766626 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:11.768468 master-0 kubenswrapper[29612]: I0319 12:15:11.768418 29612 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:11.769450 master-0 kubenswrapper[29612]: I0319 12:15:11.769409 29612 status_manager.go:851] "Failed to get status for pod" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:11.868444 master-0 kubenswrapper[29612]: I0319 12:15:11.868311 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") pod \"7d5ce05b3d592e63f1f92202d52b9635\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") "
Mar 19 12:15:11.869098 master-0 kubenswrapper[29612]: I0319 12:15:11.868542 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7d5ce05b3d592e63f1f92202d52b9635" (UID: "7d5ce05b3d592e63f1f92202d52b9635"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:15:11.869098 master-0 kubenswrapper[29612]: I0319 12:15:11.868625 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") pod \"7d5ce05b3d592e63f1f92202d52b9635\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " Mar 19 12:15:11.869098 master-0 kubenswrapper[29612]: I0319 12:15:11.868798 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") pod \"7d5ce05b3d592e63f1f92202d52b9635\" (UID: \"7d5ce05b3d592e63f1f92202d52b9635\") " Mar 19 12:15:11.869098 master-0 kubenswrapper[29612]: I0319 12:15:11.868793 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "7d5ce05b3d592e63f1f92202d52b9635" (UID: "7d5ce05b3d592e63f1f92202d52b9635"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:15:11.869098 master-0 kubenswrapper[29612]: I0319 12:15:11.868836 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "7d5ce05b3d592e63f1f92202d52b9635" (UID: "7d5ce05b3d592e63f1f92202d52b9635"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:15:11.869420 master-0 kubenswrapper[29612]: I0319 12:15:11.869364 29612 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:11.869420 master-0 kubenswrapper[29612]: I0319 12:15:11.869392 29612 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:11.869420 master-0 kubenswrapper[29612]: I0319 12:15:11.869411 29612 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d5ce05b3d592e63f1f92202d52b9635-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:15:12.431885 master-0 kubenswrapper[29612]: I0319 12:15:12.431809 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7d5ce05b3d592e63f1f92202d52b9635/kube-apiserver-cert-syncer/0.log" Mar 19 12:15:12.432930 master-0 kubenswrapper[29612]: I0319 12:15:12.432881 29612 generic.go:334] "Generic (PLEG): container finished" podID="7d5ce05b3d592e63f1f92202d52b9635" containerID="b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834" exitCode=0 Mar 19 12:15:12.433004 master-0 kubenswrapper[29612]: I0319 12:15:12.432951 29612 scope.go:117] "RemoveContainer" containerID="a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3" Mar 19 12:15:12.433125 master-0 kubenswrapper[29612]: I0319 12:15:12.433098 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:15:12.458295 master-0 kubenswrapper[29612]: I0319 12:15:12.458241 29612 scope.go:117] "RemoveContainer" containerID="623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded" Mar 19 12:15:12.458910 master-0 kubenswrapper[29612]: I0319 12:15:12.458867 29612 status_manager.go:851] "Failed to get status for pod" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:12.459830 master-0 kubenswrapper[29612]: I0319 12:15:12.459497 29612 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:12.480947 master-0 kubenswrapper[29612]: I0319 12:15:12.480905 29612 scope.go:117] "RemoveContainer" containerID="71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8" Mar 19 12:15:12.504089 master-0 kubenswrapper[29612]: I0319 12:15:12.504029 29612 scope.go:117] "RemoveContainer" containerID="d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e" Mar 19 12:15:12.523540 master-0 kubenswrapper[29612]: I0319 12:15:12.523496 29612 scope.go:117] "RemoveContainer" containerID="b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834" Mar 19 12:15:12.543138 master-0 kubenswrapper[29612]: I0319 12:15:12.543091 29612 scope.go:117] "RemoveContainer" containerID="3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa" Mar 19 12:15:12.563200 master-0 kubenswrapper[29612]: I0319 12:15:12.563166 29612 scope.go:117] "RemoveContainer" 
containerID="a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3" Mar 19 12:15:12.563682 master-0 kubenswrapper[29612]: E0319 12:15:12.563651 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3\": container with ID starting with a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3 not found: ID does not exist" containerID="a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3" Mar 19 12:15:12.563745 master-0 kubenswrapper[29612]: I0319 12:15:12.563690 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3"} err="failed to get container status \"a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3\": rpc error: code = NotFound desc = could not find container \"a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3\": container with ID starting with a783b43f8c72ab9e393ecbc2db02fd01af40bf07eeadd7c466c82bb257b63ea3 not found: ID does not exist" Mar 19 12:15:12.563745 master-0 kubenswrapper[29612]: I0319 12:15:12.563713 29612 scope.go:117] "RemoveContainer" containerID="623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded" Mar 19 12:15:12.564205 master-0 kubenswrapper[29612]: E0319 12:15:12.564152 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded\": container with ID starting with 623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded not found: ID does not exist" containerID="623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded" Mar 19 12:15:12.564263 master-0 kubenswrapper[29612]: I0319 12:15:12.564223 29612 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded"} err="failed to get container status \"623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded\": rpc error: code = NotFound desc = could not find container \"623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded\": container with ID starting with 623216ee21b6ef288227ab4f3a4b7d9965e7ecab7c53312a64eb67d262941ded not found: ID does not exist" Mar 19 12:15:12.564263 master-0 kubenswrapper[29612]: I0319 12:15:12.564252 29612 scope.go:117] "RemoveContainer" containerID="71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8" Mar 19 12:15:12.565450 master-0 kubenswrapper[29612]: E0319 12:15:12.565390 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8\": container with ID starting with 71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8 not found: ID does not exist" containerID="71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8" Mar 19 12:15:12.565450 master-0 kubenswrapper[29612]: I0319 12:15:12.565418 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8"} err="failed to get container status \"71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8\": rpc error: code = NotFound desc = could not find container \"71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8\": container with ID starting with 71702e4b13652a19ffb4a888110af0563d13f9b3eaf31e66844a9d807c8554d8 not found: ID does not exist" Mar 19 12:15:12.565450 master-0 kubenswrapper[29612]: I0319 12:15:12.565436 29612 scope.go:117] "RemoveContainer" containerID="d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e" Mar 19 12:15:12.565861 master-0 kubenswrapper[29612]: E0319 
12:15:12.565828 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e\": container with ID starting with d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e not found: ID does not exist" containerID="d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e" Mar 19 12:15:12.565930 master-0 kubenswrapper[29612]: I0319 12:15:12.565857 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e"} err="failed to get container status \"d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e\": rpc error: code = NotFound desc = could not find container \"d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e\": container with ID starting with d2a4d130a119c8267b863410d18ad26371afab594744f7cf545416982db3637e not found: ID does not exist" Mar 19 12:15:12.565930 master-0 kubenswrapper[29612]: I0319 12:15:12.565893 29612 scope.go:117] "RemoveContainer" containerID="b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834" Mar 19 12:15:12.566709 master-0 kubenswrapper[29612]: E0319 12:15:12.566666 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834\": container with ID starting with b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834 not found: ID does not exist" containerID="b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834" Mar 19 12:15:12.566766 master-0 kubenswrapper[29612]: I0319 12:15:12.566711 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834"} err="failed to get container status 
\"b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834\": rpc error: code = NotFound desc = could not find container \"b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834\": container with ID starting with b3da9f60d6ccdd9223fdae83a2f696a1d03e5beab90c994157d26e8e84a42834 not found: ID does not exist" Mar 19 12:15:12.566766 master-0 kubenswrapper[29612]: I0319 12:15:12.566745 29612 scope.go:117] "RemoveContainer" containerID="3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa" Mar 19 12:15:12.567081 master-0 kubenswrapper[29612]: E0319 12:15:12.567042 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa\": container with ID starting with 3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa not found: ID does not exist" containerID="3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa" Mar 19 12:15:12.567081 master-0 kubenswrapper[29612]: I0319 12:15:12.567071 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa"} err="failed to get container status \"3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa\": rpc error: code = NotFound desc = could not find container \"3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa\": container with ID starting with 3e83064c6fd0a675d99d22edb20a0d6f623cee45277335a72f615f64309af6fa not found: ID does not exist" Mar 19 12:15:12.722887 master-0 kubenswrapper[29612]: E0319 12:15:12.722762 29612 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/events/console-66759f68dd-6zgw8.189e3d1b01273e32\": dial tcp 192.168.32.10:6443: connect: connection refused" event=< Mar 19 12:15:12.722887 master-0 
kubenswrapper[29612]: &Event{ObjectMeta:{console-66759f68dd-6zgw8.189e3d1b01273e32 openshift-console 18556 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-66759f68dd-6zgw8,UID:50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf,APIVersion:v1,ResourceVersion:18486,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe error: Get "https://10.128.0.111:8443/health": dial tcp 10.128.0.111:8443: connect: connection refused Mar 19 12:15:12.722887 master-0 kubenswrapper[29612]: body: Mar 19 12:15:12.722887 master-0 kubenswrapper[29612]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-19 12:14:49 +0000 UTC,LastTimestamp:2026-03-19 12:15:09.423218163 +0000 UTC m=+316.737387194,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 19 12:15:12.722887 master-0 kubenswrapper[29612]: > Mar 19 12:15:12.910652 master-0 kubenswrapper[29612]: I0319 12:15:12.910577 29612 status_manager.go:851] "Failed to get status for pod" podUID="7d5ce05b3d592e63f1f92202d52b9635" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:12.911131 master-0 kubenswrapper[29612]: I0319 12:15:12.911059 29612 status_manager.go:851] "Failed to get status for pod" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:12.914664 master-0 kubenswrapper[29612]: I0319 12:15:12.914607 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7d5ce05b3d592e63f1f92202d52b9635" path="/var/lib/kubelet/pods/7d5ce05b3d592e63f1f92202d52b9635/volumes" Mar 19 12:15:13.287209 master-0 kubenswrapper[29612]: E0319 12:15:13.287123 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 19 12:15:16.488832 master-0 kubenswrapper[29612]: E0319 12:15:16.488744 29612 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 19 12:15:19.423342 master-0 kubenswrapper[29612]: I0319 12:15:19.423260 29612 patch_prober.go:28] interesting pod/console-66759f68dd-6zgw8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused" start-of-body= Mar 19 12:15:19.424342 master-0 kubenswrapper[29612]: I0319 12:15:19.423413 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66759f68dd-6zgw8" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" probeResult="failure" output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused" Mar 19 12:15:20.776739 master-0 kubenswrapper[29612]: E0319 12:15:20.776663 29612 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:15:20Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:15:20Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:15:20Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T12:15:20Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:20.777390 master-0 kubenswrapper[29612]: E0319 12:15:20.777341 29612 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:20.777874 master-0 kubenswrapper[29612]: E0319 12:15:20.777835 29612 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:20.778547 master-0 kubenswrapper[29612]: E0319 12:15:20.778489 29612 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:20.779262 master-0 kubenswrapper[29612]: E0319 12:15:20.779217 29612 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:20.779262 master-0 kubenswrapper[29612]: E0319 12:15:20.779251 29612 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 12:15:20.814680 master-0 kubenswrapper[29612]: I0319 12:15:20.814585 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body= Mar 19 12:15:20.814907 master-0 kubenswrapper[29612]: I0319 12:15:20.814681 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" Mar 19 12:15:20.906188 master-0 kubenswrapper[29612]: I0319 12:15:20.906128 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:15:20.907417 master-0 kubenswrapper[29612]: I0319 12:15:20.907320 29612 status_manager.go:851] "Failed to get status for pod" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:20.931427 master-0 kubenswrapper[29612]: I0319 12:15:20.931370 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc" Mar 19 12:15:20.931427 master-0 kubenswrapper[29612]: I0319 12:15:20.931417 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc" Mar 19 12:15:20.932355 master-0 kubenswrapper[29612]: E0319 12:15:20.932284 29612 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:15:20.933252 master-0 kubenswrapper[29612]: I0319 12:15:20.933198 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:15:20.963266 master-0 kubenswrapper[29612]: W0319 12:15:20.963233 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod274c4bebf95a655851b2cf276fe43ef7.slice/crio-0e527dcfdc4a02fdf551f121132295ec77fbef7c6db8a61176c962e81aa100ca WatchSource:0}: Error finding container 0e527dcfdc4a02fdf551f121132295ec77fbef7c6db8a61176c962e81aa100ca: Status 404 returned error can't find the container with id 0e527dcfdc4a02fdf551f121132295ec77fbef7c6db8a61176c962e81aa100ca Mar 19 12:15:21.518555 master-0 kubenswrapper[29612]: I0319 12:15:21.518470 29612 generic.go:334] "Generic (PLEG): container finished" podID="274c4bebf95a655851b2cf276fe43ef7" containerID="51f16294236ee7dac15e79af0441fcc98b5b729ade90ab00c4290ee3a6db0010" exitCode=0 Mar 19 12:15:21.518555 master-0 kubenswrapper[29612]: I0319 12:15:21.518546 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerDied","Data":"51f16294236ee7dac15e79af0441fcc98b5b729ade90ab00c4290ee3a6db0010"} Mar 19 12:15:21.518555 master-0 kubenswrapper[29612]: I0319 12:15:21.518573 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"0e527dcfdc4a02fdf551f121132295ec77fbef7c6db8a61176c962e81aa100ca"} Mar 19 12:15:21.519068 master-0 kubenswrapper[29612]: I0319 12:15:21.518895 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc" Mar 19 12:15:21.519068 master-0 kubenswrapper[29612]: I0319 12:15:21.518911 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc" Mar 19 12:15:21.519626 master-0 kubenswrapper[29612]: I0319 12:15:21.519574 29612 status_manager.go:851] "Failed to get status for pod" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 19 12:15:21.519902 master-0 kubenswrapper[29612]: E0319 12:15:21.519788 29612 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:15:21.521772 master-0 kubenswrapper[29612]: I0319 12:15:21.521716 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/2.log" Mar 19 12:15:21.523680 master-0 kubenswrapper[29612]: I0319 12:15:21.523582 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log" Mar 19 12:15:21.524380 master-0 kubenswrapper[29612]: I0319 12:15:21.524328 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/1.log" Mar 19 12:15:21.526273 master-0 kubenswrapper[29612]: I0319 12:15:21.525969 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/0.log" Mar 19 12:15:21.526273 master-0 kubenswrapper[29612]: I0319 
12:15:21.526062 29612 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="617ff00631713680d2e186b6f3e8d34195bcb762d9f9e3d35f2fc08ecebe5a13" exitCode=1
Mar 19 12:15:21.526273 master-0 kubenswrapper[29612]: I0319 12:15:21.526120 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerDied","Data":"617ff00631713680d2e186b6f3e8d34195bcb762d9f9e3d35f2fc08ecebe5a13"}
Mar 19 12:15:21.526273 master-0 kubenswrapper[29612]: I0319 12:15:21.526182 29612 scope.go:117] "RemoveContainer" containerID="dfc697cd77881315e28c4429b102a58e90448d7960e79b2437cb6fdd1f387a62"
Mar 19 12:15:21.526526 master-0 kubenswrapper[29612]: I0319 12:15:21.526515 29612 scope.go:117] "RemoveContainer" containerID="617ff00631713680d2e186b6f3e8d34195bcb762d9f9e3d35f2fc08ecebe5a13"
Mar 19 12:15:21.526883 master-0 kubenswrapper[29612]: E0319 12:15:21.526816 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:15:21.528758 master-0 kubenswrapper[29612]: I0319 12:15:21.528697 29612 status_manager.go:851] "Failed to get status for pod" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:21.529222 master-0 kubenswrapper[29612]: I0319 12:15:21.529186 29612 status_manager.go:851] "Failed to get status for pod" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" pod="openshift-kube-apiserver/installer-6-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-6-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 19 12:15:22.565474 master-0 kubenswrapper[29612]: I0319 12:15:22.565405 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/2.log"
Mar 19 12:15:22.567244 master-0 kubenswrapper[29612]: I0319 12:15:22.567170 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log"
Mar 19 12:15:22.570144 master-0 kubenswrapper[29612]: I0319 12:15:22.570058 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/0.log"
Mar 19 12:15:22.572442 master-0 kubenswrapper[29612]: I0319 12:15:22.572398 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"18f3e4604e81a05fe8e3a512201671b5a29c55715732287604700f89ed1972f8"}
Mar 19 12:15:22.572442 master-0 kubenswrapper[29612]: I0319 12:15:22.572439 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"ac3aa7a695fd359350e99e407d1a74d7652251ad1e747e8ff33ad04deee2aa91"}
Mar 19 12:15:22.572586 master-0 kubenswrapper[29612]: I0319 12:15:22.572454 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"8f08211c58174efe9129266906b90deda55cbb73457ea6385eeeaf81914641a8"}
Mar 19 12:15:23.582534 master-0 kubenswrapper[29612]: I0319 12:15:23.582474 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"1e2731f41893568b0b3afbb14c0e6bbf7b41b2fe755beb2358f1da9ba3549df6"}
Mar 19 12:15:23.582534 master-0 kubenswrapper[29612]: I0319 12:15:23.582525 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"44492df48f3e6e5c6d7924cf87b7f2b6dc5844503edc151818f578faddc32a9f"}
Mar 19 12:15:23.583107 master-0 kubenswrapper[29612]: I0319 12:15:23.582652 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:23.583107 master-0 kubenswrapper[29612]: I0319 12:15:23.582836 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc"
Mar 19 12:15:23.583107 master-0 kubenswrapper[29612]: I0319 12:15:23.582853 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc"
Mar 19 12:15:24.671728 master-0 kubenswrapper[29612]: I0319 12:15:24.671672 29612 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:15:24.672444 master-0 kubenswrapper[29612]: I0319 12:15:24.672293 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:15:24.672444 master-0 kubenswrapper[29612]: I0319 12:15:24.672318 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:15:24.673166 master-0 kubenswrapper[29612]: I0319 12:15:24.673026 29612 scope.go:117] "RemoveContainer" containerID="617ff00631713680d2e186b6f3e8d34195bcb762d9f9e3d35f2fc08ecebe5a13"
Mar 19 12:15:24.673526 master-0 kubenswrapper[29612]: E0319 12:15:24.673488 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:15:25.686718 master-0 kubenswrapper[29612]: I0319 12:15:25.686555 29612 scope.go:117] "RemoveContainer" containerID="617ff00631713680d2e186b6f3e8d34195bcb762d9f9e3d35f2fc08ecebe5a13"
Mar 19 12:15:25.687260 master-0 kubenswrapper[29612]: E0319 12:15:25.687189 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager pod=kube-controller-manager-master-0_openshift-kube-controller-manager(91ced7d9a1e8a2019ac2b6a262f0d19e)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e"
Mar 19 12:15:25.933446 master-0 kubenswrapper[29612]: I0319 12:15:25.933387 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:25.933446 master-0 kubenswrapper[29612]: I0319 12:15:25.933440 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:25.939474 master-0 kubenswrapper[29612]: I0319 12:15:25.939370 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:29.423396 master-0 kubenswrapper[29612]: I0319 12:15:29.423322 29612 patch_prober.go:28] interesting pod/console-66759f68dd-6zgw8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused" start-of-body=
Mar 19 12:15:29.424124 master-0 kubenswrapper[29612]: I0319 12:15:29.423406 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66759f68dd-6zgw8" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" probeResult="failure" output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused"
Mar 19 12:15:29.447975 master-0 kubenswrapper[29612]: I0319 12:15:29.447892 29612 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:29.716559 master-0 kubenswrapper[29612]: I0319 12:15:29.716437 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_274c4bebf95a655851b2cf276fe43ef7/kube-apiserver-check-endpoints/0.log"
Mar 19 12:15:29.718937 master-0 kubenswrapper[29612]: I0319 12:15:29.718895 29612 generic.go:334] "Generic (PLEG): container finished" podID="274c4bebf95a655851b2cf276fe43ef7" containerID="1e2731f41893568b0b3afbb14c0e6bbf7b41b2fe755beb2358f1da9ba3549df6" exitCode=255
Mar 19 12:15:29.719029 master-0 kubenswrapper[29612]: I0319 12:15:29.718938 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerDied","Data":"1e2731f41893568b0b3afbb14c0e6bbf7b41b2fe755beb2358f1da9ba3549df6"}
Mar 19 12:15:29.719355 master-0 kubenswrapper[29612]: I0319 12:15:29.719320 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc"
Mar 19 12:15:29.719355 master-0 kubenswrapper[29612]: I0319 12:15:29.719348 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc"
Mar 19 12:15:29.722282 master-0 kubenswrapper[29612]: I0319 12:15:29.722258 29612 scope.go:117] "RemoveContainer" containerID="1e2731f41893568b0b3afbb14c0e6bbf7b41b2fe755beb2358f1da9ba3549df6"
Mar 19 12:15:29.723461 master-0 kubenswrapper[29612]: I0319 12:15:29.723346 29612 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="274c4bebf95a655851b2cf276fe43ef7" podUID="62b10b7f-5004-42fd-8c1a-31699d512f26"
Mar 19 12:15:29.724559 master-0 kubenswrapper[29612]: I0319 12:15:29.724525 29612 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-master-0" containerID="cri-o://8f08211c58174efe9129266906b90deda55cbb73457ea6385eeeaf81914641a8"
Mar 19 12:15:29.724559 master-0 kubenswrapper[29612]: I0319 12:15:29.724556 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:30.730101 master-0 kubenswrapper[29612]: I0319 12:15:30.730027 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_274c4bebf95a655851b2cf276fe43ef7/kube-apiserver-check-endpoints/0.log"
Mar 19 12:15:30.732695 master-0 kubenswrapper[29612]: I0319 12:15:30.732654 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"274c4bebf95a655851b2cf276fe43ef7","Type":"ContainerStarted","Data":"90916191d1aee57794ff1fad989a0b688a7821bd5744be4ebc70e8726d15d0c5"}
Mar 19 12:15:30.732857 master-0 kubenswrapper[29612]: I0319 12:15:30.732840 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 19 12:15:30.732979 master-0 kubenswrapper[29612]: I0319 12:15:30.732938 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc"
Mar 19 12:15:30.732979 master-0 kubenswrapper[29612]: I0319 12:15:30.732968 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc"
Mar 19 12:15:30.815042 master-0 kubenswrapper[29612]: I0319 12:15:30.814958 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body=
Mar 19 12:15:30.815250 master-0 kubenswrapper[29612]: I0319 12:15:30.815109 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused"
Mar 19 12:15:31.740465 master-0 kubenswrapper[29612]: I0319 12:15:31.739711 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc"
Mar 19 12:15:31.740465 master-0 kubenswrapper[29612]: I0319 12:15:31.739753 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e03eadfa-f7fe-4083-be01-819d071c1dfc"
Mar 19 12:15:32.923572 master-0 kubenswrapper[29612]: I0319 12:15:32.923503 29612 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="274c4bebf95a655851b2cf276fe43ef7" podUID="62b10b7f-5004-42fd-8c1a-31699d512f26"
Mar 19 12:15:36.091481 master-0 kubenswrapper[29612]: I0319 12:15:36.091407 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 12:15:37.136973 master-0 kubenswrapper[29612]: I0319 12:15:37.136900 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 12:15:37.492068 master-0 kubenswrapper[29612]: I0319 12:15:37.492027 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 19 12:15:37.727953 master-0 kubenswrapper[29612]: I0319 12:15:37.727877 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 12:15:38.907549 master-0 kubenswrapper[29612]: I0319 12:15:38.907475 29612 scope.go:117] "RemoveContainer" containerID="617ff00631713680d2e186b6f3e8d34195bcb762d9f9e3d35f2fc08ecebe5a13"
Mar 19 12:15:38.959910 master-0 kubenswrapper[29612]: I0319 12:15:38.959862 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 12:15:39.422479 master-0 kubenswrapper[29612]: I0319 12:15:39.422366 29612 patch_prober.go:28] interesting pod/console-66759f68dd-6zgw8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused" start-of-body=
Mar 19 12:15:39.422479 master-0 kubenswrapper[29612]: I0319 12:15:39.422467 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66759f68dd-6zgw8" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" probeResult="failure" output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused"
Mar 19 12:15:39.805319 master-0 kubenswrapper[29612]: I0319 12:15:39.805231 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/2.log"
Mar 19 12:15:39.807257 master-0 kubenswrapper[29612]: I0319 12:15:39.807164 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log"
Mar 19 12:15:39.808553 master-0 kubenswrapper[29612]: I0319 12:15:39.808517 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/0.log"
Mar 19 12:15:39.808738 master-0 kubenswrapper[29612]: I0319 12:15:39.808578 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"91ced7d9a1e8a2019ac2b6a262f0d19e","Type":"ContainerStarted","Data":"56f3d6737f59d56669808e7d85a508e574b8afc56126aef4f40a1770b36e3019"}
Mar 19 12:15:39.894865 master-0 kubenswrapper[29612]: I0319 12:15:39.894797 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 19 12:15:39.924481 master-0 kubenswrapper[29612]: I0319 12:15:39.924423 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 12:15:39.972304 master-0 kubenswrapper[29612]: I0319 12:15:39.972213 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 12:15:40.015205 master-0 kubenswrapper[29612]: I0319 12:15:40.015133 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 12:15:40.229989 master-0 kubenswrapper[29612]: I0319 12:15:40.229919 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 19 12:15:40.293704 master-0 kubenswrapper[29612]: I0319 12:15:40.293518 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 12:15:40.599303 master-0 kubenswrapper[29612]: I0319 12:15:40.599075 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 12:15:40.814454 master-0 kubenswrapper[29612]: I0319 12:15:40.814336 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body=
Mar 19 12:15:40.814454 master-0 kubenswrapper[29612]: I0319 12:15:40.814440 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused"
Mar 19 12:15:40.954764 master-0 kubenswrapper[29612]: I0319 12:15:40.954627 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 12:15:41.009479 master-0 kubenswrapper[29612]: I0319 12:15:41.009415 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 12:15:41.098903 master-0 kubenswrapper[29612]: I0319 12:15:41.098829 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 19 12:15:41.126848 master-0 kubenswrapper[29612]: I0319 12:15:41.126770 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-bgrvv"
Mar 19 12:15:41.429850 master-0 kubenswrapper[29612]: I0319 12:15:41.429790 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 19 12:15:41.430857 master-0 kubenswrapper[29612]: I0319 12:15:41.430823 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 12:15:41.437061 master-0 kubenswrapper[29612]: I0319 12:15:41.437011 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 12:15:41.703448 master-0 kubenswrapper[29612]: I0319 12:15:41.703308 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 19 12:15:41.755278 master-0 kubenswrapper[29612]: I0319 12:15:41.755201 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 12:15:41.816984 master-0 kubenswrapper[29612]: I0319 12:15:41.816892 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 19 12:15:41.853524 master-0 kubenswrapper[29612]: I0319 12:15:41.853421 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 12:15:42.052075 master-0 kubenswrapper[29612]: I0319 12:15:42.051926 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 12:15:42.132117 master-0 kubenswrapper[29612]: I0319 12:15:42.130734 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 19 12:15:42.132392 master-0 kubenswrapper[29612]: I0319 12:15:42.132293 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 19 12:15:42.246757 master-0 kubenswrapper[29612]: I0319 12:15:42.246682 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 19 12:15:42.279583 master-0 kubenswrapper[29612]: I0319 12:15:42.278299 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 12:15:42.279723 master-0 kubenswrapper[29612]: I0319 12:15:42.279696 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 12:15:42.281477 master-0 kubenswrapper[29612]: I0319 12:15:42.281441 29612 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 12:15:42.343044 master-0 kubenswrapper[29612]: I0319 12:15:42.342992 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 12:15:42.436493 master-0 kubenswrapper[29612]: I0319 12:15:42.436435 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 12:15:42.458821 master-0 kubenswrapper[29612]: I0319 12:15:42.458734 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 12:15:42.568053 master-0 kubenswrapper[29612]: I0319 12:15:42.567890 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:15:42.581740 master-0 kubenswrapper[29612]: I0319 12:15:42.581442 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 19 12:15:42.677581 master-0 kubenswrapper[29612]: I0319 12:15:42.677487 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 12:15:42.774558 master-0 kubenswrapper[29612]: I0319 12:15:42.774507 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 19 12:15:42.829894 master-0 kubenswrapper[29612]: I0319 12:15:42.829765 29612 generic.go:334] "Generic (PLEG): container finished" podID="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" containerID="5b15f71769cb166bbe1854fa275ab6231d488a761919449830f9713c4bfd28ed" exitCode=0
Mar 19 12:15:42.829894 master-0 kubenswrapper[29612]: I0319 12:15:42.829816 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" event={"ID":"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c","Type":"ContainerDied","Data":"5b15f71769cb166bbe1854fa275ab6231d488a761919449830f9713c4bfd28ed"}
Mar 19 12:15:42.891470 master-0 kubenswrapper[29612]: I0319 12:15:42.891389 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 19 12:15:42.925306 master-0 kubenswrapper[29612]: I0319 12:15:42.925235 29612 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 12:15:42.962915 master-0 kubenswrapper[29612]: I0319 12:15:42.962833 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 19 12:15:43.007923 master-0 kubenswrapper[29612]: I0319 12:15:43.007851 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 19 12:15:43.069923 master-0 kubenswrapper[29612]: I0319 12:15:43.069841 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 12:15:43.103474 master-0 kubenswrapper[29612]: I0319 12:15:43.103346 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 12:15:43.135288 master-0 kubenswrapper[29612]: I0319 12:15:43.132009 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 12:15:43.150017 master-0 kubenswrapper[29612]: I0319 12:15:43.149955 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 12:15:43.192190 master-0 kubenswrapper[29612]: I0319 12:15:43.192152 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 12:15:43.192478 master-0 kubenswrapper[29612]: I0319 12:15:43.192165 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh"
Mar 19 12:15:43.194967 master-0 kubenswrapper[29612]: I0319 12:15:43.194933 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs\") pod \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") "
Mar 19 12:15:43.195057 master-0 kubenswrapper[29612]: I0319 12:15:43.194995 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle\") pod \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") "
Mar 19 12:15:43.195094 master-0 kubenswrapper[29612]: I0319 12:15:43.195064 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-audit-log\") pod \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") "
Mar 19 12:15:43.195153 master-0 kubenswrapper[29612]: I0319 12:15:43.195133 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpdhv\" (UniqueName: \"kubernetes.io/projected/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-kube-api-access-hpdhv\") pod \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") "
Mar 19 12:15:43.195204 master-0 kubenswrapper[29612]: I0319 12:15:43.195168 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles\") pod \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") "
Mar 19 12:15:43.195204 master-0 kubenswrapper[29612]: I0319 12:15:43.195188 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls\") pod \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") "
Mar 19 12:15:43.195290 master-0 kubenswrapper[29612]: I0319 12:15:43.195238 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle\") pod \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\" (UID: \"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c\") "
Mar 19 12:15:43.196398 master-0 kubenswrapper[29612]: I0319 12:15:43.196309 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-audit-log" (OuterVolumeSpecName: "audit-log") pod "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:15:43.196398 master-0 kubenswrapper[29612]: I0319 12:15:43.196371 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:15:43.196724 master-0 kubenswrapper[29612]: I0319 12:15:43.196577 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:15:43.199084 master-0 kubenswrapper[29612]: I0319 12:15:43.199028 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:15:43.199575 master-0 kubenswrapper[29612]: I0319 12:15:43.199548 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:15:43.199809 master-0 kubenswrapper[29612]: I0319 12:15:43.199746 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-kube-api-access-hpdhv" (OuterVolumeSpecName: "kube-api-access-hpdhv") pod "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c"). InnerVolumeSpecName "kube-api-access-hpdhv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:15:43.199809 master-0 kubenswrapper[29612]: I0319 12:15:43.199783 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" (UID: "4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:15:43.266846 master-0 kubenswrapper[29612]: I0319 12:15:43.266740 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-zgw2m"
Mar 19 12:15:43.270958 master-0 kubenswrapper[29612]: I0319 12:15:43.270908 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 12:15:43.297585 master-0 kubenswrapper[29612]: I0319 12:15:43.297475 29612 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-client-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:43.297585 master-0 kubenswrapper[29612]: I0319 12:15:43.297550 29612 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-audit-log\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:43.297585 master-0 kubenswrapper[29612]: I0319 12:15:43.297561 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpdhv\" (UniqueName: \"kubernetes.io/projected/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-kube-api-access-hpdhv\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:43.297585 master-0 kubenswrapper[29612]: I0319 12:15:43.297575 29612 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-metrics-server-audit-profiles\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:43.297585 master-0 kubenswrapper[29612]: I0319 12:15:43.297586 29612 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-server-tls\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:43.297585 master-0 kubenswrapper[29612]: I0319 12:15:43.297600 29612 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:43.297585 master-0 kubenswrapper[29612]: I0319 12:15:43.297614 29612 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:43.332137 master-0 kubenswrapper[29612]: I0319 12:15:43.332078 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 12:15:43.413759 master-0 kubenswrapper[29612]: I0319 12:15:43.413717 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 12:15:43.457521 master-0 kubenswrapper[29612]: I0319 12:15:43.457311 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-atris4eemsst"
Mar 19 12:15:43.481070 master-0 kubenswrapper[29612]: I0319 12:15:43.481017 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 19 12:15:43.513587 master-0 kubenswrapper[29612]: I0319 12:15:43.513524 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 12:15:43.537377 master-0 kubenswrapper[29612]: I0319 12:15:43.537321 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 19 12:15:43.554362 master-0 kubenswrapper[29612]: I0319 12:15:43.554321 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 12:15:43.594930 master-0 kubenswrapper[29612]: I0319 12:15:43.594881 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 19 12:15:43.646038 master-0 kubenswrapper[29612]: I0319 12:15:43.645975 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 19 12:15:43.689214 master-0 kubenswrapper[29612]: I0319 12:15:43.689093 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 12:15:43.728192 master-0 kubenswrapper[29612]: I0319 12:15:43.728151 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 12:15:43.753368 master-0 kubenswrapper[29612]: I0319 12:15:43.753299 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-6p6sh"
Mar 19 12:15:43.809661 master-0 kubenswrapper[29612]: I0319 12:15:43.809563 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 12:15:43.859158 master-0 kubenswrapper[29612]: I0319 12:15:43.858991 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 12:15:43.864993 master-0 kubenswrapper[29612]: I0319 12:15:43.864891 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" event={"ID":"4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c","Type":"ContainerDied","Data":"4daf44a10b1f8c76d5c5e205f3a48dbc84e86cc93251b0495b68e80ef60f9d35"}
Mar 19 12:15:43.865186 master-0 kubenswrapper[29612]: I0319 12:15:43.865002 29612 scope.go:117] "RemoveContainer" containerID="5b15f71769cb166bbe1854fa275ab6231d488a761919449830f9713c4bfd28ed"
Mar 19 12:15:43.865302
master-0 kubenswrapper[29612]: I0319 12:15:43.865211 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-dc85487cf-gsmlh" Mar 19 12:15:43.920446 master-0 kubenswrapper[29612]: I0319 12:15:43.920396 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 12:15:43.980755 master-0 kubenswrapper[29612]: I0319 12:15:43.980590 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 12:15:43.999889 master-0 kubenswrapper[29612]: I0319 12:15:43.999811 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 12:15:44.109228 master-0 kubenswrapper[29612]: I0319 12:15:44.109096 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 19 12:15:44.115542 master-0 kubenswrapper[29612]: I0319 12:15:44.115471 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 12:15:44.130113 master-0 kubenswrapper[29612]: I0319 12:15:44.130047 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 12:15:44.139233 master-0 kubenswrapper[29612]: I0319 12:15:44.139200 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 19 12:15:44.183718 master-0 kubenswrapper[29612]: I0319 12:15:44.183615 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 12:15:44.280001 master-0 kubenswrapper[29612]: I0319 12:15:44.279803 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 
12:15:44.307751 master-0 kubenswrapper[29612]: I0319 12:15:44.307665 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 12:15:44.438060 master-0 kubenswrapper[29612]: I0319 12:15:44.437973 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 12:15:44.504303 master-0 kubenswrapper[29612]: I0319 12:15:44.504203 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 12:15:44.577900 master-0 kubenswrapper[29612]: I0319 12:15:44.577547 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:15:44.577900 master-0 kubenswrapper[29612]: I0319 12:15:44.577613 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:15:44.586922 master-0 kubenswrapper[29612]: I0319 12:15:44.586871 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:15:44.610382 master-0 kubenswrapper[29612]: I0319 12:15:44.610330 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 12:15:44.617366 master-0 kubenswrapper[29612]: I0319 12:15:44.617320 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 19 12:15:44.622705 master-0 kubenswrapper[29612]: I0319 12:15:44.622618 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 12:15:44.642948 master-0 kubenswrapper[29612]: I0319 12:15:44.642871 29612 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 12:15:44.672418 master-0 kubenswrapper[29612]: I0319 12:15:44.672324 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 12:15:44.685992 master-0 kubenswrapper[29612]: I0319 12:15:44.685927 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 12:15:44.752740 master-0 kubenswrapper[29612]: I0319 12:15:44.752669 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 19 12:15:44.829072 master-0 kubenswrapper[29612]: I0319 12:15:44.828932 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 19 12:15:44.838240 master-0 kubenswrapper[29612]: I0319 12:15:44.838201 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 19 12:15:44.923717 master-0 kubenswrapper[29612]: I0319 12:15:44.923669 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 12:15:45.037599 master-0 kubenswrapper[29612]: I0319 12:15:45.037514 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 19 12:15:45.070502 master-0 kubenswrapper[29612]: I0319 12:15:45.070434 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 19 12:15:45.114521 master-0 kubenswrapper[29612]: I0319 12:15:45.114400 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 19 12:15:45.169745 master-0 kubenswrapper[29612]: I0319 12:15:45.169511 29612 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 12:15:45.226081 master-0 kubenswrapper[29612]: I0319 12:15:45.226030 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 12:15:45.244563 master-0 kubenswrapper[29612]: I0319 12:15:45.244505 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 19 12:15:45.307187 master-0 kubenswrapper[29612]: I0319 12:15:45.307129 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 12:15:45.324179 master-0 kubenswrapper[29612]: I0319 12:15:45.324120 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 19 12:15:45.376238 master-0 kubenswrapper[29612]: I0319 12:15:45.376067 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 19 12:15:45.387282 master-0 kubenswrapper[29612]: I0319 12:15:45.387192 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 12:15:45.397485 master-0 kubenswrapper[29612]: I0319 12:15:45.397431 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 19 12:15:45.427129 master-0 kubenswrapper[29612]: I0319 12:15:45.427058 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 12:15:45.431126 master-0 kubenswrapper[29612]: I0319 12:15:45.431061 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-f7v29" Mar 19 12:15:45.447127 master-0 kubenswrapper[29612]: I0319 12:15:45.447057 29612 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 19 12:15:45.553788 master-0 kubenswrapper[29612]: I0319 12:15:45.553718 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 12:15:45.620033 master-0 kubenswrapper[29612]: I0319 12:15:45.619993 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9ww6r" Mar 19 12:15:45.631456 master-0 kubenswrapper[29612]: I0319 12:15:45.631351 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 12:15:45.636624 master-0 kubenswrapper[29612]: I0319 12:15:45.636592 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 12:15:45.637270 master-0 kubenswrapper[29612]: I0319 12:15:45.637253 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 12:15:45.646419 master-0 kubenswrapper[29612]: I0319 12:15:45.646399 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 12:15:45.730562 master-0 kubenswrapper[29612]: I0319 12:15:45.730496 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 12:15:45.737151 master-0 kubenswrapper[29612]: I0319 12:15:45.736424 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-e7dkr757t87sf" Mar 19 12:15:45.866339 master-0 kubenswrapper[29612]: I0319 12:15:45.866260 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 12:15:45.868462 master-0 kubenswrapper[29612]: I0319 12:15:45.868404 29612 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-jj7tw" Mar 19 12:15:45.925832 master-0 kubenswrapper[29612]: I0319 12:15:45.925760 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 12:15:46.033800 master-0 kubenswrapper[29612]: I0319 12:15:46.033745 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 12:15:46.036099 master-0 kubenswrapper[29612]: I0319 12:15:46.036062 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 12:15:46.049604 master-0 kubenswrapper[29612]: I0319 12:15:46.049543 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-g78hg" Mar 19 12:15:46.054796 master-0 kubenswrapper[29612]: I0319 12:15:46.054769 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 19 12:15:46.137784 master-0 kubenswrapper[29612]: I0319 12:15:46.137727 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 19 12:15:46.181942 master-0 kubenswrapper[29612]: I0319 12:15:46.181744 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 12:15:46.209934 master-0 kubenswrapper[29612]: I0319 12:15:46.209866 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 12:15:46.307201 master-0 kubenswrapper[29612]: I0319 12:15:46.307134 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 12:15:46.313984 master-0 kubenswrapper[29612]: I0319 12:15:46.313928 29612 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 12:15:46.389603 master-0 kubenswrapper[29612]: I0319 12:15:46.389559 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 12:15:46.394487 master-0 kubenswrapper[29612]: I0319 12:15:46.394428 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 12:15:46.442538 master-0 kubenswrapper[29612]: I0319 12:15:46.440875 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 12:15:46.496007 master-0 kubenswrapper[29612]: I0319 12:15:46.495940 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 12:15:46.500309 master-0 kubenswrapper[29612]: I0319 12:15:46.500258 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 12:15:46.523619 master-0 kubenswrapper[29612]: I0319 12:15:46.523529 29612 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 12:15:46.541687 master-0 kubenswrapper[29612]: I0319 12:15:46.541240 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-3m82ns9hegv5s" Mar 19 12:15:46.541687 master-0 kubenswrapper[29612]: I0319 12:15:46.541573 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 12:15:46.546900 master-0 kubenswrapper[29612]: I0319 12:15:46.545426 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0","openshift-monitoring/metrics-server-dc85487cf-gsmlh"] Mar 19 12:15:46.546900 master-0 kubenswrapper[29612]: I0319 12:15:46.546288 29612 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 19 12:15:46.566664 master-0 kubenswrapper[29612]: I0319 12:15:46.553170 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-56gmj" Mar 19 12:15:46.575660 master-0 kubenswrapper[29612]: I0319 12:15:46.570253 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 19 12:15:46.593793 master-0 kubenswrapper[29612]: I0319 12:15:46.592212 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=17.592187996 podStartE2EDuration="17.592187996s" podCreationTimestamp="2026-03-19 12:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:15:46.586625539 +0000 UTC m=+353.900794580" watchObservedRunningTime="2026-03-19 12:15:46.592187996 +0000 UTC m=+353.906357027" Mar 19 12:15:46.600196 master-0 kubenswrapper[29612]: I0319 12:15:46.600117 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 12:15:46.620729 master-0 kubenswrapper[29612]: I0319 12:15:46.620677 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 12:15:46.620945 master-0 kubenswrapper[29612]: I0319 12:15:46.620916 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 12:15:46.629119 master-0 kubenswrapper[29612]: I0319 12:15:46.629087 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 12:15:46.678286 master-0 
kubenswrapper[29612]: I0319 12:15:46.678215 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-wn2jn" Mar 19 12:15:46.695345 master-0 kubenswrapper[29612]: I0319 12:15:46.695263 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 19 12:15:46.709020 master-0 kubenswrapper[29612]: I0319 12:15:46.708988 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-kl9xx" Mar 19 12:15:46.747783 master-0 kubenswrapper[29612]: I0319 12:15:46.747687 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 12:15:46.782425 master-0 kubenswrapper[29612]: I0319 12:15:46.782370 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 19 12:15:46.836444 master-0 kubenswrapper[29612]: I0319 12:15:46.836378 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 12:15:46.863828 master-0 kubenswrapper[29612]: I0319 12:15:46.863768 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 12:15:46.880432 master-0 kubenswrapper[29612]: I0319 12:15:46.880407 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 12:15:46.910006 master-0 kubenswrapper[29612]: I0319 12:15:46.909915 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 12:15:46.916524 master-0 kubenswrapper[29612]: I0319 12:15:46.916339 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" 
path="/var/lib/kubelet/pods/4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c/volumes" Mar 19 12:15:46.946120 master-0 kubenswrapper[29612]: I0319 12:15:46.945870 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 19 12:15:46.946455 master-0 kubenswrapper[29612]: I0319 12:15:46.946234 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 12:15:46.948998 master-0 kubenswrapper[29612]: I0319 12:15:46.948670 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-pshw6" Mar 19 12:15:47.012049 master-0 kubenswrapper[29612]: I0319 12:15:47.010723 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 12:15:47.096788 master-0 kubenswrapper[29612]: I0319 12:15:47.096736 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 12:15:47.196487 master-0 kubenswrapper[29612]: I0319 12:15:47.196341 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 19 12:15:47.210070 master-0 kubenswrapper[29612]: I0319 12:15:47.210014 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 19 12:15:47.212472 master-0 kubenswrapper[29612]: I0319 12:15:47.212433 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 12:15:47.262759 master-0 kubenswrapper[29612]: I0319 12:15:47.262698 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 12:15:47.484822 master-0 kubenswrapper[29612]: I0319 12:15:47.484519 29612 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 12:15:47.485017 master-0 kubenswrapper[29612]: I0319 12:15:47.484826 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 19 12:15:47.536691 master-0 kubenswrapper[29612]: I0319 12:15:47.535771 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 12:15:47.725862 master-0 kubenswrapper[29612]: I0319 12:15:47.725804 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 12:15:47.835867 master-0 kubenswrapper[29612]: I0319 12:15:47.835762 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 12:15:47.926597 master-0 kubenswrapper[29612]: I0319 12:15:47.926543 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 12:15:48.051783 master-0 kubenswrapper[29612]: I0319 12:15:48.051694 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 12:15:48.214817 master-0 kubenswrapper[29612]: I0319 12:15:48.214762 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 19 12:15:48.234846 master-0 kubenswrapper[29612]: I0319 12:15:48.234799 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 12:15:48.235056 master-0 kubenswrapper[29612]: I0319 12:15:48.235017 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 12:15:48.257370 master-0 kubenswrapper[29612]: I0319 
12:15:48.257306 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 12:15:48.273794 master-0 kubenswrapper[29612]: I0319 12:15:48.273731 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 12:15:48.281200 master-0 kubenswrapper[29612]: I0319 12:15:48.281160 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 12:15:48.452586 master-0 kubenswrapper[29612]: I0319 12:15:48.452529 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 12:15:48.500648 master-0 kubenswrapper[29612]: I0319 12:15:48.499977 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-gjszs" Mar 19 12:15:48.500648 master-0 kubenswrapper[29612]: I0319 12:15:48.500380 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 19 12:15:48.501608 master-0 kubenswrapper[29612]: I0319 12:15:48.501201 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 12:15:48.553815 master-0 kubenswrapper[29612]: I0319 12:15:48.553529 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 12:15:48.559703 master-0 kubenswrapper[29612]: I0319 12:15:48.559398 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 19 12:15:48.624650 master-0 kubenswrapper[29612]: I0319 12:15:48.624521 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-flatfile-config" 
Mar 19 12:15:48.710969 master-0 kubenswrapper[29612]: I0319 12:15:48.710546 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 12:15:48.714043 master-0 kubenswrapper[29612]: I0319 12:15:48.714019 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 19 12:15:48.735717 master-0 kubenswrapper[29612]: I0319 12:15:48.735664 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 12:15:48.749462 master-0 kubenswrapper[29612]: I0319 12:15:48.749403 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 12:15:48.777351 master-0 kubenswrapper[29612]: I0319 12:15:48.777207 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-xm92n" Mar 19 12:15:48.816679 master-0 kubenswrapper[29612]: I0319 12:15:48.816604 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 12:15:48.819131 master-0 kubenswrapper[29612]: I0319 12:15:48.819081 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 19 12:15:48.929611 master-0 kubenswrapper[29612]: I0319 12:15:48.929534 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 12:15:48.951997 master-0 kubenswrapper[29612]: I0319 12:15:48.951915 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 12:15:48.972618 master-0 kubenswrapper[29612]: I0319 12:15:48.972543 29612 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 12:15:49.060499 master-0 kubenswrapper[29612]: I0319 12:15:49.060389 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 12:15:49.102974 master-0 kubenswrapper[29612]: I0319 12:15:49.102935 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-mln6s" Mar 19 12:15:49.119318 master-0 kubenswrapper[29612]: I0319 12:15:49.119200 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 19 12:15:49.163172 master-0 kubenswrapper[29612]: I0319 12:15:49.163099 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 12:15:49.180565 master-0 kubenswrapper[29612]: I0319 12:15:49.180506 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 12:15:49.188884 master-0 kubenswrapper[29612]: I0319 12:15:49.188836 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 12:15:49.234655 master-0 kubenswrapper[29612]: I0319 12:15:49.234580 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 12:15:49.237876 master-0 kubenswrapper[29612]: I0319 12:15:49.237839 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 19 12:15:49.292189 master-0 kubenswrapper[29612]: I0319 12:15:49.292145 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 12:15:49.309841 master-0 kubenswrapper[29612]: I0319 12:15:49.309818 29612 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 12:15:49.419494 master-0 kubenswrapper[29612]: I0319 12:15:49.419439 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 19 12:15:49.422567 master-0 kubenswrapper[29612]: I0319 12:15:49.422544 29612 patch_prober.go:28] interesting pod/console-66759f68dd-6zgw8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused" start-of-body=
Mar 19 12:15:49.422745 master-0 kubenswrapper[29612]: I0319 12:15:49.422717 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66759f68dd-6zgw8" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" probeResult="failure" output="Get \"https://10.128.0.111:8443/health\": dial tcp 10.128.0.111:8443: connect: connection refused"
Mar 19 12:15:49.432270 master-0 kubenswrapper[29612]: I0319 12:15:49.432231 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 12:15:49.442439 master-0 kubenswrapper[29612]: I0319 12:15:49.442395 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 12:15:49.477163 master-0 kubenswrapper[29612]: I0319 12:15:49.477051 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 19 12:15:49.496845 master-0 kubenswrapper[29612]: I0319 12:15:49.496784 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 19 12:15:49.517769 master-0 kubenswrapper[29612]: I0319 12:15:49.517722 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 19 12:15:49.542426 master-0 kubenswrapper[29612]: I0319 12:15:49.542385 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4lzgv"
Mar 19 12:15:49.549268 master-0 kubenswrapper[29612]: I0319 12:15:49.549218 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 12:15:49.598823 master-0 kubenswrapper[29612]: I0319 12:15:49.598712 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 12:15:49.663503 master-0 kubenswrapper[29612]: I0319 12:15:49.663445 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 12:15:49.703721 master-0 kubenswrapper[29612]: I0319 12:15:49.703522 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-tsgvs"
Mar 19 12:15:49.846032 master-0 kubenswrapper[29612]: I0319 12:15:49.845979 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 12:15:49.922676 master-0 kubenswrapper[29612]: I0319 12:15:49.921271 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 12:15:49.937810 master-0 kubenswrapper[29612]: I0319 12:15:49.937757 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 12:15:49.947428 master-0 kubenswrapper[29612]: I0319 12:15:49.947380 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 19 12:15:49.981900 master-0 kubenswrapper[29612]: I0319 12:15:49.981753 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 12:15:49.993701 master-0 kubenswrapper[29612]: I0319 12:15:49.993609 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 12:15:49.994350 master-0 kubenswrapper[29612]: I0319 12:15:49.994309 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 12:15:49.999691 master-0 kubenswrapper[29612]: I0319 12:15:49.999588 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 12:15:50.043609 master-0 kubenswrapper[29612]: I0319 12:15:50.043532 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 19 12:15:50.069133 master-0 kubenswrapper[29612]: I0319 12:15:50.069077 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 12:15:50.126402 master-0 kubenswrapper[29612]: I0319 12:15:50.126333 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 12:15:50.139479 master-0 kubenswrapper[29612]: I0319 12:15:50.139431 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 12:15:50.176834 master-0 kubenswrapper[29612]: I0319 12:15:50.176778 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 19 12:15:50.423327 master-0 kubenswrapper[29612]: I0319 12:15:50.423254 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 19 12:15:50.428413 master-0 kubenswrapper[29612]: I0319 12:15:50.428340 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:15:50.443516 master-0 kubenswrapper[29612]: I0319 12:15:50.443459 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-2dl2n"
Mar 19 12:15:50.445502 master-0 kubenswrapper[29612]: I0319 12:15:50.445451 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 19 12:15:50.461015 master-0 kubenswrapper[29612]: I0319 12:15:50.460937 29612 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 12:15:50.492343 master-0 kubenswrapper[29612]: I0319 12:15:50.492283 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 12:15:50.507375 master-0 kubenswrapper[29612]: I0319 12:15:50.507313 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 19 12:15:50.527604 master-0 kubenswrapper[29612]: I0319 12:15:50.527534 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 12:15:50.583372 master-0 kubenswrapper[29612]: I0319 12:15:50.583287 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 19 12:15:50.612066 master-0 kubenswrapper[29612]: I0319 12:15:50.612002 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 12:15:50.706736 master-0 kubenswrapper[29612]: I0319 12:15:50.706618 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 12:15:50.711970 master-0 kubenswrapper[29612]: I0319 12:15:50.711923 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 12:15:50.712842 master-0 kubenswrapper[29612]: I0319 12:15:50.712823 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 12:15:50.728753 master-0 kubenswrapper[29612]: I0319 12:15:50.728688 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 12:15:50.737346 master-0 kubenswrapper[29612]: I0319 12:15:50.737311 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 12:15:50.756180 master-0 kubenswrapper[29612]: I0319 12:15:50.756129 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 19 12:15:50.814867 master-0 kubenswrapper[29612]: I0319 12:15:50.814758 29612 patch_prober.go:28] interesting pod/console-bfc49dbf9-mppbq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused" start-of-body=
Mar 19 12:15:50.814867 master-0 kubenswrapper[29612]: I0319 12:15:50.814844 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" probeResult="failure" output="Get \"https://10.128.0.107:8443/health\": dial tcp 10.128.0.107:8443: connect: connection refused"
Mar 19 12:15:50.877971 master-0 kubenswrapper[29612]: I0319 12:15:50.877926 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-nkt4p"
Mar 19 12:15:50.926802 master-0 kubenswrapper[29612]: I0319 12:15:50.926699 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-825dz"
Mar 19 12:15:50.966743 master-0 kubenswrapper[29612]: I0319 12:15:50.966258 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 19 12:15:50.966743 master-0 kubenswrapper[29612]: I0319 12:15:50.966620 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 19 12:15:50.992278 master-0 kubenswrapper[29612]: I0319 12:15:50.992204 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 19 12:15:51.100150 master-0 kubenswrapper[29612]: I0319 12:15:51.100096 29612 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 19 12:15:51.100357 master-0 kubenswrapper[29612]: I0319 12:15:51.100332 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor" containerID="cri-o://21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d" gracePeriod=5
Mar 19 12:15:51.207229 master-0 kubenswrapper[29612]: I0319 12:15:51.207165 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 12:15:51.222385 master-0 kubenswrapper[29612]: I0319 12:15:51.222269 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 12:15:51.260304 master-0 kubenswrapper[29612]: I0319 12:15:51.260244 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 12:15:51.265064 master-0 kubenswrapper[29612]: I0319 12:15:51.265015 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 12:15:51.319395 master-0 kubenswrapper[29612]: I0319 12:15:51.319339 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 12:15:51.329658 master-0 kubenswrapper[29612]: I0319 12:15:51.329600 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 12:15:51.339859 master-0 kubenswrapper[29612]: I0319 12:15:51.339818 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-d7x5g"
Mar 19 12:15:51.359077 master-0 kubenswrapper[29612]: I0319 12:15:51.358988 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 12:15:51.506765 master-0 kubenswrapper[29612]: I0319 12:15:51.506543 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 12:15:51.512801 master-0 kubenswrapper[29612]: I0319 12:15:51.512621 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 12:15:51.520419 master-0 kubenswrapper[29612]: I0319 12:15:51.520340 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 19 12:15:51.549358 master-0 kubenswrapper[29612]: I0319 12:15:51.549274 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 12:15:51.586384 master-0 kubenswrapper[29612]: I0319 12:15:51.586329 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 19 12:15:51.589237 master-0 kubenswrapper[29612]: I0319 12:15:51.589193 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 12:15:51.590844 master-0 kubenswrapper[29612]: I0319 12:15:51.590812 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 12:15:51.634654 master-0 kubenswrapper[29612]: I0319 12:15:51.634578 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 19 12:15:51.636227 master-0 kubenswrapper[29612]: I0319 12:15:51.636190 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 12:15:51.665056 master-0 kubenswrapper[29612]: I0319 12:15:51.664990 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-62ztt"
Mar 19 12:15:51.693658 master-0 kubenswrapper[29612]: I0319 12:15:51.693594 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 12:15:51.752013 master-0 kubenswrapper[29612]: I0319 12:15:51.751934 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 19 12:15:51.851737 master-0 kubenswrapper[29612]: I0319 12:15:51.851593 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 12:15:51.871291 master-0 kubenswrapper[29612]: I0319 12:15:51.871222 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-v6j8n"
Mar 19 12:15:51.988774 master-0 kubenswrapper[29612]: I0319 12:15:51.988725 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 12:15:52.013049 master-0 kubenswrapper[29612]: I0319 12:15:52.012101 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 12:15:52.048481 master-0 kubenswrapper[29612]: I0319 12:15:52.048419 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 12:15:52.085675 master-0 kubenswrapper[29612]: I0319 12:15:52.085541 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 12:15:52.090368 master-0 kubenswrapper[29612]: I0319 12:15:52.090307 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 19 12:15:52.266512 master-0 kubenswrapper[29612]: I0319 12:15:52.266450 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 12:15:52.297887 master-0 kubenswrapper[29612]: I0319 12:15:52.297838 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 19 12:15:52.411104 master-0 kubenswrapper[29612]: I0319 12:15:52.410692 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 12:15:52.428970 master-0 kubenswrapper[29612]: I0319 12:15:52.428938 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 12:15:52.433700 master-0 kubenswrapper[29612]: I0319 12:15:52.433654 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-dkps1gqvgkpdc"
Mar 19 12:15:52.434972 master-0 kubenswrapper[29612]: I0319 12:15:52.434906 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 12:15:52.637574 master-0 kubenswrapper[29612]: I0319 12:15:52.637451 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-qzhdr"
Mar 19 12:15:52.686368 master-0 kubenswrapper[29612]: I0319 12:15:52.686292 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-h5fcf"
Mar 19 12:15:52.729048 master-0 kubenswrapper[29612]: I0319 12:15:52.728997 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 19 12:15:52.931057 master-0 kubenswrapper[29612]: I0319 12:15:52.931010 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 19 12:15:52.968496 master-0 kubenswrapper[29612]: I0319 12:15:52.968448 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-m26t4"
Mar 19 12:15:52.984425 master-0 kubenswrapper[29612]: I0319 12:15:52.984377 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 19 12:15:53.011555 master-0 kubenswrapper[29612]: I0319 12:15:53.011500 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 12:15:53.028815 master-0 kubenswrapper[29612]: I0319 12:15:53.028772 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 12:15:53.053667 master-0 kubenswrapper[29612]: I0319 12:15:53.053614 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 12:15:53.068083 master-0 kubenswrapper[29612]: I0319 12:15:53.068029 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 19 12:15:53.292687 master-0 kubenswrapper[29612]: I0319 12:15:53.292562 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 12:15:53.312163 master-0 kubenswrapper[29612]: I0319 12:15:53.312115 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 12:15:53.358415 master-0 kubenswrapper[29612]: I0319 12:15:53.358355 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-nbxfb"
Mar 19 12:15:53.412818 master-0 kubenswrapper[29612]: I0319 12:15:53.412763 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 12:15:53.475320 master-0 kubenswrapper[29612]: I0319 12:15:53.475271 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 19 12:15:53.511203 master-0 kubenswrapper[29612]: I0319 12:15:53.511162 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 19 12:15:53.547794 master-0 kubenswrapper[29612]: I0319 12:15:53.547677 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 19 12:15:53.616098 master-0 kubenswrapper[29612]: I0319 12:15:53.616064 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-f27d9"
Mar 19 12:15:53.658850 master-0 kubenswrapper[29612]: I0319 12:15:53.658801 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-nghrc"
Mar 19 12:15:53.675442 master-0 kubenswrapper[29612]: I0319 12:15:53.675390 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-xsv5c"
Mar 19 12:15:53.676895 master-0 kubenswrapper[29612]: I0319 12:15:53.676871 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 12:15:53.873330 master-0 kubenswrapper[29612]: I0319 12:15:53.873225 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 19 12:15:53.887797 master-0 kubenswrapper[29612]: I0319 12:15:53.887734 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 19 12:15:53.893925 master-0 kubenswrapper[29612]: I0319 12:15:53.893870 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 19 12:15:54.079366 master-0 kubenswrapper[29612]: I0319 12:15:54.079274 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-k64d2"
Mar 19 12:15:54.127960 master-0 kubenswrapper[29612]: I0319 12:15:54.127836 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 12:15:54.200829 master-0 kubenswrapper[29612]: I0319 12:15:54.200774 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 12:15:54.238295 master-0 kubenswrapper[29612]: I0319 12:15:54.238240 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 12:15:54.310330 master-0 kubenswrapper[29612]: I0319 12:15:54.310273 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 12:15:54.341233 master-0 kubenswrapper[29612]: I0319 12:15:54.341081 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 12:15:54.543274 master-0 kubenswrapper[29612]: I0319 12:15:54.543210 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 12:15:54.543518 master-0 kubenswrapper[29612]: I0319 12:15:54.543430 29612 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 12:15:54.581050 master-0 kubenswrapper[29612]: I0319 12:15:54.580986 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 19 12:15:54.676028 master-0 kubenswrapper[29612]: I0319 12:15:54.675965 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 19 12:15:54.830447 master-0 kubenswrapper[29612]: I0319 12:15:54.830314 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 19 12:15:54.942600 master-0 kubenswrapper[29612]: I0319 12:15:54.942521 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 19 12:15:55.131417 master-0 kubenswrapper[29612]: I0319 12:15:55.131231 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 19 12:15:55.188199 master-0 kubenswrapper[29612]: I0319 12:15:55.188097 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 12:15:55.212508 master-0 kubenswrapper[29612]: I0319 12:15:55.212439 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-2pvrw"
Mar 19 12:15:55.271972 master-0 kubenswrapper[29612]: I0319 12:15:55.271914 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 12:15:55.395606 master-0 kubenswrapper[29612]: I0319 12:15:55.395482 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 12:15:55.456010 master-0 kubenswrapper[29612]: I0319 12:15:55.455945 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 19 12:15:55.458601 master-0 kubenswrapper[29612]: I0319 12:15:55.458560 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 19 12:15:55.594222 master-0 kubenswrapper[29612]: I0319 12:15:55.594165 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 12:15:55.687595 master-0 kubenswrapper[29612]: I0319 12:15:55.687477 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 12:15:55.789477 master-0 kubenswrapper[29612]: I0319 12:15:55.789400 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 12:15:55.801308 master-0 kubenswrapper[29612]: I0319 12:15:55.801228 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 12:15:56.110520 master-0 kubenswrapper[29612]: I0319 12:15:56.110451 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 12:15:56.233238 master-0 kubenswrapper[29612]: I0319 12:15:56.233156 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 12:15:56.365068 master-0 kubenswrapper[29612]: I0319 12:15:56.364921 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 19 12:15:56.390519 master-0 kubenswrapper[29612]: I0319 12:15:56.390443 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-7v2tf"
Mar 19 12:15:56.440993 master-0 kubenswrapper[29612]: I0319 12:15:56.440918 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 12:15:56.565804 master-0 kubenswrapper[29612]: I0319 12:15:56.565726 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 19 12:15:56.641091 master-0 kubenswrapper[29612]: I0319 12:15:56.641041 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-8zr85"
Mar 19 12:15:56.681758 master-0 kubenswrapper[29612]: I0319 12:15:56.681710 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_ebbfbf2b56df0323ba118d68bfdad8b9/startup-monitor/0.log"
Mar 19 12:15:56.681942 master-0 kubenswrapper[29612]: I0319 12:15:56.681795 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:56.769182 master-0 kubenswrapper[29612]: I0319 12:15:56.769129 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 12:15:56.769655 master-0 kubenswrapper[29612]: I0319 12:15:56.769190 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 12:15:56.769655 master-0 kubenswrapper[29612]: I0319 12:15:56.769215 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log" (OuterVolumeSpecName: "var-log") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:15:56.769655 master-0 kubenswrapper[29612]: I0319 12:15:56.769246 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 12:15:56.769655 master-0 kubenswrapper[29612]: I0319 12:15:56.769296 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:15:56.769655 master-0 kubenswrapper[29612]: I0319 12:15:56.769319 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 12:15:56.769655 master-0 kubenswrapper[29612]: I0319 12:15:56.769340 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") pod \"ebbfbf2b56df0323ba118d68bfdad8b9\" (UID: \"ebbfbf2b56df0323ba118d68bfdad8b9\") "
Mar 19 12:15:56.769655 master-0 kubenswrapper[29612]: I0319 12:15:56.769345 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests" (OuterVolumeSpecName: "manifests") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:15:56.769655 master-0 kubenswrapper[29612]: I0319 12:15:56.769444 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock" (OuterVolumeSpecName: "var-lock") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:15:56.769992 master-0 kubenswrapper[29612]: I0319 12:15:56.769965 29612 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-log\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:56.770027 master-0 kubenswrapper[29612]: I0319 12:15:56.769991 29612 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:56.770027 master-0 kubenswrapper[29612]: I0319 12:15:56.770007 29612 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-manifests\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:56.770027 master-0 kubenswrapper[29612]: I0319 12:15:56.770021 29612 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:56.776764 master-0 kubenswrapper[29612]: I0319 12:15:56.776713 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "ebbfbf2b56df0323ba118d68bfdad8b9" (UID: "ebbfbf2b56df0323ba118d68bfdad8b9"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:15:56.871215 master-0 kubenswrapper[29612]: I0319 12:15:56.871156 29612 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebbfbf2b56df0323ba118d68bfdad8b9-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 19 12:15:56.918771 master-0 kubenswrapper[29612]: I0319 12:15:56.918625 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" path="/var/lib/kubelet/pods/ebbfbf2b56df0323ba118d68bfdad8b9/volumes"
Mar 19 12:15:56.977157 master-0 kubenswrapper[29612]: I0319 12:15:56.977102 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_ebbfbf2b56df0323ba118d68bfdad8b9/startup-monitor/0.log"
Mar 19 12:15:56.977157 master-0 kubenswrapper[29612]: I0319 12:15:56.977153 29612 generic.go:334] "Generic (PLEG): container finished" podID="ebbfbf2b56df0323ba118d68bfdad8b9" containerID="21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d" exitCode=137
Mar 19 12:15:56.977544 master-0 kubenswrapper[29612]: I0319 12:15:56.977195 29612 scope.go:117] "RemoveContainer" containerID="21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d"
Mar 19 12:15:56.977544 master-0 kubenswrapper[29612]: I0319 12:15:56.977292 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 19 12:15:56.992807 master-0 kubenswrapper[29612]: I0319 12:15:56.992676 29612 scope.go:117] "RemoveContainer" containerID="21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d"
Mar 19 12:15:56.993021 master-0 kubenswrapper[29612]: E0319 12:15:56.992976 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d\": container with ID starting with 21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d not found: ID does not exist" containerID="21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d"
Mar 19 12:15:56.993021 master-0 kubenswrapper[29612]: I0319 12:15:56.993004 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d"} err="failed to get container status \"21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d\": rpc error: code = NotFound desc = could not find container \"21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d\": container with ID starting with 21724870021cf363e28ec2f0c807cb92d03bc6801994408d6ac7de9be8ae3a5d not found: ID does not exist"
Mar 19 12:15:57.034652 master-0 kubenswrapper[29612]: I0319 12:15:57.034574 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 12:15:59.427726 master-0 kubenswrapper[29612]: I0319 12:15:59.427662 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66759f68dd-6zgw8"
Mar 19 12:15:59.431373 master-0 kubenswrapper[29612]: I0319 12:15:59.431340 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66759f68dd-6zgw8"
Mar 19 12:15:59.538423 master-0 kubenswrapper[29612]: I0319 12:15:59.538380 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bfc49dbf9-mppbq"]
Mar 19 12:16:24.580865 master-0 kubenswrapper[29612]: I0319 12:16:24.580696 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-bfc49dbf9-mppbq" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" containerID="cri-o://970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40" gracePeriod=15
Mar 19 12:16:25.049721 master-0 kubenswrapper[29612]: I0319 12:16:25.049670 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bfc49dbf9-mppbq_9059b576-a287-4e13-9ae3-bb6ff70bc5b9/console/0.log"
Mar 19 12:16:25.049929 master-0 kubenswrapper[29612]: I0319 12:16:25.049761 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bfc49dbf9-mppbq"
Mar 19 12:16:25.127151 master-0 kubenswrapper[29612]: I0319 12:16:25.127090 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-config\") pod \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") "
Mar 19 12:16:25.127151 master-0 kubenswrapper[29612]: I0319 12:16:25.127162 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-service-ca\") pod \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") "
Mar 19 12:16:25.127435 master-0 kubenswrapper[29612]: I0319 12:16:25.127188 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw9p9\" (UniqueName: \"kubernetes.io/projected/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-kube-api-access-kw9p9\") pod \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") "
Mar 19 12:16:25.127435 master-0 kubenswrapper[29612]: I0319 12:16:25.127229 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-serving-cert\") pod \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") "
Mar 19 12:16:25.127435 master-0 kubenswrapper[29612]: I0319 12:16:25.127251 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-trusted-ca-bundle\") pod \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") "
Mar 19 12:16:25.127435 master-0 kubenswrapper[29612]: I0319 12:16:25.127287 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-oauth-config\") pod \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") "
Mar 19 12:16:25.127435 master-0 kubenswrapper[29612]: I0319 12:16:25.127341 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-oauth-serving-cert\") pod \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\" (UID: \"9059b576-a287-4e13-9ae3-bb6ff70bc5b9\") "
Mar 19 12:16:25.128154 master-0 kubenswrapper[29612]: I0319 12:16:25.128057 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9059b576-a287-4e13-9ae3-bb6ff70bc5b9" (UID: "9059b576-a287-4e13-9ae3-bb6ff70bc5b9"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:16:25.128386 master-0 kubenswrapper[29612]: I0319 12:16:25.128360 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-config" (OuterVolumeSpecName: "console-config") pod "9059b576-a287-4e13-9ae3-bb6ff70bc5b9" (UID: "9059b576-a287-4e13-9ae3-bb6ff70bc5b9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:16:25.129396 master-0 kubenswrapper[29612]: I0319 12:16:25.129354 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-service-ca" (OuterVolumeSpecName: "service-ca") pod "9059b576-a287-4e13-9ae3-bb6ff70bc5b9" (UID: "9059b576-a287-4e13-9ae3-bb6ff70bc5b9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:16:25.129476 master-0 kubenswrapper[29612]: I0319 12:16:25.128861 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9059b576-a287-4e13-9ae3-bb6ff70bc5b9" (UID: "9059b576-a287-4e13-9ae3-bb6ff70bc5b9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:16:25.132737 master-0 kubenswrapper[29612]: I0319 12:16:25.132689 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9059b576-a287-4e13-9ae3-bb6ff70bc5b9" (UID: "9059b576-a287-4e13-9ae3-bb6ff70bc5b9"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:16:25.132908 master-0 kubenswrapper[29612]: I0319 12:16:25.132812 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9059b576-a287-4e13-9ae3-bb6ff70bc5b9" (UID: "9059b576-a287-4e13-9ae3-bb6ff70bc5b9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:16:25.133136 master-0 kubenswrapper[29612]: I0319 12:16:25.133047 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-kube-api-access-kw9p9" (OuterVolumeSpecName: "kube-api-access-kw9p9") pod "9059b576-a287-4e13-9ae3-bb6ff70bc5b9" (UID: "9059b576-a287-4e13-9ae3-bb6ff70bc5b9"). InnerVolumeSpecName "kube-api-access-kw9p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:16:25.199959 master-0 kubenswrapper[29612]: I0319 12:16:25.199916 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bfc49dbf9-mppbq_9059b576-a287-4e13-9ae3-bb6ff70bc5b9/console/0.log" Mar 19 12:16:25.200279 master-0 kubenswrapper[29612]: I0319 12:16:25.200249 29612 generic.go:334] "Generic (PLEG): container finished" podID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerID="970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40" exitCode=2 Mar 19 12:16:25.200372 master-0 kubenswrapper[29612]: I0319 12:16:25.200316 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfc49dbf9-mppbq" event={"ID":"9059b576-a287-4e13-9ae3-bb6ff70bc5b9","Type":"ContainerDied","Data":"970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40"} Mar 19 12:16:25.200506 master-0 kubenswrapper[29612]: I0319 12:16:25.200483 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-bfc49dbf9-mppbq" event={"ID":"9059b576-a287-4e13-9ae3-bb6ff70bc5b9","Type":"ContainerDied","Data":"8cd7f53bd5210c62b82fd304e867fc2e55c645afc3b1f06b30acfcf2ae75aacd"} Mar 19 12:16:25.200654 master-0 kubenswrapper[29612]: I0319 12:16:25.200616 29612 scope.go:117] "RemoveContainer" containerID="970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40" Mar 19 12:16:25.200825 master-0 kubenswrapper[29612]: I0319 12:16:25.200413 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bfc49dbf9-mppbq" Mar 19 12:16:25.218021 master-0 kubenswrapper[29612]: I0319 12:16:25.217764 29612 scope.go:117] "RemoveContainer" containerID="970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40" Mar 19 12:16:25.218825 master-0 kubenswrapper[29612]: E0319 12:16:25.218780 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40\": container with ID starting with 970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40 not found: ID does not exist" containerID="970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40" Mar 19 12:16:25.218913 master-0 kubenswrapper[29612]: I0319 12:16:25.218824 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40"} err="failed to get container status \"970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40\": rpc error: code = NotFound desc = could not find container \"970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40\": container with ID starting with 970785ae894c683759230cc0e8ba5635959f73c72c7b18f27f9913878acd0f40 not found: ID does not exist" Mar 19 12:16:25.230211 master-0 kubenswrapper[29612]: I0319 12:16:25.230160 29612 reconciler_common.go:293] "Volume 
detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:16:25.230211 master-0 kubenswrapper[29612]: I0319 12:16:25.230203 29612 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 19 12:16:25.230211 master-0 kubenswrapper[29612]: I0319 12:16:25.230215 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw9p9\" (UniqueName: \"kubernetes.io/projected/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-kube-api-access-kw9p9\") on node \"master-0\" DevicePath \"\"" Mar 19 12:16:25.230379 master-0 kubenswrapper[29612]: I0319 12:16:25.230228 29612 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:16:25.230379 master-0 kubenswrapper[29612]: I0319 12:16:25.230239 29612 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:16:25.230379 master-0 kubenswrapper[29612]: I0319 12:16:25.230248 29612 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:16:25.230379 master-0 kubenswrapper[29612]: I0319 12:16:25.230257 29612 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9059b576-a287-4e13-9ae3-bb6ff70bc5b9-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:16:25.250890 master-0 kubenswrapper[29612]: I0319 
12:16:25.250777 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bfc49dbf9-mppbq"] Mar 19 12:16:25.256077 master-0 kubenswrapper[29612]: I0319 12:16:25.256025 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bfc49dbf9-mppbq"] Mar 19 12:16:26.916779 master-0 kubenswrapper[29612]: I0319 12:16:26.916692 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" path="/var/lib/kubelet/pods/9059b576-a287-4e13-9ae3-bb6ff70bc5b9/volumes" Mar 19 12:17:33.298842 master-0 kubenswrapper[29612]: I0319 12:17:33.298778 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: E0319 12:17:33.299062 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" containerName="metrics-server" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: I0319 12:17:33.299076 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" containerName="metrics-server" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: E0319 12:17:33.299086 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" containerName="installer" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: I0319 12:17:33.299092 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" containerName="installer" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: E0319 12:17:33.299104 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: I0319 12:17:33.299114 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" 
containerName="console" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: E0319 12:17:33.299134 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: I0319 12:17:33.299141 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: I0319 12:17:33.299279 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ac8d080-b6a9-45f0-97ba-d6fa93e58c5c" containerName="metrics-server" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: I0319 12:17:33.299299 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe27a308-dd0c-4e7a-9d49-a4dc112a80bf" containerName="installer" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: I0319 12:17:33.299315 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="9059b576-a287-4e13-9ae3-bb6ff70bc5b9" containerName="console" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: I0319 12:17:33.299324 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbfbf2b56df0323ba118d68bfdad8b9" containerName="startup-monitor" Mar 19 12:17:33.299891 master-0 kubenswrapper[29612]: I0319 12:17:33.299758 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:33.302051 master-0 kubenswrapper[29612]: I0319 12:17:33.302024 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-w4vjc" Mar 19 12:17:33.302252 master-0 kubenswrapper[29612]: I0319 12:17:33.302023 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 12:17:33.381938 master-0 kubenswrapper[29612]: I0319 12:17:33.381876 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 19 12:17:33.462560 master-0 kubenswrapper[29612]: I0319 12:17:33.462494 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:33.462826 master-0 kubenswrapper[29612]: I0319 12:17:33.462598 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-var-lock\") pod \"installer-5-master-0\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:33.462826 master-0 kubenswrapper[29612]: I0319 12:17:33.462664 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:33.563698 master-0 
kubenswrapper[29612]: I0319 12:17:33.563558 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:33.563698 master-0 kubenswrapper[29612]: I0319 12:17:33.563648 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-var-lock\") pod \"installer-5-master-0\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:33.564045 master-0 kubenswrapper[29612]: I0319 12:17:33.563718 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-var-lock\") pod \"installer-5-master-0\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:33.564045 master-0 kubenswrapper[29612]: I0319 12:17:33.563753 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:33.564045 master-0 kubenswrapper[29612]: I0319 12:17:33.563806 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:33.609381 
master-0 kubenswrapper[29612]: I0319 12:17:33.609323 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kube-api-access\") pod \"installer-5-master-0\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:33.617392 master-0 kubenswrapper[29612]: I0319 12:17:33.617325 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:17:34.401217 master-0 kubenswrapper[29612]: I0319 12:17:34.401169 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Mar 19 12:17:34.771149 master-0 kubenswrapper[29612]: I0319 12:17:34.771089 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4","Type":"ContainerStarted","Data":"33211ba261f17f5ba477c0d2f49abc962f46a9810aab252d78e89596b59ebe8d"} Mar 19 12:17:35.781314 master-0 kubenswrapper[29612]: I0319 12:17:35.781153 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4","Type":"ContainerStarted","Data":"053d374d3c02e9c9db6908f819f55714d3942780d1e058e52526e19d273a228f"} Mar 19 12:17:35.843470 master-0 kubenswrapper[29612]: I0319 12:17:35.843359 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-5-master-0" podStartSLOduration=3.843335952 podStartE2EDuration="3.843335952s" podCreationTimestamp="2026-03-19 12:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:17:35.83862891 +0000 UTC m=+463.152797941" 
watchObservedRunningTime="2026-03-19 12:17:35.843335952 +0000 UTC m=+463.157504973" Mar 19 12:18:04.293275 master-0 kubenswrapper[29612]: I0319 12:18:04.293182 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-zm4xg"] Mar 19 12:18:04.294476 master-0 kubenswrapper[29612]: I0319 12:18:04.294312 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:04.296683 master-0 kubenswrapper[29612]: I0319 12:18:04.296624 29612 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Mar 19 12:18:04.296910 master-0 kubenswrapper[29612]: I0319 12:18:04.296880 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Mar 19 12:18:04.296910 master-0 kubenswrapper[29612]: I0319 12:18:04.296891 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Mar 19 12:18:04.297140 master-0 kubenswrapper[29612]: I0319 12:18:04.297101 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Mar 19 12:18:04.307567 master-0 kubenswrapper[29612]: I0319 12:18:04.307501 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-zm4xg"] Mar 19 12:18:04.402134 master-0 kubenswrapper[29612]: I0319 12:18:04.402052 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/c3a5a568-eb2b-4a52-a621-c73f895a1e54-sushy-emulator-config\") pod \"sushy-emulator-59477995f9-zm4xg\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") " pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:04.402134 master-0 kubenswrapper[29612]: I0319 12:18:04.402121 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/c3a5a568-eb2b-4a52-a621-c73f895a1e54-os-client-config\") pod \"sushy-emulator-59477995f9-zm4xg\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") " pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:04.402393 master-0 kubenswrapper[29612]: I0319 12:18:04.402219 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27wnq\" (UniqueName: \"kubernetes.io/projected/c3a5a568-eb2b-4a52-a621-c73f895a1e54-kube-api-access-27wnq\") pod \"sushy-emulator-59477995f9-zm4xg\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") " pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:04.504054 master-0 kubenswrapper[29612]: I0319 12:18:04.503982 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/c3a5a568-eb2b-4a52-a621-c73f895a1e54-sushy-emulator-config\") pod \"sushy-emulator-59477995f9-zm4xg\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") " pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:04.504054 master-0 kubenswrapper[29612]: I0319 12:18:04.504065 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/c3a5a568-eb2b-4a52-a621-c73f895a1e54-os-client-config\") pod \"sushy-emulator-59477995f9-zm4xg\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") " pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:04.504429 master-0 kubenswrapper[29612]: I0319 12:18:04.504113 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27wnq\" (UniqueName: \"kubernetes.io/projected/c3a5a568-eb2b-4a52-a621-c73f895a1e54-kube-api-access-27wnq\") pod \"sushy-emulator-59477995f9-zm4xg\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") " 
pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:04.505612 master-0 kubenswrapper[29612]: I0319 12:18:04.505570 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/c3a5a568-eb2b-4a52-a621-c73f895a1e54-sushy-emulator-config\") pod \"sushy-emulator-59477995f9-zm4xg\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") " pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:04.509094 master-0 kubenswrapper[29612]: I0319 12:18:04.509064 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/c3a5a568-eb2b-4a52-a621-c73f895a1e54-os-client-config\") pod \"sushy-emulator-59477995f9-zm4xg\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") " pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:04.524999 master-0 kubenswrapper[29612]: I0319 12:18:04.524934 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27wnq\" (UniqueName: \"kubernetes.io/projected/c3a5a568-eb2b-4a52-a621-c73f895a1e54-kube-api-access-27wnq\") pod \"sushy-emulator-59477995f9-zm4xg\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") " pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:04.630680 master-0 kubenswrapper[29612]: I0319 12:18:04.630477 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:05.088048 master-0 kubenswrapper[29612]: I0319 12:18:05.086904 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-zm4xg"] Mar 19 12:18:05.093232 master-0 kubenswrapper[29612]: W0319 12:18:05.093164 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3a5a568_eb2b_4a52_a621_c73f895a1e54.slice/crio-e638bc223b528f8645cd38de14595384736b37ea4db7dd4b2b1ff3174ce0bf53 WatchSource:0}: Error finding container e638bc223b528f8645cd38de14595384736b37ea4db7dd4b2b1ff3174ce0bf53: Status 404 returned error can't find the container with id e638bc223b528f8645cd38de14595384736b37ea4db7dd4b2b1ff3174ce0bf53 Mar 19 12:18:05.095325 master-0 kubenswrapper[29612]: I0319 12:18:05.095305 29612 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 12:18:06.069750 master-0 kubenswrapper[29612]: I0319 12:18:06.069405 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" event={"ID":"c3a5a568-eb2b-4a52-a621-c73f895a1e54","Type":"ContainerStarted","Data":"e638bc223b528f8645cd38de14595384736b37ea4db7dd4b2b1ff3174ce0bf53"} Mar 19 12:18:07.266876 master-0 kubenswrapper[29612]: I0319 12:18:07.266515 29612 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:18:07.267531 master-0 kubenswrapper[29612]: I0319 12:18:07.266938 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://61ee084e983f277942e90341d08d9836dacf57a63fdca8d267638b871e818f19" gracePeriod=30 Mar 19 12:18:07.267531 
master-0 kubenswrapper[29612]: I0319 12:18:07.267242 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" containerID="cri-o://56f3d6737f59d56669808e7d85a508e574b8afc56126aef4f40a1770b36e3019" gracePeriod=30 Mar 19 12:18:07.267531 master-0 kubenswrapper[29612]: I0319 12:18:07.267291 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://1044262ff415484653556cf7ff298d5c42e6df054a66a5ea6fc06f00ef84999d" gracePeriod=30 Mar 19 12:18:07.267531 master-0 kubenswrapper[29612]: I0319 12:18:07.267330 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" containerID="cri-o://85ed437129c34727828994007c76166519f0d49369f91dbf9d5e4dc1693434d5" gracePeriod=30 Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: I0319 12:18:07.268569 29612 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: E0319 12:18:07.268919 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: I0319 12:18:07.268932 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: E0319 12:18:07.268952 29612 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: I0319 12:18:07.268959 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: E0319 12:18:07.268975 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: I0319 12:18:07.268983 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: E0319 12:18:07.268996 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-recovery-controller" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: I0319 12:18:07.269006 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-recovery-controller" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: E0319 12:18:07.269018 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: I0319 12:18:07.269024 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: E0319 12:18:07.269037 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-cert-syncer" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: I0319 12:18:07.269043 29612 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-cert-syncer" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: E0319 12:18:07.269056 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-cert-syncer" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: I0319 12:18:07.269062 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-cert-syncer" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: E0319 12:18:07.269072 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: I0319 12:18:07.269077 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269198 master-0 kubenswrapper[29612]: I0319 12:18:07.269201 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269935 master-0 kubenswrapper[29612]: I0319 12:18:07.269219 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" Mar 19 12:18:07.269935 master-0 kubenswrapper[29612]: I0319 12:18:07.269229 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-cert-syncer" Mar 19 12:18:07.269935 master-0 kubenswrapper[29612]: I0319 12:18:07.269241 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-recovery-controller" Mar 19 12:18:07.269935 master-0 kubenswrapper[29612]: I0319 12:18:07.269253 29612 
memory_manager.go:354] "RemoveStaleState removing state" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" Mar 19 12:18:07.269935 master-0 kubenswrapper[29612]: I0319 12:18:07.269265 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager-cert-syncer" Mar 19 12:18:07.269935 master-0 kubenswrapper[29612]: I0319 12:18:07.269276 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269935 master-0 kubenswrapper[29612]: I0319 12:18:07.269284 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.269935 master-0 kubenswrapper[29612]: E0319 12:18:07.269404 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" Mar 19 12:18:07.269935 master-0 kubenswrapper[29612]: I0319 12:18:07.269411 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="cluster-policy-controller" Mar 19 12:18:07.269935 master-0 kubenswrapper[29612]: I0319 12:18:07.269527 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerName="kube-controller-manager" Mar 19 12:18:07.349818 master-0 kubenswrapper[29612]: I0319 12:18:07.349751 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/54d0ecc3ba823ba5eae25f8a4ceed69a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"54d0ecc3ba823ba5eae25f8a4ceed69a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:07.350088 master-0 kubenswrapper[29612]: I0319 12:18:07.349829 29612 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/54d0ecc3ba823ba5eae25f8a4ceed69a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"54d0ecc3ba823ba5eae25f8a4ceed69a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:07.451407 master-0 kubenswrapper[29612]: I0319 12:18:07.451324 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/54d0ecc3ba823ba5eae25f8a4ceed69a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"54d0ecc3ba823ba5eae25f8a4ceed69a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:07.451407 master-0 kubenswrapper[29612]: I0319 12:18:07.451410 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/54d0ecc3ba823ba5eae25f8a4ceed69a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"54d0ecc3ba823ba5eae25f8a4ceed69a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:07.451764 master-0 kubenswrapper[29612]: I0319 12:18:07.451541 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/54d0ecc3ba823ba5eae25f8a4ceed69a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"54d0ecc3ba823ba5eae25f8a4ceed69a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:07.451764 master-0 kubenswrapper[29612]: I0319 12:18:07.451596 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/54d0ecc3ba823ba5eae25f8a4ceed69a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"54d0ecc3ba823ba5eae25f8a4ceed69a\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:08.085991 master-0 kubenswrapper[29612]: I0319 12:18:08.085936 29612 generic.go:334] "Generic (PLEG): container finished" podID="d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4" containerID="053d374d3c02e9c9db6908f819f55714d3942780d1e058e52526e19d273a228f" exitCode=0 Mar 19 12:18:08.086364 master-0 kubenswrapper[29612]: I0319 12:18:08.086016 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4","Type":"ContainerDied","Data":"053d374d3c02e9c9db6908f819f55714d3942780d1e058e52526e19d273a228f"} Mar 19 12:18:08.088965 master-0 kubenswrapper[29612]: I0319 12:18:08.088881 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager/2.log" Mar 19 12:18:08.089868 master-0 kubenswrapper[29612]: I0319 12:18:08.089832 29612 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="91ced7d9a1e8a2019ac2b6a262f0d19e" podUID="54d0ecc3ba823ba5eae25f8a4ceed69a" Mar 19 12:18:08.089941 master-0 kubenswrapper[29612]: I0319 12:18:08.089859 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/1.log" Mar 19 12:18:08.090750 master-0 kubenswrapper[29612]: I0319 12:18:08.090722 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log" Mar 19 12:18:08.091682 master-0 kubenswrapper[29612]: I0319 12:18:08.091627 29612 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/0.log" Mar 19 12:18:08.091749 master-0 kubenswrapper[29612]: I0319 12:18:08.091707 29612 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="56f3d6737f59d56669808e7d85a508e574b8afc56126aef4f40a1770b36e3019" exitCode=0 Mar 19 12:18:08.091784 master-0 kubenswrapper[29612]: I0319 12:18:08.091753 29612 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="1044262ff415484653556cf7ff298d5c42e6df054a66a5ea6fc06f00ef84999d" exitCode=2 Mar 19 12:18:08.091784 master-0 kubenswrapper[29612]: I0319 12:18:08.091764 29612 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="85ed437129c34727828994007c76166519f0d49369f91dbf9d5e4dc1693434d5" exitCode=0 Mar 19 12:18:08.091784 master-0 kubenswrapper[29612]: I0319 12:18:08.091771 29612 generic.go:334] "Generic (PLEG): container finished" podID="91ced7d9a1e8a2019ac2b6a262f0d19e" containerID="61ee084e983f277942e90341d08d9836dacf57a63fdca8d267638b871e818f19" exitCode=0 Mar 19 12:18:08.091868 master-0 kubenswrapper[29612]: I0319 12:18:08.091837 29612 scope.go:117] "RemoveContainer" containerID="617ff00631713680d2e186b6f3e8d34195bcb762d9f9e3d35f2fc08ecebe5a13" Mar 19 12:18:08.290913 master-0 kubenswrapper[29612]: I0319 12:18:08.290853 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/1.log" Mar 19 12:18:08.291977 master-0 kubenswrapper[29612]: I0319 12:18:08.291660 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log" Mar 19 
12:18:08.292770 master-0 kubenswrapper[29612]: I0319 12:18:08.292742 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/0.log" Mar 19 12:18:08.292847 master-0 kubenswrapper[29612]: I0319 12:18:08.292834 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:08.296439 master-0 kubenswrapper[29612]: I0319 12:18:08.296328 29612 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="91ced7d9a1e8a2019ac2b6a262f0d19e" podUID="54d0ecc3ba823ba5eae25f8a4ceed69a" Mar 19 12:18:08.407428 master-0 kubenswrapper[29612]: I0319 12:18:08.406861 29612 scope.go:117] "RemoveContainer" containerID="79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b" Mar 19 12:18:08.468484 master-0 kubenswrapper[29612]: I0319 12:18:08.468436 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-cert-dir\") pod \"91ced7d9a1e8a2019ac2b6a262f0d19e\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " Mar 19 12:18:08.468761 master-0 kubenswrapper[29612]: I0319 12:18:08.468526 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-resource-dir\") pod \"91ced7d9a1e8a2019ac2b6a262f0d19e\" (UID: \"91ced7d9a1e8a2019ac2b6a262f0d19e\") " Mar 19 12:18:08.468953 master-0 kubenswrapper[29612]: I0319 12:18:08.468879 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-cert-dir" (OuterVolumeSpecName: "cert-dir") pod 
"91ced7d9a1e8a2019ac2b6a262f0d19e" (UID: "91ced7d9a1e8a2019ac2b6a262f0d19e"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:18:08.469027 master-0 kubenswrapper[29612]: I0319 12:18:08.468971 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "91ced7d9a1e8a2019ac2b6a262f0d19e" (UID: "91ced7d9a1e8a2019ac2b6a262f0d19e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:18:08.469154 master-0 kubenswrapper[29612]: I0319 12:18:08.469128 29612 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:18:08.469154 master-0 kubenswrapper[29612]: I0319 12:18:08.469152 29612 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/91ced7d9a1e8a2019ac2b6a262f0d19e-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:18:08.917196 master-0 kubenswrapper[29612]: I0319 12:18:08.917136 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91ced7d9a1e8a2019ac2b6a262f0d19e" path="/var/lib/kubelet/pods/91ced7d9a1e8a2019ac2b6a262f0d19e/volumes" Mar 19 12:18:09.101936 master-0 kubenswrapper[29612]: I0319 12:18:09.101872 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/kube-controller-manager-cert-syncer/1.log" Mar 19 12:18:09.102731 master-0 kubenswrapper[29612]: I0319 12:18:09.102710 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_91ced7d9a1e8a2019ac2b6a262f0d19e/cluster-policy-controller/4.log" Mar 19 12:18:09.103524 master-0 
kubenswrapper[29612]: I0319 12:18:09.103498 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:09.108506 master-0 kubenswrapper[29612]: I0319 12:18:09.108464 29612 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="91ced7d9a1e8a2019ac2b6a262f0d19e" podUID="54d0ecc3ba823ba5eae25f8a4ceed69a" Mar 19 12:18:11.041516 master-0 kubenswrapper[29612]: I0319 12:18:11.041477 29612 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" Mar 19 12:18:11.090240 master-0 kubenswrapper[29612]: I0319 12:18:11.090111 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:18:11.098072 master-0 kubenswrapper[29612]: I0319 12:18:11.098033 29612 scope.go:117] "RemoveContainer" containerID="56f3d6737f59d56669808e7d85a508e574b8afc56126aef4f40a1770b36e3019" Mar 19 12:18:11.134337 master-0 kubenswrapper[29612]: I0319 12:18:11.133910 29612 scope.go:117] "RemoveContainer" containerID="1044262ff415484653556cf7ff298d5c42e6df054a66a5ea6fc06f00ef84999d" Mar 19 12:18:11.138977 master-0 kubenswrapper[29612]: I0319 12:18:11.138900 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4","Type":"ContainerDied","Data":"33211ba261f17f5ba477c0d2f49abc962f46a9810aab252d78e89596b59ebe8d"} Mar 19 12:18:11.138977 master-0 kubenswrapper[29612]: I0319 12:18:11.138966 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33211ba261f17f5ba477c0d2f49abc962f46a9810aab252d78e89596b59ebe8d" Mar 19 12:18:11.139182 master-0 kubenswrapper[29612]: I0319 12:18:11.138922 29612 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 19 12:18:11.149685 master-0 kubenswrapper[29612]: I0319 12:18:11.149605 29612 scope.go:117] "RemoveContainer" containerID="85ed437129c34727828994007c76166519f0d49369f91dbf9d5e4dc1693434d5" Mar 19 12:18:11.167700 master-0 kubenswrapper[29612]: I0319 12:18:11.167620 29612 scope.go:117] "RemoveContainer" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" Mar 19 12:18:11.168396 master-0 kubenswrapper[29612]: E0319 12:18:11.168311 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9\": container with ID starting with f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9 not found: ID does not exist" containerID="f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9" Mar 19 12:18:11.168470 master-0 kubenswrapper[29612]: I0319 12:18:11.168424 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9"} err="failed to get container status \"f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9\": rpc error: code = NotFound desc = could not find container \"f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9\": container with ID starting with f04f5379dbccf525e00cb5b1107f1853912f583f42fad5f651f5c56a205082b9 not found: ID does not exist" Mar 19 12:18:11.168529 master-0 kubenswrapper[29612]: I0319 12:18:11.168484 29612 scope.go:117] "RemoveContainer" containerID="61ee084e983f277942e90341d08d9836dacf57a63fdca8d267638b871e818f19" Mar 19 12:18:11.192279 master-0 kubenswrapper[29612]: I0319 12:18:11.192216 29612 scope.go:117] "RemoveContainer" containerID="79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b" Mar 19 
12:18:11.192837 master-0 kubenswrapper[29612]: E0319 12:18:11.192792 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b\": container with ID starting with 79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b not found: ID does not exist" containerID="79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b" Mar 19 12:18:11.192907 master-0 kubenswrapper[29612]: I0319 12:18:11.192838 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b"} err="failed to get container status \"79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b\": rpc error: code = NotFound desc = could not find container \"79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b\": container with ID starting with 79329c4a02df218538aac901f005c4f4ee6490c1dd50decdc7ba987796d4758b not found: ID does not exist" Mar 19 12:18:11.223024 master-0 kubenswrapper[29612]: I0319 12:18:11.222940 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-var-lock\") pod \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " Mar 19 12:18:11.223024 master-0 kubenswrapper[29612]: I0319 12:18:11.223005 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kubelet-dir\") pod \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " Mar 19 12:18:11.223024 master-0 kubenswrapper[29612]: I0319 12:18:11.223037 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kube-api-access\") pod \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\" (UID: \"d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4\") " Mar 19 12:18:11.223389 master-0 kubenswrapper[29612]: I0319 12:18:11.223067 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-var-lock" (OuterVolumeSpecName: "var-lock") pod "d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4" (UID: "d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:18:11.223389 master-0 kubenswrapper[29612]: I0319 12:18:11.223105 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4" (UID: "d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:18:11.223525 master-0 kubenswrapper[29612]: I0319 12:18:11.223484 29612 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 19 12:18:11.223525 master-0 kubenswrapper[29612]: I0319 12:18:11.223511 29612 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:18:11.226479 master-0 kubenswrapper[29612]: I0319 12:18:11.226437 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4" (UID: "d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:18:11.325522 master-0 kubenswrapper[29612]: I0319 12:18:11.325467 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 19 12:18:12.153034 master-0 kubenswrapper[29612]: I0319 12:18:12.152140 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" event={"ID":"c3a5a568-eb2b-4a52-a621-c73f895a1e54","Type":"ContainerStarted","Data":"680ec5360c00bf7ffb0bd302267c5a0bc607c5dd91ff24db48f5e132e7d2938d"} Mar 19 12:18:12.173378 master-0 kubenswrapper[29612]: I0319 12:18:12.173274 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" podStartSLOduration=2.11402261 podStartE2EDuration="8.173260356s" podCreationTimestamp="2026-03-19 12:18:04 +0000 UTC" firstStartedPulling="2026-03-19 12:18:05.095274747 +0000 UTC m=+492.409443768" lastFinishedPulling="2026-03-19 12:18:11.154512493 +0000 UTC m=+498.468681514" observedRunningTime="2026-03-19 12:18:12.172147525 +0000 UTC m=+499.486316546" watchObservedRunningTime="2026-03-19 12:18:12.173260356 +0000 UTC m=+499.487429377" Mar 19 12:18:14.631476 master-0 kubenswrapper[29612]: I0319 12:18:14.631365 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:14.631476 master-0 kubenswrapper[29612]: I0319 12:18:14.631485 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:14.643653 master-0 kubenswrapper[29612]: I0319 12:18:14.643532 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:15.191515 
master-0 kubenswrapper[29612]: I0319 12:18:15.191438 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" Mar 19 12:18:18.906228 master-0 kubenswrapper[29612]: I0319 12:18:18.906136 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:18.930040 master-0 kubenswrapper[29612]: I0319 12:18:18.929964 29612 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="bcbac175-57b6-41ca-a0be-475d81e87835" Mar 19 12:18:18.930040 master-0 kubenswrapper[29612]: I0319 12:18:18.930024 29612 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="bcbac175-57b6-41ca-a0be-475d81e87835" Mar 19 12:18:18.949411 master-0 kubenswrapper[29612]: I0319 12:18:18.949301 29612 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:18.962387 master-0 kubenswrapper[29612]: I0319 12:18:18.960518 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:18:18.968822 master-0 kubenswrapper[29612]: I0319 12:18:18.968747 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:18.983951 master-0 kubenswrapper[29612]: I0319 12:18:18.983869 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:18:18.991327 master-0 kubenswrapper[29612]: I0319 12:18:18.991246 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 19 12:18:19.001903 master-0 kubenswrapper[29612]: W0319 12:18:19.001815 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d0ecc3ba823ba5eae25f8a4ceed69a.slice/crio-fbc5017c0cfe258de2257f2eb49bb88a524032f6ed5dbdc21eb9ae9c61165b0b WatchSource:0}: Error finding container fbc5017c0cfe258de2257f2eb49bb88a524032f6ed5dbdc21eb9ae9c61165b0b: Status 404 returned error can't find the container with id fbc5017c0cfe258de2257f2eb49bb88a524032f6ed5dbdc21eb9ae9c61165b0b Mar 19 12:18:19.256718 master-0 kubenswrapper[29612]: I0319 12:18:19.255687 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"54d0ecc3ba823ba5eae25f8a4ceed69a","Type":"ContainerStarted","Data":"0c14cf854a414db5ec01a917a619cdb9188306493b1011194afc42d6a3cecb97"} Mar 19 12:18:19.256718 master-0 kubenswrapper[29612]: I0319 12:18:19.255746 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"54d0ecc3ba823ba5eae25f8a4ceed69a","Type":"ContainerStarted","Data":"fbc5017c0cfe258de2257f2eb49bb88a524032f6ed5dbdc21eb9ae9c61165b0b"} Mar 19 12:18:20.268402 master-0 kubenswrapper[29612]: I0319 12:18:20.268257 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"54d0ecc3ba823ba5eae25f8a4ceed69a","Type":"ContainerStarted","Data":"088957ce2c0e70ad95a4256751284125f7da10bf1a76f5c43f494a46869257fd"} Mar 19 12:18:20.268402 master-0 kubenswrapper[29612]: I0319 12:18:20.268361 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"54d0ecc3ba823ba5eae25f8a4ceed69a","Type":"ContainerStarted","Data":"fe70a7d0cd69755cc13c3a937725da2f6a98a3783c3536b6032e2a1551e7f037"} Mar 19 12:18:20.268402 master-0 kubenswrapper[29612]: I0319 12:18:20.268379 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"54d0ecc3ba823ba5eae25f8a4ceed69a","Type":"ContainerStarted","Data":"170d7db4eaa1d3fb33a8e2a1e03182efdb4c52d09a874e5879c791fd8ab6d879"} Mar 19 12:18:20.303553 master-0 kubenswrapper[29612]: I0319 12:18:20.303468 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.303449607 podStartE2EDuration="2.303449607s" podCreationTimestamp="2026-03-19 12:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:18:20.299925998 +0000 UTC m=+507.614095039" watchObservedRunningTime="2026-03-19 12:18:20.303449607 +0000 UTC m=+507.617618628" Mar 19 12:18:28.971095 master-0 kubenswrapper[29612]: I0319 12:18:28.970425 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:28.971095 master-0 kubenswrapper[29612]: I0319 12:18:28.970498 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:28.971095 master-0 kubenswrapper[29612]: I0319 12:18:28.970509 29612 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:28.971095 master-0 kubenswrapper[29612]: I0319 12:18:28.970835 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:28.975345 master-0 kubenswrapper[29612]: I0319 12:18:28.975285 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:28.978049 master-0 kubenswrapper[29612]: I0319 12:18:28.977984 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:29.367279 master-0 kubenswrapper[29612]: I0319 12:18:29.367092 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:29.370385 master-0 kubenswrapper[29612]: I0319 12:18:29.370310 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 19 12:18:54.537189 master-0 kubenswrapper[29612]: I0319 12:18:54.537125 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-poller-5898b679c9-rvl7w"] Mar 19 12:18:54.537789 master-0 kubenswrapper[29612]: E0319 12:18:54.537457 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4" containerName="installer" Mar 19 12:18:54.537789 master-0 kubenswrapper[29612]: I0319 12:18:54.537472 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4" containerName="installer" Mar 19 12:18:54.537789 master-0 kubenswrapper[29612]: I0319 12:18:54.537657 29612 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d8174ba3-e20a-4931-b1f9-2eb5ddd78dc4" containerName="installer" Mar 19 12:18:54.538446 master-0 kubenswrapper[29612]: I0319 12:18:54.538422 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" Mar 19 12:18:54.548675 master-0 kubenswrapper[29612]: I0319 12:18:54.548608 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-5898b679c9-rvl7w"] Mar 19 12:18:54.639897 master-0 kubenswrapper[29612]: I0319 12:18:54.639817 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrg2z\" (UniqueName: \"kubernetes.io/projected/a895ff85-6349-4399-9049-3f2cdac4a96e-kube-api-access-jrg2z\") pod \"nova-console-poller-5898b679c9-rvl7w\" (UID: \"a895ff85-6349-4399-9049-3f2cdac4a96e\") " pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" Mar 19 12:18:54.640164 master-0 kubenswrapper[29612]: I0319 12:18:54.640088 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/a895ff85-6349-4399-9049-3f2cdac4a96e-os-client-config\") pod \"nova-console-poller-5898b679c9-rvl7w\" (UID: \"a895ff85-6349-4399-9049-3f2cdac4a96e\") " pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" Mar 19 12:18:54.741034 master-0 kubenswrapper[29612]: I0319 12:18:54.740954 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/a895ff85-6349-4399-9049-3f2cdac4a96e-os-client-config\") pod \"nova-console-poller-5898b679c9-rvl7w\" (UID: \"a895ff85-6349-4399-9049-3f2cdac4a96e\") " pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" Mar 19 12:18:54.741311 master-0 kubenswrapper[29612]: I0319 12:18:54.741240 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrg2z\" (UniqueName: 
\"kubernetes.io/projected/a895ff85-6349-4399-9049-3f2cdac4a96e-kube-api-access-jrg2z\") pod \"nova-console-poller-5898b679c9-rvl7w\" (UID: \"a895ff85-6349-4399-9049-3f2cdac4a96e\") " pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" Mar 19 12:18:54.744743 master-0 kubenswrapper[29612]: I0319 12:18:54.744698 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/a895ff85-6349-4399-9049-3f2cdac4a96e-os-client-config\") pod \"nova-console-poller-5898b679c9-rvl7w\" (UID: \"a895ff85-6349-4399-9049-3f2cdac4a96e\") " pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" Mar 19 12:18:54.757619 master-0 kubenswrapper[29612]: I0319 12:18:54.757558 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrg2z\" (UniqueName: \"kubernetes.io/projected/a895ff85-6349-4399-9049-3f2cdac4a96e-kube-api-access-jrg2z\") pod \"nova-console-poller-5898b679c9-rvl7w\" (UID: \"a895ff85-6349-4399-9049-3f2cdac4a96e\") " pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" Mar 19 12:18:54.852847 master-0 kubenswrapper[29612]: I0319 12:18:54.852705 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" Mar 19 12:18:55.314928 master-0 kubenswrapper[29612]: W0319 12:18:55.314861 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda895ff85_6349_4399_9049_3f2cdac4a96e.slice/crio-61035833696b1187d3e771578f30db972a45b9e6eb6ed55b4dd43918ea8b9217 WatchSource:0}: Error finding container 61035833696b1187d3e771578f30db972a45b9e6eb6ed55b4dd43918ea8b9217: Status 404 returned error can't find the container with id 61035833696b1187d3e771578f30db972a45b9e6eb6ed55b4dd43918ea8b9217 Mar 19 12:18:55.327128 master-0 kubenswrapper[29612]: I0319 12:18:55.326455 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-5898b679c9-rvl7w"] Mar 19 12:18:55.582032 master-0 kubenswrapper[29612]: I0319 12:18:55.581852 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" event={"ID":"a895ff85-6349-4399-9049-3f2cdac4a96e","Type":"ContainerStarted","Data":"61035833696b1187d3e771578f30db972a45b9e6eb6ed55b4dd43918ea8b9217"} Mar 19 12:19:00.621556 master-0 kubenswrapper[29612]: I0319 12:19:00.621384 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" event={"ID":"a895ff85-6349-4399-9049-3f2cdac4a96e","Type":"ContainerStarted","Data":"e7905d847ee840655cac5388f89f91445857cc73d99177fe2e36172542052c57"} Mar 19 12:19:00.621556 master-0 kubenswrapper[29612]: I0319 12:19:00.621454 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" event={"ID":"a895ff85-6349-4399-9049-3f2cdac4a96e","Type":"ContainerStarted","Data":"ba9ad6e753322190b3b9239a76db1e2bfad2bf9e99d741061622c00c794747ed"} Mar 19 12:19:00.643351 master-0 kubenswrapper[29612]: I0319 12:19:00.643261 29612 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="sushy-emulator/nova-console-poller-5898b679c9-rvl7w" podStartSLOduration=1.745462275 podStartE2EDuration="6.643246094s" podCreationTimestamp="2026-03-19 12:18:54 +0000 UTC" firstStartedPulling="2026-03-19 12:18:55.31711748 +0000 UTC m=+542.631286501" lastFinishedPulling="2026-03-19 12:19:00.214901299 +0000 UTC m=+547.529070320" observedRunningTime="2026-03-19 12:19:00.640140646 +0000 UTC m=+547.954309687" watchObservedRunningTime="2026-03-19 12:19:00.643246094 +0000 UTC m=+547.957415115" Mar 19 12:19:26.100544 master-0 kubenswrapper[29612]: I0319 12:19:26.100477 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v"] Mar 19 12:19:26.101959 master-0 kubenswrapper[29612]: I0319 12:19:26.101929 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:26.140602 master-0 kubenswrapper[29612]: I0319 12:19:26.140543 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v"] Mar 19 12:19:26.212468 master-0 kubenswrapper[29612]: I0319 12:19:26.212391 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/290421ca-8167-4eae-b01a-a8830e86c325-nova-console-recordings-pv\") pod \"nova-console-recorder-5dd74b5d59-nn62v\" (UID: \"290421ca-8167-4eae-b01a-a8830e86c325\") " pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:26.212468 master-0 kubenswrapper[29612]: I0319 12:19:26.212484 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8lsm\" (UniqueName: \"kubernetes.io/projected/290421ca-8167-4eae-b01a-a8830e86c325-kube-api-access-b8lsm\") pod \"nova-console-recorder-5dd74b5d59-nn62v\" (UID: \"290421ca-8167-4eae-b01a-a8830e86c325\") " 
pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:26.212869 master-0 kubenswrapper[29612]: I0319 12:19:26.212521 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/290421ca-8167-4eae-b01a-a8830e86c325-os-client-config\") pod \"nova-console-recorder-5dd74b5d59-nn62v\" (UID: \"290421ca-8167-4eae-b01a-a8830e86c325\") " pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:26.314478 master-0 kubenswrapper[29612]: I0319 12:19:26.314388 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8lsm\" (UniqueName: \"kubernetes.io/projected/290421ca-8167-4eae-b01a-a8830e86c325-kube-api-access-b8lsm\") pod \"nova-console-recorder-5dd74b5d59-nn62v\" (UID: \"290421ca-8167-4eae-b01a-a8830e86c325\") " pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:26.314478 master-0 kubenswrapper[29612]: I0319 12:19:26.314486 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/290421ca-8167-4eae-b01a-a8830e86c325-os-client-config\") pod \"nova-console-recorder-5dd74b5d59-nn62v\" (UID: \"290421ca-8167-4eae-b01a-a8830e86c325\") " pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:26.315609 master-0 kubenswrapper[29612]: I0319 12:19:26.314816 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/290421ca-8167-4eae-b01a-a8830e86c325-nova-console-recordings-pv\") pod \"nova-console-recorder-5dd74b5d59-nn62v\" (UID: \"290421ca-8167-4eae-b01a-a8830e86c325\") " pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:26.322754 master-0 kubenswrapper[29612]: I0319 12:19:26.319262 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" 
(UniqueName: \"kubernetes.io/secret/290421ca-8167-4eae-b01a-a8830e86c325-os-client-config\") pod \"nova-console-recorder-5dd74b5d59-nn62v\" (UID: \"290421ca-8167-4eae-b01a-a8830e86c325\") " pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:26.343500 master-0 kubenswrapper[29612]: I0319 12:19:26.343377 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8lsm\" (UniqueName: \"kubernetes.io/projected/290421ca-8167-4eae-b01a-a8830e86c325-kube-api-access-b8lsm\") pod \"nova-console-recorder-5dd74b5d59-nn62v\" (UID: \"290421ca-8167-4eae-b01a-a8830e86c325\") " pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:26.985199 master-0 kubenswrapper[29612]: I0319 12:19:26.985063 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/290421ca-8167-4eae-b01a-a8830e86c325-nova-console-recordings-pv\") pod \"nova-console-recorder-5dd74b5d59-nn62v\" (UID: \"290421ca-8167-4eae-b01a-a8830e86c325\") " pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:27.032176 master-0 kubenswrapper[29612]: I0319 12:19:27.032076 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" Mar 19 12:19:27.456261 master-0 kubenswrapper[29612]: W0319 12:19:27.455864 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod290421ca_8167_4eae_b01a_a8830e86c325.slice/crio-7209bd7e923f60c3c4d342d737718104a681b397184423bfacde00c5370072a4 WatchSource:0}: Error finding container 7209bd7e923f60c3c4d342d737718104a681b397184423bfacde00c5370072a4: Status 404 returned error can't find the container with id 7209bd7e923f60c3c4d342d737718104a681b397184423bfacde00c5370072a4 Mar 19 12:19:27.456773 master-0 kubenswrapper[29612]: I0319 12:19:27.456464 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v"] Mar 19 12:19:27.836718 master-0 kubenswrapper[29612]: I0319 12:19:27.836495 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" event={"ID":"290421ca-8167-4eae-b01a-a8830e86c325","Type":"ContainerStarted","Data":"7209bd7e923f60c3c4d342d737718104a681b397184423bfacde00c5370072a4"} Mar 19 12:19:36.914431 master-0 kubenswrapper[29612]: I0319 12:19:36.914334 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" event={"ID":"290421ca-8167-4eae-b01a-a8830e86c325","Type":"ContainerStarted","Data":"ddf4f8d92a62fe9baccf7ea741bb0711d184fcf883b7a2947bc34f67693ae51a"} Mar 19 12:19:37.915732 master-0 kubenswrapper[29612]: I0319 12:19:37.915597 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" event={"ID":"290421ca-8167-4eae-b01a-a8830e86c325","Type":"ContainerStarted","Data":"c542ff5039da12175ce10b225af4685a78b439db88ac53af5885b30ffd0ae7d2"} Mar 19 12:19:37.938814 master-0 kubenswrapper[29612]: I0319 12:19:37.938694 29612 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="sushy-emulator/nova-console-recorder-5dd74b5d59-nn62v" podStartSLOduration=2.424187929 podStartE2EDuration="11.938658993s" podCreationTimestamp="2026-03-19 12:19:26 +0000 UTC" firstStartedPulling="2026-03-19 12:19:27.457994129 +0000 UTC m=+574.772163160" lastFinishedPulling="2026-03-19 12:19:36.972465203 +0000 UTC m=+584.286634224" observedRunningTime="2026-03-19 12:19:37.932710714 +0000 UTC m=+585.246879735" watchObservedRunningTime="2026-03-19 12:19:37.938658993 +0000 UTC m=+585.252828014" Mar 19 12:20:06.028823 master-0 kubenswrapper[29612]: I0319 12:20:06.028740 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc"] Mar 19 12:20:06.030414 master-0 kubenswrapper[29612]: I0319 12:20:06.030367 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.033297 master-0 kubenswrapper[29612]: I0319 12:20:06.033235 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-4dr9j" Mar 19 12:20:06.048286 master-0 kubenswrapper[29612]: I0319 12:20:06.048222 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc"] Mar 19 12:20:06.108310 master-0 kubenswrapper[29612]: I0319 12:20:06.108253 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.108670 master-0 kubenswrapper[29612]: I0319 12:20:06.108630 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.108836 master-0 kubenswrapper[29612]: I0319 12:20:06.108815 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg7dq\" (UniqueName: \"kubernetes.io/projected/94826a62-065b-4056-9b0e-1aa98e930ef3-kube-api-access-lg7dq\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.210488 master-0 kubenswrapper[29612]: I0319 12:20:06.210403 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.210843 master-0 kubenswrapper[29612]: I0319 12:20:06.210550 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.210843 master-0 kubenswrapper[29612]: I0319 12:20:06.210607 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lg7dq\" (UniqueName: \"kubernetes.io/projected/94826a62-065b-4056-9b0e-1aa98e930ef3-kube-api-access-lg7dq\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.211178 master-0 kubenswrapper[29612]: I0319 12:20:06.211110 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.212761 master-0 kubenswrapper[29612]: I0319 12:20:06.211428 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.234767 master-0 kubenswrapper[29612]: I0319 12:20:06.234688 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg7dq\" (UniqueName: \"kubernetes.io/projected/94826a62-065b-4056-9b0e-1aa98e930ef3-kube-api-access-lg7dq\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.347540 master-0 kubenswrapper[29612]: I0319 12:20:06.347429 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:06.868555 master-0 kubenswrapper[29612]: I0319 12:20:06.865567 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc"] Mar 19 12:20:07.146966 master-0 kubenswrapper[29612]: I0319 12:20:07.146925 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" event={"ID":"94826a62-065b-4056-9b0e-1aa98e930ef3","Type":"ContainerStarted","Data":"459bc131bb2a7b00b31a434c46c131cc8c9792a1a4af79e77a9b9f0393067657"} Mar 19 12:20:07.147581 master-0 kubenswrapper[29612]: I0319 12:20:07.146974 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" event={"ID":"94826a62-065b-4056-9b0e-1aa98e930ef3","Type":"ContainerStarted","Data":"cdfda19cfed51815f38746137861942ff8e2f8e3c4b6b2d2d7ae3a4bf3a52c9a"} Mar 19 12:20:08.167529 master-0 kubenswrapper[29612]: I0319 12:20:08.167437 29612 generic.go:334] "Generic (PLEG): container finished" podID="94826a62-065b-4056-9b0e-1aa98e930ef3" containerID="459bc131bb2a7b00b31a434c46c131cc8c9792a1a4af79e77a9b9f0393067657" exitCode=0 Mar 19 12:20:08.168253 master-0 kubenswrapper[29612]: I0319 12:20:08.167538 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" event={"ID":"94826a62-065b-4056-9b0e-1aa98e930ef3","Type":"ContainerDied","Data":"459bc131bb2a7b00b31a434c46c131cc8c9792a1a4af79e77a9b9f0393067657"} Mar 19 12:20:10.187829 master-0 kubenswrapper[29612]: I0319 12:20:10.187730 29612 generic.go:334] "Generic (PLEG): container finished" podID="94826a62-065b-4056-9b0e-1aa98e930ef3" containerID="f4102e7a2a7e46297b1c9975101ea8257dcd78a1aa9caa152efe7ddc6354b5b7" 
exitCode=0 Mar 19 12:20:10.187829 master-0 kubenswrapper[29612]: I0319 12:20:10.187804 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" event={"ID":"94826a62-065b-4056-9b0e-1aa98e930ef3","Type":"ContainerDied","Data":"f4102e7a2a7e46297b1c9975101ea8257dcd78a1aa9caa152efe7ddc6354b5b7"} Mar 19 12:20:11.199348 master-0 kubenswrapper[29612]: I0319 12:20:11.199268 29612 generic.go:334] "Generic (PLEG): container finished" podID="94826a62-065b-4056-9b0e-1aa98e930ef3" containerID="17581b7f7aed936206c6ab1f9060ffba8f56962384a8c4767b4665b8546186f7" exitCode=0 Mar 19 12:20:11.199348 master-0 kubenswrapper[29612]: I0319 12:20:11.199326 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" event={"ID":"94826a62-065b-4056-9b0e-1aa98e930ef3","Type":"ContainerDied","Data":"17581b7f7aed936206c6ab1f9060ffba8f56962384a8c4767b4665b8546186f7"} Mar 19 12:20:12.540775 master-0 kubenswrapper[29612]: I0319 12:20:12.540740 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:12.615874 master-0 kubenswrapper[29612]: I0319 12:20:12.615821 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-util\") pod \"94826a62-065b-4056-9b0e-1aa98e930ef3\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " Mar 19 12:20:12.616196 master-0 kubenswrapper[29612]: I0319 12:20:12.616182 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-bundle\") pod \"94826a62-065b-4056-9b0e-1aa98e930ef3\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " Mar 19 12:20:12.616313 master-0 kubenswrapper[29612]: I0319 12:20:12.616299 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg7dq\" (UniqueName: \"kubernetes.io/projected/94826a62-065b-4056-9b0e-1aa98e930ef3-kube-api-access-lg7dq\") pod \"94826a62-065b-4056-9b0e-1aa98e930ef3\" (UID: \"94826a62-065b-4056-9b0e-1aa98e930ef3\") " Mar 19 12:20:12.617109 master-0 kubenswrapper[29612]: I0319 12:20:12.617048 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-bundle" (OuterVolumeSpecName: "bundle") pod "94826a62-065b-4056-9b0e-1aa98e930ef3" (UID: "94826a62-065b-4056-9b0e-1aa98e930ef3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:20:12.620171 master-0 kubenswrapper[29612]: I0319 12:20:12.620111 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94826a62-065b-4056-9b0e-1aa98e930ef3-kube-api-access-lg7dq" (OuterVolumeSpecName: "kube-api-access-lg7dq") pod "94826a62-065b-4056-9b0e-1aa98e930ef3" (UID: "94826a62-065b-4056-9b0e-1aa98e930ef3"). InnerVolumeSpecName "kube-api-access-lg7dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:20:12.718479 master-0 kubenswrapper[29612]: I0319 12:20:12.718412 29612 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:12.718479 master-0 kubenswrapper[29612]: I0319 12:20:12.718462 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg7dq\" (UniqueName: \"kubernetes.io/projected/94826a62-065b-4056-9b0e-1aa98e930ef3-kube-api-access-lg7dq\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:12.966568 master-0 kubenswrapper[29612]: I0319 12:20:12.966483 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-util" (OuterVolumeSpecName: "util") pod "94826a62-065b-4056-9b0e-1aa98e930ef3" (UID: "94826a62-065b-4056-9b0e-1aa98e930ef3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:20:13.046188 master-0 kubenswrapper[29612]: I0319 12:20:13.045866 29612 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94826a62-065b-4056-9b0e-1aa98e930ef3-util\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:13.215473 master-0 kubenswrapper[29612]: I0319 12:20:13.215412 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" event={"ID":"94826a62-065b-4056-9b0e-1aa98e930ef3","Type":"ContainerDied","Data":"cdfda19cfed51815f38746137861942ff8e2f8e3c4b6b2d2d7ae3a4bf3a52c9a"} Mar 19 12:20:13.215473 master-0 kubenswrapper[29612]: I0319 12:20:13.215461 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdfda19cfed51815f38746137861942ff8e2f8e3c4b6b2d2d7ae3a4bf3a52c9a" Mar 19 12:20:13.215769 master-0 kubenswrapper[29612]: I0319 12:20:13.215521 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d42kpmc" Mar 19 12:20:18.852959 master-0 kubenswrapper[29612]: I0319 12:20:18.852899 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-b47dd7864-ctxpg"] Mar 19 12:20:18.853596 master-0 kubenswrapper[29612]: E0319 12:20:18.853215 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94826a62-065b-4056-9b0e-1aa98e930ef3" containerName="util" Mar 19 12:20:18.853596 master-0 kubenswrapper[29612]: I0319 12:20:18.853228 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="94826a62-065b-4056-9b0e-1aa98e930ef3" containerName="util" Mar 19 12:20:18.853596 master-0 kubenswrapper[29612]: E0319 12:20:18.853236 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94826a62-065b-4056-9b0e-1aa98e930ef3" containerName="pull" Mar 19 12:20:18.853596 master-0 kubenswrapper[29612]: I0319 12:20:18.853242 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="94826a62-065b-4056-9b0e-1aa98e930ef3" containerName="pull" Mar 19 12:20:18.853596 master-0 kubenswrapper[29612]: E0319 12:20:18.853254 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94826a62-065b-4056-9b0e-1aa98e930ef3" containerName="extract" Mar 19 12:20:18.853596 master-0 kubenswrapper[29612]: I0319 12:20:18.853261 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="94826a62-065b-4056-9b0e-1aa98e930ef3" containerName="extract" Mar 19 12:20:18.853596 master-0 kubenswrapper[29612]: I0319 12:20:18.853392 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="94826a62-065b-4056-9b0e-1aa98e930ef3" containerName="extract" Mar 19 12:20:18.853911 master-0 kubenswrapper[29612]: I0319 12:20:18.853886 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:18.856213 master-0 kubenswrapper[29612]: I0319 12:20:18.856177 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Mar 19 12:20:18.856492 master-0 kubenswrapper[29612]: I0319 12:20:18.856463 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Mar 19 12:20:18.856619 master-0 kubenswrapper[29612]: I0319 12:20:18.856596 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Mar 19 12:20:18.856761 master-0 kubenswrapper[29612]: I0319 12:20:18.856670 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Mar 19 12:20:18.857933 master-0 kubenswrapper[29612]: I0319 12:20:18.857893 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Mar 19 12:20:18.918787 master-0 kubenswrapper[29612]: I0319 12:20:18.918723 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-b47dd7864-ctxpg"] Mar 19 12:20:18.931572 master-0 kubenswrapper[29612]: I0319 12:20:18.931504 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/246e8151-7ef4-49fd-997d-62ca024c11a5-apiservice-cert\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:18.931786 master-0 kubenswrapper[29612]: I0319 12:20:18.931577 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/246e8151-7ef4-49fd-997d-62ca024c11a5-socket-dir\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: 
\"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:18.931786 master-0 kubenswrapper[29612]: I0319 12:20:18.931665 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/246e8151-7ef4-49fd-997d-62ca024c11a5-webhook-cert\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:18.931786 master-0 kubenswrapper[29612]: I0319 12:20:18.931701 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/246e8151-7ef4-49fd-997d-62ca024c11a5-metrics-cert\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:18.931786 master-0 kubenswrapper[29612]: I0319 12:20:18.931742 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk85l\" (UniqueName: \"kubernetes.io/projected/246e8151-7ef4-49fd-997d-62ca024c11a5-kube-api-access-pk85l\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.033697 master-0 kubenswrapper[29612]: I0319 12:20:19.033567 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/246e8151-7ef4-49fd-997d-62ca024c11a5-apiservice-cert\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.033697 master-0 kubenswrapper[29612]: I0319 12:20:19.033630 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/246e8151-7ef4-49fd-997d-62ca024c11a5-socket-dir\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.033998 master-0 kubenswrapper[29612]: I0319 12:20:19.033866 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/246e8151-7ef4-49fd-997d-62ca024c11a5-webhook-cert\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.033998 master-0 kubenswrapper[29612]: I0319 12:20:19.033968 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/246e8151-7ef4-49fd-997d-62ca024c11a5-metrics-cert\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.034194 master-0 kubenswrapper[29612]: I0319 12:20:19.034030 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk85l\" (UniqueName: \"kubernetes.io/projected/246e8151-7ef4-49fd-997d-62ca024c11a5-kube-api-access-pk85l\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.034409 master-0 kubenswrapper[29612]: I0319 12:20:19.034361 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/246e8151-7ef4-49fd-997d-62ca024c11a5-socket-dir\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.038666 master-0 kubenswrapper[29612]: I0319 12:20:19.037814 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/246e8151-7ef4-49fd-997d-62ca024c11a5-metrics-cert\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.038666 master-0 kubenswrapper[29612]: I0319 12:20:19.038051 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/246e8151-7ef4-49fd-997d-62ca024c11a5-webhook-cert\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.046811 master-0 kubenswrapper[29612]: I0319 12:20:19.045419 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/246e8151-7ef4-49fd-997d-62ca024c11a5-apiservice-cert\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.050575 master-0 kubenswrapper[29612]: I0319 12:20:19.050543 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk85l\" (UniqueName: \"kubernetes.io/projected/246e8151-7ef4-49fd-997d-62ca024c11a5-kube-api-access-pk85l\") pod \"lvms-operator-b47dd7864-ctxpg\" (UID: \"246e8151-7ef4-49fd-997d-62ca024c11a5\") " pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.168308 master-0 kubenswrapper[29612]: I0319 12:20:19.168245 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:19.655035 master-0 kubenswrapper[29612]: I0319 12:20:19.654974 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-b47dd7864-ctxpg"] Mar 19 12:20:20.272365 master-0 kubenswrapper[29612]: I0319 12:20:20.272282 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" event={"ID":"246e8151-7ef4-49fd-997d-62ca024c11a5","Type":"ContainerStarted","Data":"d36b3ecd07ade04e6e88c9a1430b4947088ec8ae1e4a5bebd833732f26a1b62f"} Mar 19 12:20:37.387917 master-0 kubenswrapper[29612]: I0319 12:20:37.387868 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" event={"ID":"246e8151-7ef4-49fd-997d-62ca024c11a5","Type":"ContainerStarted","Data":"0e1e560fbbfb6e31d934335446a394e57e2a5a31220f95f3e4633f729c26dc92"} Mar 19 12:20:37.388443 master-0 kubenswrapper[29612]: I0319 12:20:37.388112 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:37.391582 master-0 kubenswrapper[29612]: I0319 12:20:37.391548 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" Mar 19 12:20:37.413285 master-0 kubenswrapper[29612]: I0319 12:20:37.413016 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-b47dd7864-ctxpg" podStartSLOduration=2.680934182 podStartE2EDuration="19.412991991s" podCreationTimestamp="2026-03-19 12:20:18 +0000 UTC" firstStartedPulling="2026-03-19 12:20:19.662336674 +0000 UTC m=+626.976505695" lastFinishedPulling="2026-03-19 12:20:36.394394483 +0000 UTC m=+643.708563504" observedRunningTime="2026-03-19 12:20:37.4080437 +0000 UTC m=+644.722212721" watchObservedRunningTime="2026-03-19 12:20:37.412991991 +0000 UTC m=+644.727161032" Mar 19 
12:20:40.328682 master-0 kubenswrapper[29612]: I0319 12:20:40.322405 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2"] Mar 19 12:20:40.328682 master-0 kubenswrapper[29612]: I0319 12:20:40.326378 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.334840 master-0 kubenswrapper[29612]: I0319 12:20:40.334552 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-4dr9j" Mar 19 12:20:40.358210 master-0 kubenswrapper[29612]: I0319 12:20:40.358125 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2"] Mar 19 12:20:40.385214 master-0 kubenswrapper[29612]: I0319 12:20:40.385124 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgjpl\" (UniqueName: \"kubernetes.io/projected/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-kube-api-access-rgjpl\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.385486 master-0 kubenswrapper[29612]: I0319 12:20:40.385348 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.385619 master-0 kubenswrapper[29612]: I0319 12:20:40.385577 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.437032 master-0 kubenswrapper[29612]: I0319 12:20:40.435752 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf"] Mar 19 12:20:40.437276 master-0 kubenswrapper[29612]: I0319 12:20:40.437241 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:40.449276 master-0 kubenswrapper[29612]: I0319 12:20:40.448848 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf"] Mar 19 12:20:40.487216 master-0 kubenswrapper[29612]: I0319 12:20:40.487144 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgjpl\" (UniqueName: \"kubernetes.io/projected/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-kube-api-access-rgjpl\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.487216 master-0 kubenswrapper[29612]: I0319 12:20:40.487221 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:40.487508 master-0 kubenswrapper[29612]: I0319 12:20:40.487271 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.487508 master-0 kubenswrapper[29612]: I0319 12:20:40.487303 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.487508 master-0 kubenswrapper[29612]: I0319 12:20:40.487336 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:40.487508 master-0 kubenswrapper[29612]: I0319 12:20:40.487449 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn7lt\" (UniqueName: \"kubernetes.io/projected/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-kube-api-access-wn7lt\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:40.487828 master-0 kubenswrapper[29612]: I0319 12:20:40.487792 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.487938 master-0 kubenswrapper[29612]: I0319 12:20:40.487890 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.518680 master-0 kubenswrapper[29612]: I0319 12:20:40.517593 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgjpl\" (UniqueName: \"kubernetes.io/projected/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-kube-api-access-rgjpl\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.588796 master-0 kubenswrapper[29612]: I0319 12:20:40.588262 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:40.588796 master-0 
kubenswrapper[29612]: I0319 12:20:40.588319 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn7lt\" (UniqueName: \"kubernetes.io/projected/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-kube-api-access-wn7lt\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:40.588796 master-0 kubenswrapper[29612]: I0319 12:20:40.588705 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:40.589067 master-0 kubenswrapper[29612]: I0319 12:20:40.588868 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:40.589540 master-0 kubenswrapper[29612]: I0319 12:20:40.589473 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:40.610570 master-0 kubenswrapper[29612]: I0319 12:20:40.610507 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-wn7lt\" (UniqueName: \"kubernetes.io/projected/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-kube-api-access-wn7lt\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:40.657144 master-0 kubenswrapper[29612]: I0319 12:20:40.657037 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:40.753661 master-0 kubenswrapper[29612]: I0319 12:20:40.753582 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:41.101497 master-0 kubenswrapper[29612]: I0319 12:20:41.101445 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2"] Mar 19 12:20:41.179837 master-0 kubenswrapper[29612]: I0319 12:20:41.179766 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf"] Mar 19 12:20:41.183053 master-0 kubenswrapper[29612]: W0319 12:20:41.182990 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b6a144f_5f3b_4cf0_938d_7301bd4669d3.slice/crio-641cd28c8e886426278f50c2f61eb8eead4e0ec15fff3e796699bdeac16ec8f6 WatchSource:0}: Error finding container 641cd28c8e886426278f50c2f61eb8eead4e0ec15fff3e796699bdeac16ec8f6: Status 404 returned error can't find the container with id 641cd28c8e886426278f50c2f61eb8eead4e0ec15fff3e796699bdeac16ec8f6 Mar 19 12:20:41.240527 master-0 kubenswrapper[29612]: I0319 12:20:41.240361 29612 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d"] Mar 19 12:20:41.242146 master-0 kubenswrapper[29612]: I0319 12:20:41.242111 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:41.261296 master-0 kubenswrapper[29612]: I0319 12:20:41.260189 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d"] Mar 19 12:20:41.401247 master-0 kubenswrapper[29612]: I0319 12:20:41.401196 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:41.402107 master-0 kubenswrapper[29612]: I0319 12:20:41.401263 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:41.402107 master-0 kubenswrapper[29612]: I0319 12:20:41.401318 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvhg6\" (UniqueName: \"kubernetes.io/projected/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-kube-api-access-qvhg6\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:41.417826 master-0 kubenswrapper[29612]: I0319 12:20:41.417745 29612 generic.go:334] "Generic (PLEG): container finished" podID="19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" containerID="ed4d8600b44f1f7487a13ae066a963a5df5140fca0b8fdc4d380b2134cc882c1" exitCode=0 Mar 19 12:20:41.417826 master-0 kubenswrapper[29612]: I0319 12:20:41.417800 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" event={"ID":"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36","Type":"ContainerDied","Data":"ed4d8600b44f1f7487a13ae066a963a5df5140fca0b8fdc4d380b2134cc882c1"} Mar 19 12:20:41.418022 master-0 kubenswrapper[29612]: I0319 12:20:41.417842 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" event={"ID":"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36","Type":"ContainerStarted","Data":"28ea790d7eedc9169c32045d63b849096321f8fa5dd93472bb869f1ca179de4e"} Mar 19 12:20:41.419754 master-0 kubenswrapper[29612]: I0319 12:20:41.419627 29612 generic.go:334] "Generic (PLEG): container finished" podID="2b6a144f-5f3b-4cf0-938d-7301bd4669d3" containerID="18684f21e4bf12b52705ea6ed07f115aec09e07016b60e37539e4eabedb7116f" exitCode=0 Mar 19 12:20:41.419754 master-0 kubenswrapper[29612]: I0319 12:20:41.419691 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" event={"ID":"2b6a144f-5f3b-4cf0-938d-7301bd4669d3","Type":"ContainerDied","Data":"18684f21e4bf12b52705ea6ed07f115aec09e07016b60e37539e4eabedb7116f"} Mar 19 12:20:41.419754 master-0 kubenswrapper[29612]: I0319 12:20:41.419717 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" 
event={"ID":"2b6a144f-5f3b-4cf0-938d-7301bd4669d3","Type":"ContainerStarted","Data":"641cd28c8e886426278f50c2f61eb8eead4e0ec15fff3e796699bdeac16ec8f6"} Mar 19 12:20:41.503676 master-0 kubenswrapper[29612]: I0319 12:20:41.503253 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:41.503676 master-0 kubenswrapper[29612]: I0319 12:20:41.503345 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:41.503676 master-0 kubenswrapper[29612]: I0319 12:20:41.503372 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvhg6\" (UniqueName: \"kubernetes.io/projected/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-kube-api-access-qvhg6\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:41.503983 master-0 kubenswrapper[29612]: I0319 12:20:41.503959 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:41.504034 master-0 kubenswrapper[29612]: I0319 12:20:41.504003 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:41.519683 master-0 kubenswrapper[29612]: I0319 12:20:41.519616 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvhg6\" (UniqueName: \"kubernetes.io/projected/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-kube-api-access-qvhg6\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:41.571198 master-0 kubenswrapper[29612]: I0319 12:20:41.571143 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:42.135240 master-0 kubenswrapper[29612]: I0319 12:20:42.134694 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d"] Mar 19 12:20:42.429595 master-0 kubenswrapper[29612]: I0319 12:20:42.429486 29612 generic.go:334] "Generic (PLEG): container finished" podID="cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" containerID="808713f8227a597daf02286d4fedc79893228c4a9d320cf38b166773185e418a" exitCode=0 Mar 19 12:20:42.429595 master-0 kubenswrapper[29612]: I0319 12:20:42.429572 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" event={"ID":"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8","Type":"ContainerDied","Data":"808713f8227a597daf02286d4fedc79893228c4a9d320cf38b166773185e418a"} Mar 19 12:20:42.430183 master-0 kubenswrapper[29612]: I0319 12:20:42.429616 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" event={"ID":"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8","Type":"ContainerStarted","Data":"1bbf59cae650d282152f858982acb34952aebb5cd054d1b5ee2c3065d5aa0ba9"} Mar 19 12:20:43.443277 master-0 kubenswrapper[29612]: I0319 12:20:43.443223 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" event={"ID":"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36","Type":"ContainerStarted","Data":"63542813f0c9c059de014705d0d5a3e2542e0450fd55a48b751aef11420b6e8f"} Mar 19 12:20:45.459057 master-0 kubenswrapper[29612]: I0319 12:20:45.459007 29612 generic.go:334] "Generic (PLEG): container finished" podID="19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" containerID="63542813f0c9c059de014705d0d5a3e2542e0450fd55a48b751aef11420b6e8f" 
exitCode=0 Mar 19 12:20:45.459620 master-0 kubenswrapper[29612]: I0319 12:20:45.459086 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" event={"ID":"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36","Type":"ContainerDied","Data":"63542813f0c9c059de014705d0d5a3e2542e0450fd55a48b751aef11420b6e8f"} Mar 19 12:20:45.462011 master-0 kubenswrapper[29612]: I0319 12:20:45.461983 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" event={"ID":"2b6a144f-5f3b-4cf0-938d-7301bd4669d3","Type":"ContainerStarted","Data":"cca575ffe3fbe3b9486510c415727b6a0a5b507e471715e00b0e7bfc15a4eac1"} Mar 19 12:20:46.471361 master-0 kubenswrapper[29612]: I0319 12:20:46.471313 29612 generic.go:334] "Generic (PLEG): container finished" podID="2b6a144f-5f3b-4cf0-938d-7301bd4669d3" containerID="cca575ffe3fbe3b9486510c415727b6a0a5b507e471715e00b0e7bfc15a4eac1" exitCode=0 Mar 19 12:20:46.471877 master-0 kubenswrapper[29612]: I0319 12:20:46.471403 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" event={"ID":"2b6a144f-5f3b-4cf0-938d-7301bd4669d3","Type":"ContainerDied","Data":"cca575ffe3fbe3b9486510c415727b6a0a5b507e471715e00b0e7bfc15a4eac1"} Mar 19 12:20:46.473497 master-0 kubenswrapper[29612]: I0319 12:20:46.473455 29612 generic.go:334] "Generic (PLEG): container finished" podID="cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" containerID="9416e4344a7928c57dad7607065f1fec4bed2a091d98ccc6d0079c4eb2e807a8" exitCode=0 Mar 19 12:20:46.473567 master-0 kubenswrapper[29612]: I0319 12:20:46.473544 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" 
event={"ID":"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8","Type":"ContainerDied","Data":"9416e4344a7928c57dad7607065f1fec4bed2a091d98ccc6d0079c4eb2e807a8"} Mar 19 12:20:46.475617 master-0 kubenswrapper[29612]: I0319 12:20:46.475594 29612 generic.go:334] "Generic (PLEG): container finished" podID="19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" containerID="742005e08ac54101b17d42229c7a4819a0ec6b28b82f58f8b3d0c6905477bcd9" exitCode=0 Mar 19 12:20:46.475696 master-0 kubenswrapper[29612]: I0319 12:20:46.475624 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" event={"ID":"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36","Type":"ContainerDied","Data":"742005e08ac54101b17d42229c7a4819a0ec6b28b82f58f8b3d0c6905477bcd9"} Mar 19 12:20:47.492180 master-0 kubenswrapper[29612]: I0319 12:20:47.492100 29612 generic.go:334] "Generic (PLEG): container finished" podID="2b6a144f-5f3b-4cf0-938d-7301bd4669d3" containerID="3a79b583aa72f881691829162022368ac77c6e877d68fb954eb4aebe95ce06d5" exitCode=0 Mar 19 12:20:47.492936 master-0 kubenswrapper[29612]: I0319 12:20:47.492265 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" event={"ID":"2b6a144f-5f3b-4cf0-938d-7301bd4669d3","Type":"ContainerDied","Data":"3a79b583aa72f881691829162022368ac77c6e877d68fb954eb4aebe95ce06d5"} Mar 19 12:20:47.496956 master-0 kubenswrapper[29612]: I0319 12:20:47.496911 29612 generic.go:334] "Generic (PLEG): container finished" podID="cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" containerID="5725898c98b1dfa58d54a51a65e762f8adb35966cd6c5230c72f0c8aac395808" exitCode=0 Mar 19 12:20:47.497245 master-0 kubenswrapper[29612]: I0319 12:20:47.497211 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" 
event={"ID":"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8","Type":"ContainerDied","Data":"5725898c98b1dfa58d54a51a65e762f8adb35966cd6c5230c72f0c8aac395808"} Mar 19 12:20:47.919178 master-0 kubenswrapper[29612]: I0319 12:20:47.917170 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:48.107178 master-0 kubenswrapper[29612]: I0319 12:20:48.107036 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx"] Mar 19 12:20:48.107399 master-0 kubenswrapper[29612]: E0319 12:20:48.107375 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" containerName="extract" Mar 19 12:20:48.107399 master-0 kubenswrapper[29612]: I0319 12:20:48.107391 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" containerName="extract" Mar 19 12:20:48.107500 master-0 kubenswrapper[29612]: E0319 12:20:48.107415 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" containerName="pull" Mar 19 12:20:48.107500 master-0 kubenswrapper[29612]: I0319 12:20:48.107425 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" containerName="pull" Mar 19 12:20:48.107500 master-0 kubenswrapper[29612]: E0319 12:20:48.107442 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" containerName="util" Mar 19 12:20:48.107500 master-0 kubenswrapper[29612]: I0319 12:20:48.107452 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" containerName="util" Mar 19 12:20:48.107704 master-0 kubenswrapper[29612]: I0319 12:20:48.107656 29612 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" containerName="extract" Mar 19 12:20:48.108968 master-0 kubenswrapper[29612]: I0319 12:20:48.108936 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.119590 master-0 kubenswrapper[29612]: I0319 12:20:48.119519 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-bundle\") pod \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " Mar 19 12:20:48.119859 master-0 kubenswrapper[29612]: I0319 12:20:48.119677 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-util\") pod \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " Mar 19 12:20:48.119859 master-0 kubenswrapper[29612]: I0319 12:20:48.119784 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgjpl\" (UniqueName: \"kubernetes.io/projected/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-kube-api-access-rgjpl\") pod \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\" (UID: \"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36\") " Mar 19 12:20:48.120185 master-0 kubenswrapper[29612]: I0319 12:20:48.120146 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.120252 master-0 kubenswrapper[29612]: I0319 12:20:48.120211 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.120316 master-0 kubenswrapper[29612]: I0319 12:20:48.120258 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr6t2\" (UniqueName: \"kubernetes.io/projected/ed688952-23da-441c-9f80-1c9f24ad5e39-kube-api-access-cr6t2\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.120814 master-0 kubenswrapper[29612]: I0319 12:20:48.120689 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-bundle" (OuterVolumeSpecName: "bundle") pod "19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" (UID: "19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:20:48.132539 master-0 kubenswrapper[29612]: I0319 12:20:48.132491 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-kube-api-access-rgjpl" (OuterVolumeSpecName: "kube-api-access-rgjpl") pod "19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" (UID: "19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36"). InnerVolumeSpecName "kube-api-access-rgjpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:20:48.144502 master-0 kubenswrapper[29612]: I0319 12:20:48.144428 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-util" (OuterVolumeSpecName: "util") pod "19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36" (UID: "19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:20:48.162175 master-0 kubenswrapper[29612]: I0319 12:20:48.162103 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx"] Mar 19 12:20:48.221568 master-0 kubenswrapper[29612]: I0319 12:20:48.221497 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.221809 master-0 kubenswrapper[29612]: I0319 12:20:48.221596 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr6t2\" (UniqueName: \"kubernetes.io/projected/ed688952-23da-441c-9f80-1c9f24ad5e39-kube-api-access-cr6t2\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.221809 master-0 kubenswrapper[29612]: I0319 12:20:48.221721 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx\" 
(UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.221809 master-0 kubenswrapper[29612]: I0319 12:20:48.221782 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgjpl\" (UniqueName: \"kubernetes.io/projected/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-kube-api-access-rgjpl\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:48.221809 master-0 kubenswrapper[29612]: I0319 12:20:48.221798 29612 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:48.221809 master-0 kubenswrapper[29612]: I0319 12:20:48.221811 29612 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36-util\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:48.222076 master-0 kubenswrapper[29612]: I0319 12:20:48.222039 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.222238 master-0 kubenswrapper[29612]: I0319 12:20:48.222205 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.243780 master-0 kubenswrapper[29612]: I0319 12:20:48.238522 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr6t2\" (UniqueName: \"kubernetes.io/projected/ed688952-23da-441c-9f80-1c9f24ad5e39-kube-api-access-cr6t2\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.430423 master-0 kubenswrapper[29612]: I0319 12:20:48.430148 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:48.510222 master-0 kubenswrapper[29612]: I0319 12:20:48.510111 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" event={"ID":"19e5d9b7-46c2-4cd2-ac44-4ff6a72c6c36","Type":"ContainerDied","Data":"28ea790d7eedc9169c32045d63b849096321f8fa5dd93472bb869f1ca179de4e"} Mar 19 12:20:48.510222 master-0 kubenswrapper[29612]: I0319 12:20:48.510154 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xv7t2" Mar 19 12:20:48.510222 master-0 kubenswrapper[29612]: I0319 12:20:48.510161 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28ea790d7eedc9169c32045d63b849096321f8fa5dd93472bb869f1ca179de4e" Mar 19 12:20:48.902377 master-0 kubenswrapper[29612]: I0319 12:20:48.902338 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx"] Mar 19 12:20:48.920820 master-0 kubenswrapper[29612]: W0319 12:20:48.920765 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded688952_23da_441c_9f80_1c9f24ad5e39.slice/crio-66a63eb163be0eaa2afe831360f44b5d4159882e620750f55d0b067c9ba9c098 WatchSource:0}: Error finding container 66a63eb163be0eaa2afe831360f44b5d4159882e620750f55d0b067c9ba9c098: Status 404 returned error can't find the container with id 66a63eb163be0eaa2afe831360f44b5d4159882e620750f55d0b067c9ba9c098 Mar 19 12:20:49.074054 master-0 kubenswrapper[29612]: I0319 12:20:49.073991 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:49.076211 master-0 kubenswrapper[29612]: I0319 12:20:49.076166 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:49.156611 master-0 kubenswrapper[29612]: I0319 12:20:49.156534 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn7lt\" (UniqueName: \"kubernetes.io/projected/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-kube-api-access-wn7lt\") pod \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " Mar 19 12:20:49.156911 master-0 kubenswrapper[29612]: I0319 12:20:49.156866 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-util\") pod \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " Mar 19 12:20:49.156951 master-0 kubenswrapper[29612]: I0319 12:20:49.156916 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-bundle\") pod \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\" (UID: \"2b6a144f-5f3b-4cf0-938d-7301bd4669d3\") " Mar 19 12:20:49.156951 master-0 kubenswrapper[29612]: I0319 12:20:49.156939 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-util\") pod \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " Mar 19 12:20:49.157020 master-0 kubenswrapper[29612]: I0319 12:20:49.156954 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-bundle\") pod \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " Mar 19 12:20:49.157020 master-0 kubenswrapper[29612]: I0319 12:20:49.156985 29612 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvhg6\" (UniqueName: \"kubernetes.io/projected/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-kube-api-access-qvhg6\") pod \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\" (UID: \"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8\") " Mar 19 12:20:49.158262 master-0 kubenswrapper[29612]: I0319 12:20:49.157865 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-bundle" (OuterVolumeSpecName: "bundle") pod "cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" (UID: "cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:20:49.158506 master-0 kubenswrapper[29612]: I0319 12:20:49.158443 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-bundle" (OuterVolumeSpecName: "bundle") pod "2b6a144f-5f3b-4cf0-938d-7301bd4669d3" (UID: "2b6a144f-5f3b-4cf0-938d-7301bd4669d3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:20:49.159995 master-0 kubenswrapper[29612]: I0319 12:20:49.159956 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-kube-api-access-wn7lt" (OuterVolumeSpecName: "kube-api-access-wn7lt") pod "2b6a144f-5f3b-4cf0-938d-7301bd4669d3" (UID: "2b6a144f-5f3b-4cf0-938d-7301bd4669d3"). InnerVolumeSpecName "kube-api-access-wn7lt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:20:49.161026 master-0 kubenswrapper[29612]: I0319 12:20:49.160998 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-kube-api-access-qvhg6" (OuterVolumeSpecName: "kube-api-access-qvhg6") pod "cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" (UID: "cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8"). InnerVolumeSpecName "kube-api-access-qvhg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:20:49.177715 master-0 kubenswrapper[29612]: I0319 12:20:49.174948 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-util" (OuterVolumeSpecName: "util") pod "cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" (UID: "cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:20:49.181492 master-0 kubenswrapper[29612]: I0319 12:20:49.181418 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-util" (OuterVolumeSpecName: "util") pod "2b6a144f-5f3b-4cf0-938d-7301bd4669d3" (UID: "2b6a144f-5f3b-4cf0-938d-7301bd4669d3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:20:49.258391 master-0 kubenswrapper[29612]: I0319 12:20:49.258354 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn7lt\" (UniqueName: \"kubernetes.io/projected/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-kube-api-access-wn7lt\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:49.258391 master-0 kubenswrapper[29612]: I0319 12:20:49.258390 29612 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-util\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:49.258391 master-0 kubenswrapper[29612]: I0319 12:20:49.258401 29612 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b6a144f-5f3b-4cf0-938d-7301bd4669d3-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:49.258391 master-0 kubenswrapper[29612]: I0319 12:20:49.258410 29612 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-util\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:49.258774 master-0 kubenswrapper[29612]: I0319 12:20:49.258418 29612 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:49.258774 master-0 kubenswrapper[29612]: I0319 12:20:49.258426 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvhg6\" (UniqueName: \"kubernetes.io/projected/cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8-kube-api-access-qvhg6\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:49.519741 master-0 kubenswrapper[29612]: I0319 12:20:49.519610 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" 
event={"ID":"cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8","Type":"ContainerDied","Data":"1bbf59cae650d282152f858982acb34952aebb5cd054d1b5ee2c3065d5aa0ba9"} Mar 19 12:20:49.519741 master-0 kubenswrapper[29612]: I0319 12:20:49.519672 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bbf59cae650d282152f858982acb34952aebb5cd054d1b5ee2c3065d5aa0ba9" Mar 19 12:20:49.520254 master-0 kubenswrapper[29612]: I0319 12:20:49.520236 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xjh2d" Mar 19 12:20:49.522421 master-0 kubenswrapper[29612]: I0319 12:20:49.522371 29612 generic.go:334] "Generic (PLEG): container finished" podID="ed688952-23da-441c-9f80-1c9f24ad5e39" containerID="bbdc063a0a3b0b6afa89d27b8f24f4f9aac096fe9915ea97e28b7e1fe97974b5" exitCode=0 Mar 19 12:20:49.522528 master-0 kubenswrapper[29612]: I0319 12:20:49.522429 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" event={"ID":"ed688952-23da-441c-9f80-1c9f24ad5e39","Type":"ContainerDied","Data":"bbdc063a0a3b0b6afa89d27b8f24f4f9aac096fe9915ea97e28b7e1fe97974b5"} Mar 19 12:20:49.523073 master-0 kubenswrapper[29612]: I0319 12:20:49.523009 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" event={"ID":"ed688952-23da-441c-9f80-1c9f24ad5e39","Type":"ContainerStarted","Data":"66a63eb163be0eaa2afe831360f44b5d4159882e620750f55d0b067c9ba9c098"} Mar 19 12:20:49.525495 master-0 kubenswrapper[29612]: I0319 12:20:49.525437 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" 
event={"ID":"2b6a144f-5f3b-4cf0-938d-7301bd4669d3","Type":"ContainerDied","Data":"641cd28c8e886426278f50c2f61eb8eead4e0ec15fff3e796699bdeac16ec8f6"} Mar 19 12:20:49.525554 master-0 kubenswrapper[29612]: I0319 12:20:49.525512 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="641cd28c8e886426278f50c2f61eb8eead4e0ec15fff3e796699bdeac16ec8f6" Mar 19 12:20:49.525660 master-0 kubenswrapper[29612]: I0319 12:20:49.525627 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5rl9gf" Mar 19 12:20:52.554008 master-0 kubenswrapper[29612]: I0319 12:20:52.553935 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" event={"ID":"ed688952-23da-441c-9f80-1c9f24ad5e39","Type":"ContainerStarted","Data":"7b86a4179d0ef21dc74fd6ba694f44ac06ad3b3aee149ad3b49866f823cae22a"} Mar 19 12:20:53.563936 master-0 kubenswrapper[29612]: I0319 12:20:53.563882 29612 generic.go:334] "Generic (PLEG): container finished" podID="ed688952-23da-441c-9f80-1c9f24ad5e39" containerID="7b86a4179d0ef21dc74fd6ba694f44ac06ad3b3aee149ad3b49866f823cae22a" exitCode=0 Mar 19 12:20:53.563936 master-0 kubenswrapper[29612]: I0319 12:20:53.563930 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" event={"ID":"ed688952-23da-441c-9f80-1c9f24ad5e39","Type":"ContainerDied","Data":"7b86a4179d0ef21dc74fd6ba694f44ac06ad3b3aee149ad3b49866f823cae22a"} Mar 19 12:20:54.574787 master-0 kubenswrapper[29612]: I0319 12:20:54.574727 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" 
event={"ID":"ed688952-23da-441c-9f80-1c9f24ad5e39","Type":"ContainerStarted","Data":"187cfa3985ab394267fc1419c5fc3d91136e1b7eb15d7a568122ca4c05d41871"} Mar 19 12:20:55.291886 master-0 kubenswrapper[29612]: I0319 12:20:55.291769 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" podStartSLOduration=4.506982149 podStartE2EDuration="7.291747113s" podCreationTimestamp="2026-03-19 12:20:48 +0000 UTC" firstStartedPulling="2026-03-19 12:20:49.524344675 +0000 UTC m=+656.838513696" lastFinishedPulling="2026-03-19 12:20:52.309109619 +0000 UTC m=+659.623278660" observedRunningTime="2026-03-19 12:20:55.287176584 +0000 UTC m=+662.601345605" watchObservedRunningTime="2026-03-19 12:20:55.291747113 +0000 UTC m=+662.605916134" Mar 19 12:20:55.584117 master-0 kubenswrapper[29612]: I0319 12:20:55.583988 29612 generic.go:334] "Generic (PLEG): container finished" podID="ed688952-23da-441c-9f80-1c9f24ad5e39" containerID="187cfa3985ab394267fc1419c5fc3d91136e1b7eb15d7a568122ca4c05d41871" exitCode=0 Mar 19 12:20:55.584117 master-0 kubenswrapper[29612]: I0319 12:20:55.584042 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" event={"ID":"ed688952-23da-441c-9f80-1c9f24ad5e39","Type":"ContainerDied","Data":"187cfa3985ab394267fc1419c5fc3d91136e1b7eb15d7a568122ca4c05d41871"} Mar 19 12:20:56.992665 master-0 kubenswrapper[29612]: I0319 12:20:56.990891 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:20:57.090798 master-0 kubenswrapper[29612]: I0319 12:20:57.088120 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr6t2\" (UniqueName: \"kubernetes.io/projected/ed688952-23da-441c-9f80-1c9f24ad5e39-kube-api-access-cr6t2\") pod \"ed688952-23da-441c-9f80-1c9f24ad5e39\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " Mar 19 12:20:57.090798 master-0 kubenswrapper[29612]: I0319 12:20:57.088252 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-util\") pod \"ed688952-23da-441c-9f80-1c9f24ad5e39\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " Mar 19 12:20:57.090798 master-0 kubenswrapper[29612]: I0319 12:20:57.088403 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-bundle\") pod \"ed688952-23da-441c-9f80-1c9f24ad5e39\" (UID: \"ed688952-23da-441c-9f80-1c9f24ad5e39\") " Mar 19 12:20:57.091101 master-0 kubenswrapper[29612]: I0319 12:20:57.090887 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-bundle" (OuterVolumeSpecName: "bundle") pod "ed688952-23da-441c-9f80-1c9f24ad5e39" (UID: "ed688952-23da-441c-9f80-1c9f24ad5e39"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:20:57.092066 master-0 kubenswrapper[29612]: I0319 12:20:57.092008 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed688952-23da-441c-9f80-1c9f24ad5e39-kube-api-access-cr6t2" (OuterVolumeSpecName: "kube-api-access-cr6t2") pod "ed688952-23da-441c-9f80-1c9f24ad5e39" (UID: "ed688952-23da-441c-9f80-1c9f24ad5e39"). InnerVolumeSpecName "kube-api-access-cr6t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:20:57.100967 master-0 kubenswrapper[29612]: I0319 12:20:57.100911 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-util" (OuterVolumeSpecName: "util") pod "ed688952-23da-441c-9f80-1c9f24ad5e39" (UID: "ed688952-23da-441c-9f80-1c9f24ad5e39"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:20:57.191717 master-0 kubenswrapper[29612]: I0319 12:20:57.190211 29612 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:57.191717 master-0 kubenswrapper[29612]: I0319 12:20:57.190250 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr6t2\" (UniqueName: \"kubernetes.io/projected/ed688952-23da-441c-9f80-1c9f24ad5e39-kube-api-access-cr6t2\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:57.191717 master-0 kubenswrapper[29612]: I0319 12:20:57.190262 29612 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed688952-23da-441c-9f80-1c9f24ad5e39-util\") on node \"master-0\" DevicePath \"\"" Mar 19 12:20:57.599502 master-0 kubenswrapper[29612]: I0319 12:20:57.599441 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" event={"ID":"ed688952-23da-441c-9f80-1c9f24ad5e39","Type":"ContainerDied","Data":"66a63eb163be0eaa2afe831360f44b5d4159882e620750f55d0b067c9ba9c098"} Mar 19 12:20:57.599502 master-0 kubenswrapper[29612]: I0319 12:20:57.599487 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66a63eb163be0eaa2afe831360f44b5d4159882e620750f55d0b067c9ba9c098" Mar 19 12:20:57.599502 master-0 kubenswrapper[29612]: I0319 12:20:57.599504 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726nv6mx" Mar 19 12:21:01.847730 master-0 kubenswrapper[29612]: I0319 12:21:01.847669 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp"] Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: E0319 12:21:01.847973 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6a144f-5f3b-4cf0-938d-7301bd4669d3" containerName="extract" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: I0319 12:21:01.847987 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6a144f-5f3b-4cf0-938d-7301bd4669d3" containerName="extract" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: E0319 12:21:01.848010 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" containerName="extract" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: I0319 12:21:01.848017 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" containerName="extract" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: E0319 12:21:01.848039 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed688952-23da-441c-9f80-1c9f24ad5e39" containerName="util" Mar 19 12:21:01.848260 master-0 
kubenswrapper[29612]: I0319 12:21:01.848047 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed688952-23da-441c-9f80-1c9f24ad5e39" containerName="util" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: E0319 12:21:01.848059 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed688952-23da-441c-9f80-1c9f24ad5e39" containerName="extract" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: I0319 12:21:01.848066 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed688952-23da-441c-9f80-1c9f24ad5e39" containerName="extract" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: E0319 12:21:01.848084 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6a144f-5f3b-4cf0-938d-7301bd4669d3" containerName="util" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: I0319 12:21:01.848091 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6a144f-5f3b-4cf0-938d-7301bd4669d3" containerName="util" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: E0319 12:21:01.848099 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" containerName="pull" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: I0319 12:21:01.848105 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" containerName="pull" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: E0319 12:21:01.848127 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" containerName="util" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: I0319 12:21:01.848133 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" containerName="util" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: E0319 12:21:01.848146 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed688952-23da-441c-9f80-1c9f24ad5e39" containerName="pull" Mar 19 
12:21:01.848260 master-0 kubenswrapper[29612]: I0319 12:21:01.848152 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed688952-23da-441c-9f80-1c9f24ad5e39" containerName="pull" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: E0319 12:21:01.848165 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6a144f-5f3b-4cf0-938d-7301bd4669d3" containerName="pull" Mar 19 12:21:01.848260 master-0 kubenswrapper[29612]: I0319 12:21:01.848171 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6a144f-5f3b-4cf0-938d-7301bd4669d3" containerName="pull" Mar 19 12:21:01.848921 master-0 kubenswrapper[29612]: I0319 12:21:01.848296 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf9844d-ca29-4e9a-b46a-4d7b67e7c3c8" containerName="extract" Mar 19 12:21:01.848921 master-0 kubenswrapper[29612]: I0319 12:21:01.848312 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed688952-23da-441c-9f80-1c9f24ad5e39" containerName="extract" Mar 19 12:21:01.848921 master-0 kubenswrapper[29612]: I0319 12:21:01.848345 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6a144f-5f3b-4cf0-938d-7301bd4669d3" containerName="extract" Mar 19 12:21:01.848921 master-0 kubenswrapper[29612]: I0319 12:21:01.848900 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" Mar 19 12:21:01.851185 master-0 kubenswrapper[29612]: I0319 12:21:01.851106 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 19 12:21:01.851686 master-0 kubenswrapper[29612]: I0319 12:21:01.851658 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 19 12:21:01.871041 master-0 kubenswrapper[29612]: I0319 12:21:01.870981 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp"] Mar 19 12:21:01.960602 master-0 kubenswrapper[29612]: I0319 12:21:01.960528 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4630d6c9-4ed9-4d9a-9266-ca7f4970a337-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j7llp\" (UID: \"4630d6c9-4ed9-4d9a-9266-ca7f4970a337\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" Mar 19 12:21:01.960845 master-0 kubenswrapper[29612]: I0319 12:21:01.960612 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kghp\" (UniqueName: \"kubernetes.io/projected/4630d6c9-4ed9-4d9a-9266-ca7f4970a337-kube-api-access-9kghp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j7llp\" (UID: \"4630d6c9-4ed9-4d9a-9266-ca7f4970a337\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" Mar 19 12:21:02.061981 master-0 kubenswrapper[29612]: I0319 12:21:02.061912 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4630d6c9-4ed9-4d9a-9266-ca7f4970a337-tmp\") pod 
\"cert-manager-operator-controller-manager-66c8bdd694-j7llp\" (UID: \"4630d6c9-4ed9-4d9a-9266-ca7f4970a337\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" Mar 19 12:21:02.061981 master-0 kubenswrapper[29612]: I0319 12:21:02.061986 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kghp\" (UniqueName: \"kubernetes.io/projected/4630d6c9-4ed9-4d9a-9266-ca7f4970a337-kube-api-access-9kghp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j7llp\" (UID: \"4630d6c9-4ed9-4d9a-9266-ca7f4970a337\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" Mar 19 12:21:02.062572 master-0 kubenswrapper[29612]: I0319 12:21:02.062542 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4630d6c9-4ed9-4d9a-9266-ca7f4970a337-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j7llp\" (UID: \"4630d6c9-4ed9-4d9a-9266-ca7f4970a337\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" Mar 19 12:21:02.084651 master-0 kubenswrapper[29612]: I0319 12:21:02.084577 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kghp\" (UniqueName: \"kubernetes.io/projected/4630d6c9-4ed9-4d9a-9266-ca7f4970a337-kube-api-access-9kghp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-j7llp\" (UID: \"4630d6c9-4ed9-4d9a-9266-ca7f4970a337\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" Mar 19 12:21:02.174858 master-0 kubenswrapper[29612]: I0319 12:21:02.174769 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" Mar 19 12:21:02.591676 master-0 kubenswrapper[29612]: I0319 12:21:02.591088 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp"] Mar 19 12:21:02.663210 master-0 kubenswrapper[29612]: I0319 12:21:02.661838 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" event={"ID":"4630d6c9-4ed9-4d9a-9266-ca7f4970a337","Type":"ContainerStarted","Data":"8e0f6dfadbefb14c738b0ea8d2d31fad9888cb4ec4158da9ca9399aea5603eb1"} Mar 19 12:21:04.563024 master-0 kubenswrapper[29612]: I0319 12:21:04.562940 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j"] Mar 19 12:21:04.564004 master-0 kubenswrapper[29612]: I0319 12:21:04.563933 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j" Mar 19 12:21:04.569333 master-0 kubenswrapper[29612]: I0319 12:21:04.569264 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 19 12:21:04.569333 master-0 kubenswrapper[29612]: I0319 12:21:04.569229 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 19 12:21:04.583072 master-0 kubenswrapper[29612]: I0319 12:21:04.583014 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j"] Mar 19 12:21:04.613711 master-0 kubenswrapper[29612]: I0319 12:21:04.613615 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8gcw\" (UniqueName: \"kubernetes.io/projected/df7823bf-2d07-4834-9afe-d8427193bcbe-kube-api-access-w8gcw\") pod \"nmstate-operator-796d4cfff4-4pg9j\" (UID: \"df7823bf-2d07-4834-9afe-d8427193bcbe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j" Mar 19 12:21:04.716081 master-0 kubenswrapper[29612]: I0319 12:21:04.716025 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8gcw\" (UniqueName: \"kubernetes.io/projected/df7823bf-2d07-4834-9afe-d8427193bcbe-kube-api-access-w8gcw\") pod \"nmstate-operator-796d4cfff4-4pg9j\" (UID: \"df7823bf-2d07-4834-9afe-d8427193bcbe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j" Mar 19 12:21:04.750870 master-0 kubenswrapper[29612]: I0319 12:21:04.746488 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8gcw\" (UniqueName: \"kubernetes.io/projected/df7823bf-2d07-4834-9afe-d8427193bcbe-kube-api-access-w8gcw\") pod \"nmstate-operator-796d4cfff4-4pg9j\" (UID: \"df7823bf-2d07-4834-9afe-d8427193bcbe\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j" Mar 19 12:21:04.897183 
master-0 kubenswrapper[29612]: I0319 12:21:04.897106 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j" Mar 19 12:21:05.389352 master-0 kubenswrapper[29612]: I0319 12:21:05.389084 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j"] Mar 19 12:21:06.847075 master-0 kubenswrapper[29612]: W0319 12:21:06.847000 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf7823bf_2d07_4834_9afe_d8427193bcbe.slice/crio-8d81b59c0ba31209108e351aee416dc1aeffbe04c3c2a21066bbf97d65dbb08b WatchSource:0}: Error finding container 8d81b59c0ba31209108e351aee416dc1aeffbe04c3c2a21066bbf97d65dbb08b: Status 404 returned error can't find the container with id 8d81b59c0ba31209108e351aee416dc1aeffbe04c3c2a21066bbf97d65dbb08b Mar 19 12:21:07.713901 master-0 kubenswrapper[29612]: I0319 12:21:07.713808 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j" event={"ID":"df7823bf-2d07-4834-9afe-d8427193bcbe","Type":"ContainerStarted","Data":"8d81b59c0ba31209108e351aee416dc1aeffbe04c3c2a21066bbf97d65dbb08b"} Mar 19 12:21:08.730305 master-0 kubenswrapper[29612]: I0319 12:21:08.729880 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" event={"ID":"4630d6c9-4ed9-4d9a-9266-ca7f4970a337","Type":"ContainerStarted","Data":"41de07600fadeb9d2451c26372e4c802f02e97754530ac63ae40010fda90780d"} Mar 19 12:21:08.912504 master-0 kubenswrapper[29612]: I0319 12:21:08.912407 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-j7llp" podStartSLOduration=2.327598798 podStartE2EDuration="7.912391252s" podCreationTimestamp="2026-03-19 12:21:01 +0000 UTC" 
firstStartedPulling="2026-03-19 12:21:02.586363707 +0000 UTC m=+669.900532728" lastFinishedPulling="2026-03-19 12:21:08.171156161 +0000 UTC m=+675.485325182" observedRunningTime="2026-03-19 12:21:08.910283052 +0000 UTC m=+676.224452073" watchObservedRunningTime="2026-03-19 12:21:08.912391252 +0000 UTC m=+676.226560273" Mar 19 12:21:12.784366 master-0 kubenswrapper[29612]: I0319 12:21:12.784312 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j" event={"ID":"df7823bf-2d07-4834-9afe-d8427193bcbe","Type":"ContainerStarted","Data":"63ecc749c2b84fb447bddee940d7b3fa1950fc1b1ceef4c8400e6cd02873aa0b"} Mar 19 12:21:12.819229 master-0 kubenswrapper[29612]: I0319 12:21:12.819168 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bp5fk"] Mar 19 12:21:12.820868 master-0 kubenswrapper[29612]: I0319 12:21:12.820845 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" Mar 19 12:21:12.823983 master-0 kubenswrapper[29612]: I0319 12:21:12.823949 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 12:21:12.824480 master-0 kubenswrapper[29612]: I0319 12:21:12.824211 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 12:21:12.854880 master-0 kubenswrapper[29612]: I0319 12:21:12.854801 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqh4k\" (UniqueName: \"kubernetes.io/projected/2d19589e-6eec-4a65-9799-76abd82f3b20-kube-api-access-nqh4k\") pod \"cert-manager-webhook-6888856db4-bp5fk\" (UID: \"2d19589e-6eec-4a65-9799-76abd82f3b20\") " pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" Mar 19 12:21:12.855151 master-0 kubenswrapper[29612]: I0319 12:21:12.854973 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d19589e-6eec-4a65-9799-76abd82f3b20-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bp5fk\" (UID: \"2d19589e-6eec-4a65-9799-76abd82f3b20\") " pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" Mar 19 12:21:12.863794 master-0 kubenswrapper[29612]: I0319 12:21:12.863743 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bp5fk"] Mar 19 12:21:12.867878 master-0 kubenswrapper[29612]: I0319 12:21:12.867795 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4pg9j" podStartSLOduration=3.925175247 podStartE2EDuration="8.867773732s" podCreationTimestamp="2026-03-19 12:21:04 +0000 UTC" firstStartedPulling="2026-03-19 12:21:06.849382855 +0000 UTC m=+674.163551916" lastFinishedPulling="2026-03-19 12:21:11.79198138 +0000 UTC m=+679.106150401" observedRunningTime="2026-03-19 12:21:12.863263204 +0000 UTC m=+680.177432235" watchObservedRunningTime="2026-03-19 12:21:12.867773732 +0000 UTC m=+680.181942763" Mar 19 12:21:12.959667 master-0 kubenswrapper[29612]: I0319 12:21:12.958956 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d19589e-6eec-4a65-9799-76abd82f3b20-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bp5fk\" (UID: \"2d19589e-6eec-4a65-9799-76abd82f3b20\") " pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" Mar 19 12:21:12.959667 master-0 kubenswrapper[29612]: I0319 12:21:12.959116 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqh4k\" (UniqueName: \"kubernetes.io/projected/2d19589e-6eec-4a65-9799-76abd82f3b20-kube-api-access-nqh4k\") pod \"cert-manager-webhook-6888856db4-bp5fk\" (UID: \"2d19589e-6eec-4a65-9799-76abd82f3b20\") " 
pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" Mar 19 12:21:13.024370 master-0 kubenswrapper[29612]: I0319 12:21:13.024323 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d19589e-6eec-4a65-9799-76abd82f3b20-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bp5fk\" (UID: \"2d19589e-6eec-4a65-9799-76abd82f3b20\") " pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" Mar 19 12:21:13.028657 master-0 kubenswrapper[29612]: I0319 12:21:13.025906 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqh4k\" (UniqueName: \"kubernetes.io/projected/2d19589e-6eec-4a65-9799-76abd82f3b20-kube-api-access-nqh4k\") pod \"cert-manager-webhook-6888856db4-bp5fk\" (UID: \"2d19589e-6eec-4a65-9799-76abd82f3b20\") " pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" Mar 19 12:21:13.140077 master-0 kubenswrapper[29612]: I0319 12:21:13.139965 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" Mar 19 12:21:13.844813 master-0 kubenswrapper[29612]: I0319 12:21:13.841098 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bp5fk"] Mar 19 12:21:14.600454 master-0 kubenswrapper[29612]: I0319 12:21:14.600392 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4"] Mar 19 12:21:14.601380 master-0 kubenswrapper[29612]: I0319 12:21:14.601352 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:14.614248 master-0 kubenswrapper[29612]: I0319 12:21:14.614184 29612 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 19 12:21:14.614473 master-0 kubenswrapper[29612]: I0319 12:21:14.614462 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 19 12:21:14.614609 master-0 kubenswrapper[29612]: I0319 12:21:14.614587 29612 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 19 12:21:14.614756 master-0 kubenswrapper[29612]: I0319 12:21:14.614735 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 19 12:21:14.693376 master-0 kubenswrapper[29612]: I0319 12:21:14.693303 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a6353c91-3e4d-4849-8628-a14850facf4d-webhook-cert\") pod \"metallb-operator-controller-manager-6f6b9794d7-h9td4\" (UID: \"a6353c91-3e4d-4849-8628-a14850facf4d\") " pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:14.693376 master-0 kubenswrapper[29612]: I0319 12:21:14.693368 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x28wx\" (UniqueName: \"kubernetes.io/projected/a6353c91-3e4d-4849-8628-a14850facf4d-kube-api-access-x28wx\") pod \"metallb-operator-controller-manager-6f6b9794d7-h9td4\" (UID: \"a6353c91-3e4d-4849-8628-a14850facf4d\") " pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:14.693654 master-0 kubenswrapper[29612]: I0319 12:21:14.693428 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a6353c91-3e4d-4849-8628-a14850facf4d-apiservice-cert\") pod \"metallb-operator-controller-manager-6f6b9794d7-h9td4\" (UID: \"a6353c91-3e4d-4849-8628-a14850facf4d\") " pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:14.748278 master-0 kubenswrapper[29612]: I0319 12:21:14.748184 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4"] Mar 19 12:21:14.795458 master-0 kubenswrapper[29612]: I0319 12:21:14.795161 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a6353c91-3e4d-4849-8628-a14850facf4d-webhook-cert\") pod \"metallb-operator-controller-manager-6f6b9794d7-h9td4\" (UID: \"a6353c91-3e4d-4849-8628-a14850facf4d\") " pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:14.795458 master-0 kubenswrapper[29612]: I0319 12:21:14.795431 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x28wx\" (UniqueName: \"kubernetes.io/projected/a6353c91-3e4d-4849-8628-a14850facf4d-kube-api-access-x28wx\") pod \"metallb-operator-controller-manager-6f6b9794d7-h9td4\" (UID: \"a6353c91-3e4d-4849-8628-a14850facf4d\") " pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:14.795918 master-0 kubenswrapper[29612]: I0319 12:21:14.795519 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a6353c91-3e4d-4849-8628-a14850facf4d-apiservice-cert\") pod \"metallb-operator-controller-manager-6f6b9794d7-h9td4\" (UID: \"a6353c91-3e4d-4849-8628-a14850facf4d\") " pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:14.807889 master-0 kubenswrapper[29612]: I0319 12:21:14.807534 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a6353c91-3e4d-4849-8628-a14850facf4d-webhook-cert\") pod \"metallb-operator-controller-manager-6f6b9794d7-h9td4\" (UID: \"a6353c91-3e4d-4849-8628-a14850facf4d\") " pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:14.807889 master-0 kubenswrapper[29612]: I0319 12:21:14.807702 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a6353c91-3e4d-4849-8628-a14850facf4d-apiservice-cert\") pod \"metallb-operator-controller-manager-6f6b9794d7-h9td4\" (UID: \"a6353c91-3e4d-4849-8628-a14850facf4d\") " pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:14.843833 master-0 kubenswrapper[29612]: I0319 12:21:14.842487 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" event={"ID":"2d19589e-6eec-4a65-9799-76abd82f3b20","Type":"ContainerStarted","Data":"d21cf3129fd3203b9aaca1ee5dada374402d9f9914e42896f58328bf2971d009"} Mar 19 12:21:14.848453 master-0 kubenswrapper[29612]: I0319 12:21:14.844841 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x28wx\" (UniqueName: \"kubernetes.io/projected/a6353c91-3e4d-4849-8628-a14850facf4d-kube-api-access-x28wx\") pod \"metallb-operator-controller-manager-6f6b9794d7-h9td4\" (UID: \"a6353c91-3e4d-4849-8628-a14850facf4d\") " pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:14.923229 master-0 kubenswrapper[29612]: I0319 12:21:14.921410 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:15.097313 master-0 kubenswrapper[29612]: I0319 12:21:15.096796 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bhb5z"] Mar 19 12:21:15.105087 master-0 kubenswrapper[29612]: I0319 12:21:15.105045 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" Mar 19 12:21:15.119756 master-0 kubenswrapper[29612]: I0319 12:21:15.119704 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bhb5z"] Mar 19 12:21:15.216463 master-0 kubenswrapper[29612]: I0319 12:21:15.216309 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afca29b3-d73a-4c1e-b610-7c9db42ffe99-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bhb5z\" (UID: \"afca29b3-d73a-4c1e-b610-7c9db42ffe99\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" Mar 19 12:21:15.223066 master-0 kubenswrapper[29612]: I0319 12:21:15.223026 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqzzf\" (UniqueName: \"kubernetes.io/projected/afca29b3-d73a-4c1e-b610-7c9db42ffe99-kube-api-access-rqzzf\") pod \"cert-manager-cainjector-5545bd876-bhb5z\" (UID: \"afca29b3-d73a-4c1e-b610-7c9db42ffe99\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" Mar 19 12:21:15.311689 master-0 kubenswrapper[29612]: I0319 12:21:15.311600 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4"] Mar 19 12:21:15.331671 master-0 kubenswrapper[29612]: I0319 12:21:15.329489 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqzzf\" 
(UniqueName: \"kubernetes.io/projected/afca29b3-d73a-4c1e-b610-7c9db42ffe99-kube-api-access-rqzzf\") pod \"cert-manager-cainjector-5545bd876-bhb5z\" (UID: \"afca29b3-d73a-4c1e-b610-7c9db42ffe99\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" Mar 19 12:21:15.331671 master-0 kubenswrapper[29612]: I0319 12:21:15.329668 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afca29b3-d73a-4c1e-b610-7c9db42ffe99-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bhb5z\" (UID: \"afca29b3-d73a-4c1e-b610-7c9db42ffe99\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" Mar 19 12:21:15.352317 master-0 kubenswrapper[29612]: I0319 12:21:15.352255 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp"] Mar 19 12:21:15.358694 master-0 kubenswrapper[29612]: I0319 12:21:15.353469 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.358694 master-0 kubenswrapper[29612]: I0319 12:21:15.357794 29612 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 12:21:15.367433 master-0 kubenswrapper[29612]: I0319 12:21:15.361968 29612 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 19 12:21:15.385283 master-0 kubenswrapper[29612]: I0319 12:21:15.385076 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp"] Mar 19 12:21:15.389409 master-0 kubenswrapper[29612]: I0319 12:21:15.389361 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/afca29b3-d73a-4c1e-b610-7c9db42ffe99-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-bhb5z\" (UID: \"afca29b3-d73a-4c1e-b610-7c9db42ffe99\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" Mar 19 12:21:15.402575 master-0 kubenswrapper[29612]: I0319 12:21:15.402528 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqzzf\" (UniqueName: \"kubernetes.io/projected/afca29b3-d73a-4c1e-b610-7c9db42ffe99-kube-api-access-rqzzf\") pod \"cert-manager-cainjector-5545bd876-bhb5z\" (UID: \"afca29b3-d73a-4c1e-b610-7c9db42ffe99\") " pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" Mar 19 12:21:15.437675 master-0 kubenswrapper[29612]: I0319 12:21:15.434510 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28a635ab-5a06-492e-9681-dfe7067424f4-webhook-cert\") pod \"metallb-operator-webhook-server-69f8686c44-wrfdp\" (UID: \"28a635ab-5a06-492e-9681-dfe7067424f4\") " 
pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.437675 master-0 kubenswrapper[29612]: I0319 12:21:15.434586 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28a635ab-5a06-492e-9681-dfe7067424f4-apiservice-cert\") pod \"metallb-operator-webhook-server-69f8686c44-wrfdp\" (UID: \"28a635ab-5a06-492e-9681-dfe7067424f4\") " pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.437675 master-0 kubenswrapper[29612]: I0319 12:21:15.434611 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqxvs\" (UniqueName: \"kubernetes.io/projected/28a635ab-5a06-492e-9681-dfe7067424f4-kube-api-access-rqxvs\") pod \"metallb-operator-webhook-server-69f8686c44-wrfdp\" (UID: \"28a635ab-5a06-492e-9681-dfe7067424f4\") " pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.451029 master-0 kubenswrapper[29612]: I0319 12:21:15.450957 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" Mar 19 12:21:15.537333 master-0 kubenswrapper[29612]: I0319 12:21:15.536619 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28a635ab-5a06-492e-9681-dfe7067424f4-webhook-cert\") pod \"metallb-operator-webhook-server-69f8686c44-wrfdp\" (UID: \"28a635ab-5a06-492e-9681-dfe7067424f4\") " pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.537333 master-0 kubenswrapper[29612]: I0319 12:21:15.536747 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28a635ab-5a06-492e-9681-dfe7067424f4-apiservice-cert\") pod \"metallb-operator-webhook-server-69f8686c44-wrfdp\" (UID: \"28a635ab-5a06-492e-9681-dfe7067424f4\") " pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.537333 master-0 kubenswrapper[29612]: I0319 12:21:15.536776 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqxvs\" (UniqueName: \"kubernetes.io/projected/28a635ab-5a06-492e-9681-dfe7067424f4-kube-api-access-rqxvs\") pod \"metallb-operator-webhook-server-69f8686c44-wrfdp\" (UID: \"28a635ab-5a06-492e-9681-dfe7067424f4\") " pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.549765 master-0 kubenswrapper[29612]: I0319 12:21:15.547396 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/28a635ab-5a06-492e-9681-dfe7067424f4-apiservice-cert\") pod \"metallb-operator-webhook-server-69f8686c44-wrfdp\" (UID: \"28a635ab-5a06-492e-9681-dfe7067424f4\") " pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.550811 master-0 kubenswrapper[29612]: I0319 12:21:15.550739 29612 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/28a635ab-5a06-492e-9681-dfe7067424f4-webhook-cert\") pod \"metallb-operator-webhook-server-69f8686c44-wrfdp\" (UID: \"28a635ab-5a06-492e-9681-dfe7067424f4\") " pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.603959 master-0 kubenswrapper[29612]: I0319 12:21:15.600950 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqxvs\" (UniqueName: \"kubernetes.io/projected/28a635ab-5a06-492e-9681-dfe7067424f4-kube-api-access-rqxvs\") pod \"metallb-operator-webhook-server-69f8686c44-wrfdp\" (UID: \"28a635ab-5a06-492e-9681-dfe7067424f4\") " pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.689668 master-0 kubenswrapper[29612]: I0319 12:21:15.686950 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:15.849569 master-0 kubenswrapper[29612]: I0319 12:21:15.849488 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" event={"ID":"a6353c91-3e4d-4849-8628-a14850facf4d","Type":"ContainerStarted","Data":"5b51a8086d3688bed8f610041f851963193f68896dcf570ce66c5f693d46f35c"} Mar 19 12:21:16.147757 master-0 kubenswrapper[29612]: I0319 12:21:16.144825 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-bhb5z"] Mar 19 12:21:16.165671 master-0 kubenswrapper[29612]: W0319 12:21:16.164891 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafca29b3_d73a_4c1e_b610_7c9db42ffe99.slice/crio-8e3a76a5f01bd1ba9a39fb18e173bb3f36fbd6235a305f68cc95a48c42e98b52 WatchSource:0}: Error finding container 8e3a76a5f01bd1ba9a39fb18e173bb3f36fbd6235a305f68cc95a48c42e98b52: Status 404 
returned error can't find the container with id 8e3a76a5f01bd1ba9a39fb18e173bb3f36fbd6235a305f68cc95a48c42e98b52 Mar 19 12:21:16.418791 master-0 kubenswrapper[29612]: I0319 12:21:16.413724 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp"] Mar 19 12:21:16.860448 master-0 kubenswrapper[29612]: I0319 12:21:16.860304 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" event={"ID":"afca29b3-d73a-4c1e-b610-7c9db42ffe99","Type":"ContainerStarted","Data":"8e3a76a5f01bd1ba9a39fb18e173bb3f36fbd6235a305f68cc95a48c42e98b52"} Mar 19 12:21:16.865650 master-0 kubenswrapper[29612]: I0319 12:21:16.863378 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" event={"ID":"28a635ab-5a06-492e-9681-dfe7067424f4","Type":"ContainerStarted","Data":"ffb0c2f289ad581a37c38c05cb602097718007e3751d68e1b7d8fda11c466725"} Mar 19 12:21:23.090170 master-0 kubenswrapper[29612]: I0319 12:21:23.090098 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-kcqh7"] Mar 19 12:21:23.092559 master-0 kubenswrapper[29612]: I0319 12:21:23.092483 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-kcqh7" Mar 19 12:21:23.135068 master-0 kubenswrapper[29612]: I0319 12:21:23.134996 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-kcqh7"] Mar 19 12:21:23.249284 master-0 kubenswrapper[29612]: I0319 12:21:23.249149 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4ed7d5a-c42c-49fd-96ec-d22b136af504-bound-sa-token\") pod \"cert-manager-545d4d4674-kcqh7\" (UID: \"d4ed7d5a-c42c-49fd-96ec-d22b136af504\") " pod="cert-manager/cert-manager-545d4d4674-kcqh7" Mar 19 12:21:23.249560 master-0 kubenswrapper[29612]: I0319 12:21:23.249329 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q698z\" (UniqueName: \"kubernetes.io/projected/d4ed7d5a-c42c-49fd-96ec-d22b136af504-kube-api-access-q698z\") pod \"cert-manager-545d4d4674-kcqh7\" (UID: \"d4ed7d5a-c42c-49fd-96ec-d22b136af504\") " pod="cert-manager/cert-manager-545d4d4674-kcqh7" Mar 19 12:21:23.351471 master-0 kubenswrapper[29612]: I0319 12:21:23.351290 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4ed7d5a-c42c-49fd-96ec-d22b136af504-bound-sa-token\") pod \"cert-manager-545d4d4674-kcqh7\" (UID: \"d4ed7d5a-c42c-49fd-96ec-d22b136af504\") " pod="cert-manager/cert-manager-545d4d4674-kcqh7" Mar 19 12:21:23.351471 master-0 kubenswrapper[29612]: I0319 12:21:23.351381 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q698z\" (UniqueName: \"kubernetes.io/projected/d4ed7d5a-c42c-49fd-96ec-d22b136af504-kube-api-access-q698z\") pod \"cert-manager-545d4d4674-kcqh7\" (UID: \"d4ed7d5a-c42c-49fd-96ec-d22b136af504\") " pod="cert-manager/cert-manager-545d4d4674-kcqh7" Mar 19 12:21:23.372821 master-0 
kubenswrapper[29612]: I0319 12:21:23.372738 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d4ed7d5a-c42c-49fd-96ec-d22b136af504-bound-sa-token\") pod \"cert-manager-545d4d4674-kcqh7\" (UID: \"d4ed7d5a-c42c-49fd-96ec-d22b136af504\") " pod="cert-manager/cert-manager-545d4d4674-kcqh7" Mar 19 12:21:23.375768 master-0 kubenswrapper[29612]: I0319 12:21:23.375730 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q698z\" (UniqueName: \"kubernetes.io/projected/d4ed7d5a-c42c-49fd-96ec-d22b136af504-kube-api-access-q698z\") pod \"cert-manager-545d4d4674-kcqh7\" (UID: \"d4ed7d5a-c42c-49fd-96ec-d22b136af504\") " pod="cert-manager/cert-manager-545d4d4674-kcqh7" Mar 19 12:21:23.416470 master-0 kubenswrapper[29612]: I0319 12:21:23.416396 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-kcqh7" Mar 19 12:21:28.723841 master-0 kubenswrapper[29612]: I0319 12:21:28.723714 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-zbdff"] Mar 19 12:21:28.725744 master-0 kubenswrapper[29612]: I0319 12:21:28.725718 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-zbdff" Mar 19 12:21:28.730658 master-0 kubenswrapper[29612]: I0319 12:21:28.728448 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 19 12:21:28.730658 master-0 kubenswrapper[29612]: I0319 12:21:28.729031 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 19 12:21:28.745972 master-0 kubenswrapper[29612]: I0319 12:21:28.745890 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-zbdff"] Mar 19 12:21:28.870815 master-0 kubenswrapper[29612]: I0319 12:21:28.870762 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl7sj\" (UniqueName: \"kubernetes.io/projected/9d5147e7-e913-4f3b-821f-a0ad31ddf449-kube-api-access-xl7sj\") pod \"obo-prometheus-operator-8ff7d675-zbdff\" (UID: \"9d5147e7-e913-4f3b-821f-a0ad31ddf449\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-zbdff" Mar 19 12:21:28.972796 master-0 kubenswrapper[29612]: I0319 12:21:28.972732 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl7sj\" (UniqueName: \"kubernetes.io/projected/9d5147e7-e913-4f3b-821f-a0ad31ddf449-kube-api-access-xl7sj\") pod \"obo-prometheus-operator-8ff7d675-zbdff\" (UID: \"9d5147e7-e913-4f3b-821f-a0ad31ddf449\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-zbdff" Mar 19 12:21:28.999602 master-0 kubenswrapper[29612]: I0319 12:21:28.999443 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl7sj\" (UniqueName: \"kubernetes.io/projected/9d5147e7-e913-4f3b-821f-a0ad31ddf449-kube-api-access-xl7sj\") pod \"obo-prometheus-operator-8ff7d675-zbdff\" (UID: \"9d5147e7-e913-4f3b-821f-a0ad31ddf449\") " 
pod="openshift-operators/obo-prometheus-operator-8ff7d675-zbdff" Mar 19 12:21:29.061462 master-0 kubenswrapper[29612]: I0319 12:21:29.061391 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-zbdff" Mar 19 12:21:29.243503 master-0 kubenswrapper[29612]: I0319 12:21:29.243430 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl"] Mar 19 12:21:29.246700 master-0 kubenswrapper[29612]: I0319 12:21:29.246623 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" Mar 19 12:21:29.249438 master-0 kubenswrapper[29612]: I0319 12:21:29.249323 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 19 12:21:29.253123 master-0 kubenswrapper[29612]: I0319 12:21:29.253002 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r"] Mar 19 12:21:29.254204 master-0 kubenswrapper[29612]: I0319 12:21:29.254180 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" Mar 19 12:21:29.301264 master-0 kubenswrapper[29612]: I0319 12:21:29.287342 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl"] Mar 19 12:21:29.324684 master-0 kubenswrapper[29612]: I0319 12:21:29.324617 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r"] Mar 19 12:21:29.381660 master-0 kubenswrapper[29612]: I0319 12:21:29.381569 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1c9058b-d7fe-4e66-b571-a525f2fe5917-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-vfj6r\" (UID: \"b1c9058b-d7fe-4e66-b571-a525f2fe5917\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" Mar 19 12:21:29.381660 master-0 kubenswrapper[29612]: I0319 12:21:29.381666 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f5e41b8-f2bf-4143-a518-aca6c46179f9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-k4cwl\" (UID: \"7f5e41b8-f2bf-4143-a518-aca6c46179f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" Mar 19 12:21:29.382075 master-0 kubenswrapper[29612]: I0319 12:21:29.381702 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f5e41b8-f2bf-4143-a518-aca6c46179f9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-k4cwl\" (UID: \"7f5e41b8-f2bf-4143-a518-aca6c46179f9\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" Mar 19 12:21:29.382075 master-0 kubenswrapper[29612]: I0319 12:21:29.381845 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1c9058b-d7fe-4e66-b571-a525f2fe5917-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-vfj6r\" (UID: \"b1c9058b-d7fe-4e66-b571-a525f2fe5917\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" Mar 19 12:21:29.494593 master-0 kubenswrapper[29612]: I0319 12:21:29.492957 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b1c9058b-d7fe-4e66-b571-a525f2fe5917-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-vfj6r\" (UID: \"b1c9058b-d7fe-4e66-b571-a525f2fe5917\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" Mar 19 12:21:29.494593 master-0 kubenswrapper[29612]: I0319 12:21:29.493044 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f5e41b8-f2bf-4143-a518-aca6c46179f9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-k4cwl\" (UID: \"7f5e41b8-f2bf-4143-a518-aca6c46179f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" Mar 19 12:21:29.494593 master-0 kubenswrapper[29612]: I0319 12:21:29.493071 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f5e41b8-f2bf-4143-a518-aca6c46179f9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-k4cwl\" (UID: \"7f5e41b8-f2bf-4143-a518-aca6c46179f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" Mar 19 12:21:29.494593 
master-0 kubenswrapper[29612]: I0319 12:21:29.493091 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1c9058b-d7fe-4e66-b571-a525f2fe5917-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-vfj6r\" (UID: \"b1c9058b-d7fe-4e66-b571-a525f2fe5917\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" Mar 19 12:21:29.500658 master-0 kubenswrapper[29612]: I0319 12:21:29.498732 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f5e41b8-f2bf-4143-a518-aca6c46179f9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-k4cwl\" (UID: \"7f5e41b8-f2bf-4143-a518-aca6c46179f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" Mar 19 12:21:29.500658 master-0 kubenswrapper[29612]: I0319 12:21:29.499563 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f5e41b8-f2bf-4143-a518-aca6c46179f9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-k4cwl\" (UID: \"7f5e41b8-f2bf-4143-a518-aca6c46179f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" Mar 19 12:21:29.500658 master-0 kubenswrapper[29612]: I0319 12:21:29.500098 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b1c9058b-d7fe-4e66-b571-a525f2fe5917-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-vfj6r\" (UID: \"b1c9058b-d7fe-4e66-b571-a525f2fe5917\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" Mar 19 12:21:29.503975 master-0 kubenswrapper[29612]: I0319 12:21:29.503889 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/b1c9058b-d7fe-4e66-b571-a525f2fe5917-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f956f878b-vfj6r\" (UID: \"b1c9058b-d7fe-4e66-b571-a525f2fe5917\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" Mar 19 12:21:29.625748 master-0 kubenswrapper[29612]: I0319 12:21:29.625155 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" Mar 19 12:21:29.633739 master-0 kubenswrapper[29612]: I0319 12:21:29.633692 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" Mar 19 12:21:29.685440 master-0 kubenswrapper[29612]: I0319 12:21:29.685380 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-bbzl5"] Mar 19 12:21:29.687732 master-0 kubenswrapper[29612]: I0319 12:21:29.687069 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" Mar 19 12:21:29.690383 master-0 kubenswrapper[29612]: I0319 12:21:29.689294 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 19 12:21:29.767783 master-0 kubenswrapper[29612]: I0319 12:21:29.766786 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-bbzl5"] Mar 19 12:21:29.853747 master-0 kubenswrapper[29612]: I0319 12:21:29.853559 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/04532843-615e-4709-a438-475612406846-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-bbzl5\" (UID: \"04532843-615e-4709-a438-475612406846\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" Mar 19 12:21:29.853747 master-0 kubenswrapper[29612]: I0319 12:21:29.853608 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2456\" (UniqueName: \"kubernetes.io/projected/04532843-615e-4709-a438-475612406846-kube-api-access-r2456\") pod \"observability-operator-6dd7dd855f-bbzl5\" (UID: \"04532843-615e-4709-a438-475612406846\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" Mar 19 12:21:29.959756 master-0 kubenswrapper[29612]: I0319 12:21:29.955433 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/04532843-615e-4709-a438-475612406846-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-bbzl5\" (UID: \"04532843-615e-4709-a438-475612406846\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" Mar 19 12:21:29.959756 master-0 kubenswrapper[29612]: I0319 12:21:29.955505 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r2456\" (UniqueName: \"kubernetes.io/projected/04532843-615e-4709-a438-475612406846-kube-api-access-r2456\") pod \"observability-operator-6dd7dd855f-bbzl5\" (UID: \"04532843-615e-4709-a438-475612406846\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" Mar 19 12:21:29.972280 master-0 kubenswrapper[29612]: I0319 12:21:29.972126 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/04532843-615e-4709-a438-475612406846-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-bbzl5\" (UID: \"04532843-615e-4709-a438-475612406846\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" Mar 19 12:21:29.978466 master-0 kubenswrapper[29612]: I0319 12:21:29.974183 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2456\" (UniqueName: \"kubernetes.io/projected/04532843-615e-4709-a438-475612406846-kube-api-access-r2456\") pod \"observability-operator-6dd7dd855f-bbzl5\" (UID: \"04532843-615e-4709-a438-475612406846\") " pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" Mar 19 12:21:30.008810 master-0 kubenswrapper[29612]: I0319 12:21:30.005248 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" Mar 19 12:21:30.051369 master-0 kubenswrapper[29612]: I0319 12:21:30.051329 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-zbdff"] Mar 19 12:21:30.196366 master-0 kubenswrapper[29612]: I0319 12:21:30.186531 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-548c78f8ff-4pwj7"] Mar 19 12:21:30.196366 master-0 kubenswrapper[29612]: I0319 12:21:30.188320 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.196366 master-0 kubenswrapper[29612]: I0319 12:21:30.193022 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 19 12:21:30.196366 master-0 kubenswrapper[29612]: I0319 12:21:30.193574 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-548c78f8ff-4pwj7"] Mar 19 12:21:30.354723 master-0 kubenswrapper[29612]: I0319 12:21:30.352799 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-kcqh7"] Mar 19 12:21:30.354723 master-0 kubenswrapper[29612]: I0319 12:21:30.352881 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r"] Mar 19 12:21:30.445793 master-0 kubenswrapper[29612]: I0319 12:21:30.445729 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl"] Mar 19 12:21:30.457931 master-0 kubenswrapper[29612]: I0319 12:21:30.455587 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-openshift-service-ca\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.457931 master-0 kubenswrapper[29612]: I0319 12:21:30.455655 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-webhook-cert\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.457931 
master-0 kubenswrapper[29612]: I0319 12:21:30.455697 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm4p5\" (UniqueName: \"kubernetes.io/projected/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-kube-api-access-rm4p5\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.457931 master-0 kubenswrapper[29612]: I0319 12:21:30.455712 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-apiservice-cert\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.561182 master-0 kubenswrapper[29612]: I0319 12:21:30.557080 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-openshift-service-ca\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.561182 master-0 kubenswrapper[29612]: I0319 12:21:30.557148 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-webhook-cert\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.561182 master-0 kubenswrapper[29612]: I0319 12:21:30.557200 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-apiservice-cert\") pod 
\"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.561182 master-0 kubenswrapper[29612]: I0319 12:21:30.557220 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm4p5\" (UniqueName: \"kubernetes.io/projected/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-kube-api-access-rm4p5\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.564807 master-0 kubenswrapper[29612]: I0319 12:21:30.561561 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-openshift-service-ca\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.568690 master-0 kubenswrapper[29612]: I0319 12:21:30.567434 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-apiservice-cert\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.583732 master-0 kubenswrapper[29612]: I0319 12:21:30.579354 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-webhook-cert\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.620740 master-0 kubenswrapper[29612]: I0319 12:21:30.611291 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm4p5\" 
(UniqueName: \"kubernetes.io/projected/d80ffc36-7e3e-41ab-862f-6f36b7bec9af-kube-api-access-rm4p5\") pod \"perses-operator-548c78f8ff-4pwj7\" (UID: \"d80ffc36-7e3e-41ab-862f-6f36b7bec9af\") " pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:30.688669 master-0 kubenswrapper[29612]: I0319 12:21:30.687242 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-bbzl5"] Mar 19 12:21:30.896647 master-0 kubenswrapper[29612]: I0319 12:21:30.896068 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:31.072659 master-0 kubenswrapper[29612]: I0319 12:21:31.071134 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" event={"ID":"28a635ab-5a06-492e-9681-dfe7067424f4","Type":"ContainerStarted","Data":"d9164569637a8fe2f8abcb2063914208184f5ff4825bb128d37682fbd34014aa"} Mar 19 12:21:31.072659 master-0 kubenswrapper[29612]: I0319 12:21:31.072055 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:31.079546 master-0 kubenswrapper[29612]: I0319 12:21:31.073894 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" event={"ID":"7f5e41b8-f2bf-4143-a518-aca6c46179f9","Type":"ContainerStarted","Data":"0f5cb42d08ad44fb45de6694e0e4cd5892524d166894a58689b163461418c006"} Mar 19 12:21:31.079546 master-0 kubenswrapper[29612]: I0319 12:21:31.075991 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" event={"ID":"04532843-615e-4709-a438-475612406846","Type":"ContainerStarted","Data":"975a5fe91cc99ea0054ed55ddc5e63b47aa9b599d48c59ecdd81f5bec4c04a17"} Mar 19 12:21:31.079546 master-0 
kubenswrapper[29612]: I0319 12:21:31.077957 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-kcqh7" event={"ID":"d4ed7d5a-c42c-49fd-96ec-d22b136af504","Type":"ContainerStarted","Data":"133dee5d4c9cb8846c3390fbb8633c7ee465ff274c7d5baceba80cf4fb8dead8"} Mar 19 12:21:31.079546 master-0 kubenswrapper[29612]: I0319 12:21:31.078290 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-kcqh7" event={"ID":"d4ed7d5a-c42c-49fd-96ec-d22b136af504","Type":"ContainerStarted","Data":"2376b0bf6874304820a438ac966f2fc97247816bd44d929ec55339715e51731a"} Mar 19 12:21:31.099653 master-0 kubenswrapper[29612]: I0319 12:21:31.099564 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" event={"ID":"a6353c91-3e4d-4849-8628-a14850facf4d","Type":"ContainerStarted","Data":"bf79ee6678dcecffc71b37022f293f25c0f71826b241ffd151719bd733594350"} Mar 19 12:21:31.102550 master-0 kubenswrapper[29612]: I0319 12:21:31.102163 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:21:31.117089 master-0 kubenswrapper[29612]: I0319 12:21:31.117009 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" event={"ID":"afca29b3-d73a-4c1e-b610-7c9db42ffe99","Type":"ContainerStarted","Data":"4bcc165195d159feb7d161f4da6f45d5b5a085532494e00e6fa73ab2ba368304"} Mar 19 12:21:31.122424 master-0 kubenswrapper[29612]: I0319 12:21:31.122355 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" event={"ID":"2d19589e-6eec-4a65-9799-76abd82f3b20","Type":"ContainerStarted","Data":"98b027947268dd6840c65ffafd7ebfd2e60123ad08ed5c10fff6ac89756d89e5"} Mar 19 12:21:31.123074 master-0 kubenswrapper[29612]: I0319 12:21:31.122728 29612 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" Mar 19 12:21:31.124724 master-0 kubenswrapper[29612]: I0319 12:21:31.124215 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" event={"ID":"b1c9058b-d7fe-4e66-b571-a525f2fe5917","Type":"ContainerStarted","Data":"a47e2e359ee6795da3ff15ac77f14f697fe908faaa495ed92d59d3d47b926d8c"} Mar 19 12:21:31.126742 master-0 kubenswrapper[29612]: I0319 12:21:31.125983 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-zbdff" event={"ID":"9d5147e7-e913-4f3b-821f-a0ad31ddf449","Type":"ContainerStarted","Data":"127971b2d6e36781b849b5bd857e3b5d79de6241d3aa5cb49362eaeed4edf157"} Mar 19 12:21:31.166750 master-0 kubenswrapper[29612]: I0319 12:21:31.155592 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" podStartSLOduration=2.67732707 podStartE2EDuration="16.155573852s" podCreationTimestamp="2026-03-19 12:21:15 +0000 UTC" firstStartedPulling="2026-03-19 12:21:16.423219264 +0000 UTC m=+683.737388295" lastFinishedPulling="2026-03-19 12:21:29.901466066 +0000 UTC m=+697.215635077" observedRunningTime="2026-03-19 12:21:31.095466325 +0000 UTC m=+698.409635346" watchObservedRunningTime="2026-03-19 12:21:31.155573852 +0000 UTC m=+698.469742863" Mar 19 12:21:31.166750 master-0 kubenswrapper[29612]: I0319 12:21:31.156304 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-kcqh7" podStartSLOduration=8.156298603 podStartE2EDuration="8.156298603s" podCreationTimestamp="2026-03-19 12:21:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:21:31.119457516 +0000 UTC m=+698.433626537" 
watchObservedRunningTime="2026-03-19 12:21:31.156298603 +0000 UTC m=+698.470467624" Mar 19 12:21:31.192156 master-0 kubenswrapper[29612]: I0319 12:21:31.186013 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" podStartSLOduration=2.936235903 podStartE2EDuration="17.185995936s" podCreationTimestamp="2026-03-19 12:21:14 +0000 UTC" firstStartedPulling="2026-03-19 12:21:15.313863009 +0000 UTC m=+682.628032030" lastFinishedPulling="2026-03-19 12:21:29.563623042 +0000 UTC m=+696.877792063" observedRunningTime="2026-03-19 12:21:31.14425226 +0000 UTC m=+698.458421281" watchObservedRunningTime="2026-03-19 12:21:31.185995936 +0000 UTC m=+698.500164957" Mar 19 12:21:31.192156 master-0 kubenswrapper[29612]: I0319 12:21:31.188147 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-bhb5z" podStartSLOduration=3.588396864 podStartE2EDuration="17.188129486s" podCreationTimestamp="2026-03-19 12:21:14 +0000 UTC" firstStartedPulling="2026-03-19 12:21:16.16742152 +0000 UTC m=+683.481590541" lastFinishedPulling="2026-03-19 12:21:29.767154142 +0000 UTC m=+697.081323163" observedRunningTime="2026-03-19 12:21:31.171267708 +0000 UTC m=+698.485436729" watchObservedRunningTime="2026-03-19 12:21:31.188129486 +0000 UTC m=+698.502298507" Mar 19 12:21:31.206047 master-0 kubenswrapper[29612]: I0319 12:21:31.197236 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" podStartSLOduration=3.177039704 podStartE2EDuration="19.197214364s" podCreationTimestamp="2026-03-19 12:21:12 +0000 UTC" firstStartedPulling="2026-03-19 12:21:13.847116185 +0000 UTC m=+681.161285206" lastFinishedPulling="2026-03-19 12:21:29.867290845 +0000 UTC m=+697.181459866" observedRunningTime="2026-03-19 12:21:31.191659867 +0000 UTC m=+698.505828898" 
watchObservedRunningTime="2026-03-19 12:21:31.197214364 +0000 UTC m=+698.511383395" Mar 19 12:21:31.371124 master-0 kubenswrapper[29612]: W0319 12:21:31.371072 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd80ffc36_7e3e_41ab_862f_6f36b7bec9af.slice/crio-a9563ed99f5e1af6fcb25fb65f8906c86b4c260013e53098f01774eb7d440411 WatchSource:0}: Error finding container a9563ed99f5e1af6fcb25fb65f8906c86b4c260013e53098f01774eb7d440411: Status 404 returned error can't find the container with id a9563ed99f5e1af6fcb25fb65f8906c86b4c260013e53098f01774eb7d440411 Mar 19 12:21:31.379560 master-0 kubenswrapper[29612]: I0319 12:21:31.379509 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-548c78f8ff-4pwj7"] Mar 19 12:21:32.146662 master-0 kubenswrapper[29612]: I0319 12:21:32.145921 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" event={"ID":"d80ffc36-7e3e-41ab-862f-6f36b7bec9af","Type":"ContainerStarted","Data":"a9563ed99f5e1af6fcb25fb65f8906c86b4c260013e53098f01774eb7d440411"} Mar 19 12:21:38.146991 master-0 kubenswrapper[29612]: I0319 12:21:38.146728 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-bp5fk" Mar 19 12:21:40.221948 master-0 kubenswrapper[29612]: I0319 12:21:40.221869 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" event={"ID":"b1c9058b-d7fe-4e66-b571-a525f2fe5917","Type":"ContainerStarted","Data":"eef50ec90e2ea06ea4df7032b59d33361e6a492a3c5bf9b47c7d7927495234a2"} Mar 19 12:21:40.233353 master-0 kubenswrapper[29612]: I0319 12:21:40.233296 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-zbdff" 
event={"ID":"9d5147e7-e913-4f3b-821f-a0ad31ddf449","Type":"ContainerStarted","Data":"1e8131d6ceed204f40cadca490bf0c571b9b5c63afe9d6ac7fdc17353de7100d"} Mar 19 12:21:40.253581 master-0 kubenswrapper[29612]: I0319 12:21:40.252732 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" event={"ID":"04532843-615e-4709-a438-475612406846","Type":"ContainerStarted","Data":"757bf137a8860f984b36d9b0c16bfd05c51b6873c9c0d326522cede85e126d0d"} Mar 19 12:21:40.257683 master-0 kubenswrapper[29612]: I0319 12:21:40.254359 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" Mar 19 12:21:40.257683 master-0 kubenswrapper[29612]: I0319 12:21:40.255192 29612 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-bbzl5 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.128.0.132:8081/healthz\": dial tcp 10.128.0.132:8081: connect: connection refused" start-of-body= Mar 19 12:21:40.257683 master-0 kubenswrapper[29612]: I0319 12:21:40.255232 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" podUID="04532843-615e-4709-a438-475612406846" containerName="operator" probeResult="failure" output="Get \"http://10.128.0.132:8081/healthz\": dial tcp 10.128.0.132:8081: connect: connection refused" Mar 19 12:21:40.268683 master-0 kubenswrapper[29612]: I0319 12:21:40.263457 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" event={"ID":"7f5e41b8-f2bf-4143-a518-aca6c46179f9","Type":"ContainerStarted","Data":"fe3a929fe737f8a8a4823c26008df7a4ceb0f338bb461c3b2c4390282806cc5d"} Mar 19 12:21:40.270470 master-0 kubenswrapper[29612]: I0319 12:21:40.270413 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" event={"ID":"d80ffc36-7e3e-41ab-862f-6f36b7bec9af","Type":"ContainerStarted","Data":"b0f855b22f9aec0456b802eb9aeac3f8107cd25a341cd8f1fcc432e0de8c6da7"} Mar 19 12:21:40.275699 master-0 kubenswrapper[29612]: I0319 12:21:40.271329 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:21:40.288665 master-0 kubenswrapper[29612]: I0319 12:21:40.286568 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-vfj6r" podStartSLOduration=1.9432028300000002 podStartE2EDuration="11.286531555s" podCreationTimestamp="2026-03-19 12:21:29 +0000 UTC" firstStartedPulling="2026-03-19 12:21:30.390501044 +0000 UTC m=+697.704670065" lastFinishedPulling="2026-03-19 12:21:39.733829759 +0000 UTC m=+707.047998790" observedRunningTime="2026-03-19 12:21:40.268116352 +0000 UTC m=+707.582285363" watchObservedRunningTime="2026-03-19 12:21:40.286531555 +0000 UTC m=+707.600700576" Mar 19 12:21:40.358331 master-0 kubenswrapper[29612]: I0319 12:21:40.341485 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" podStartSLOduration=2.278534564 podStartE2EDuration="11.341458455s" podCreationTimestamp="2026-03-19 12:21:29 +0000 UTC" firstStartedPulling="2026-03-19 12:21:30.718600732 +0000 UTC m=+698.032769753" lastFinishedPulling="2026-03-19 12:21:39.781524623 +0000 UTC m=+707.095693644" observedRunningTime="2026-03-19 12:21:40.32544419 +0000 UTC m=+707.639613211" watchObservedRunningTime="2026-03-19 12:21:40.341458455 +0000 UTC m=+707.655627476" Mar 19 12:21:40.509748 master-0 kubenswrapper[29612]: I0319 12:21:40.495066 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956f878b-k4cwl" 
podStartSLOduration=2.204429928 podStartE2EDuration="11.495034825s" podCreationTimestamp="2026-03-19 12:21:29 +0000 UTC" firstStartedPulling="2026-03-19 12:21:30.463371574 +0000 UTC m=+697.777540595" lastFinishedPulling="2026-03-19 12:21:39.753976471 +0000 UTC m=+707.068145492" observedRunningTime="2026-03-19 12:21:40.47723343 +0000 UTC m=+707.791402451" watchObservedRunningTime="2026-03-19 12:21:40.495034825 +0000 UTC m=+707.809203846" Mar 19 12:21:40.513788 master-0 kubenswrapper[29612]: I0319 12:21:40.513599 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-zbdff" podStartSLOduration=2.904188581 podStartE2EDuration="12.513558841s" podCreationTimestamp="2026-03-19 12:21:28 +0000 UTC" firstStartedPulling="2026-03-19 12:21:30.124223792 +0000 UTC m=+697.438392803" lastFinishedPulling="2026-03-19 12:21:39.733594042 +0000 UTC m=+707.047763063" observedRunningTime="2026-03-19 12:21:40.404262237 +0000 UTC m=+707.718431258" watchObservedRunningTime="2026-03-19 12:21:40.513558841 +0000 UTC m=+707.827727862" Mar 19 12:21:41.280457 master-0 kubenswrapper[29612]: I0319 12:21:41.280390 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-bbzl5" Mar 19 12:21:41.307080 master-0 kubenswrapper[29612]: I0319 12:21:41.306975 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" podStartSLOduration=2.9476562250000002 podStartE2EDuration="11.306950733s" podCreationTimestamp="2026-03-19 12:21:30 +0000 UTC" firstStartedPulling="2026-03-19 12:21:31.373930693 +0000 UTC m=+698.688099714" lastFinishedPulling="2026-03-19 12:21:39.733225201 +0000 UTC m=+707.047394222" observedRunningTime="2026-03-19 12:21:40.554775562 +0000 UTC m=+707.868944583" watchObservedRunningTime="2026-03-19 12:21:41.306950733 +0000 UTC m=+708.621119754" Mar 19 12:21:45.695278 
master-0 kubenswrapper[29612]: I0319 12:21:45.695226 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-69f8686c44-wrfdp" Mar 19 12:21:50.900430 master-0 kubenswrapper[29612]: I0319 12:21:50.900382 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-548c78f8ff-4pwj7" Mar 19 12:22:04.924171 master-0 kubenswrapper[29612]: I0319 12:22:04.924125 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6f6b9794d7-h9td4" Mar 19 12:22:13.630188 master-0 kubenswrapper[29612]: I0319 12:22:13.630117 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg"] Mar 19 12:22:13.631572 master-0 kubenswrapper[29612]: I0319 12:22:13.631543 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" Mar 19 12:22:13.639662 master-0 kubenswrapper[29612]: I0319 12:22:13.639588 29612 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 19 12:22:13.654108 master-0 kubenswrapper[29612]: I0319 12:22:13.654039 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-klcpp"] Mar 19 12:22:13.663521 master-0 kubenswrapper[29612]: I0319 12:22:13.663129 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg"] Mar 19 12:22:13.663521 master-0 kubenswrapper[29612]: I0319 12:22:13.663312 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.665945 master-0 kubenswrapper[29612]: I0319 12:22:13.665715 29612 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 19 12:22:13.665945 master-0 kubenswrapper[29612]: I0319 12:22:13.665857 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 19 12:22:13.702743 master-0 kubenswrapper[29612]: I0319 12:22:13.699934 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0c5e02e-a5c3-485f-8beb-eda8e58c0eba-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pt6xg\" (UID: \"c0c5e02e-a5c3-485f-8beb-eda8e58c0eba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" Mar 19 12:22:13.702967 master-0 kubenswrapper[29612]: I0319 12:22:13.702826 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/931bbab5-c031-4583-84f0-8b8e236df593-frr-startup\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.702967 master-0 kubenswrapper[29612]: I0319 12:22:13.702887 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvxt4\" (UniqueName: \"kubernetes.io/projected/931bbab5-c031-4583-84f0-8b8e236df593-kube-api-access-kvxt4\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.702967 master-0 kubenswrapper[29612]: I0319 12:22:13.702953 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw86b\" (UniqueName: \"kubernetes.io/projected/c0c5e02e-a5c3-485f-8beb-eda8e58c0eba-kube-api-access-pw86b\") pod \"frr-k8s-webhook-server-bcc4b6f68-pt6xg\" 
(UID: \"c0c5e02e-a5c3-485f-8beb-eda8e58c0eba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" Mar 19 12:22:13.703073 master-0 kubenswrapper[29612]: I0319 12:22:13.702980 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-metrics\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.703073 master-0 kubenswrapper[29612]: I0319 12:22:13.703023 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-frr-conf\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.703195 master-0 kubenswrapper[29612]: I0319 12:22:13.703177 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-frr-sockets\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.703258 master-0 kubenswrapper[29612]: I0319 12:22:13.703217 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931bbab5-c031-4583-84f0-8b8e236df593-metrics-certs\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.703321 master-0 kubenswrapper[29612]: I0319 12:22:13.703301 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-reloader\") pod \"frr-k8s-klcpp\" (UID: 
\"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.805895 master-0 kubenswrapper[29612]: I0319 12:22:13.804283 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-reloader\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.805895 master-0 kubenswrapper[29612]: I0319 12:22:13.804361 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0c5e02e-a5c3-485f-8beb-eda8e58c0eba-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pt6xg\" (UID: \"c0c5e02e-a5c3-485f-8beb-eda8e58c0eba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" Mar 19 12:22:13.805895 master-0 kubenswrapper[29612]: I0319 12:22:13.804582 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/931bbab5-c031-4583-84f0-8b8e236df593-frr-startup\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.805895 master-0 kubenswrapper[29612]: I0319 12:22:13.804619 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvxt4\" (UniqueName: \"kubernetes.io/projected/931bbab5-c031-4583-84f0-8b8e236df593-kube-api-access-kvxt4\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.805895 master-0 kubenswrapper[29612]: I0319 12:22:13.804677 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw86b\" (UniqueName: \"kubernetes.io/projected/c0c5e02e-a5c3-485f-8beb-eda8e58c0eba-kube-api-access-pw86b\") pod \"frr-k8s-webhook-server-bcc4b6f68-pt6xg\" (UID: \"c0c5e02e-a5c3-485f-8beb-eda8e58c0eba\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" Mar 19 12:22:13.805895 master-0 kubenswrapper[29612]: I0319 12:22:13.804704 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-metrics\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.805895 master-0 kubenswrapper[29612]: I0319 12:22:13.804736 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-frr-conf\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.805895 master-0 kubenswrapper[29612]: I0319 12:22:13.804808 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-frr-sockets\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.805895 master-0 kubenswrapper[29612]: I0319 12:22:13.804835 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931bbab5-c031-4583-84f0-8b8e236df593-metrics-certs\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.805895 master-0 kubenswrapper[29612]: I0319 12:22:13.804979 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-reloader\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.806450 master-0 kubenswrapper[29612]: I0319 12:22:13.806004 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-frr-conf\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.806450 master-0 kubenswrapper[29612]: I0319 12:22:13.806339 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/931bbab5-c031-4583-84f0-8b8e236df593-frr-startup\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.806450 master-0 kubenswrapper[29612]: I0319 12:22:13.806385 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-metrics\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.808480 master-0 kubenswrapper[29612]: I0319 12:22:13.806773 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/931bbab5-c031-4583-84f0-8b8e236df593-frr-sockets\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.810754 master-0 kubenswrapper[29612]: I0319 12:22:13.808862 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/931bbab5-c031-4583-84f0-8b8e236df593-metrics-certs\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.812204 master-0 kubenswrapper[29612]: I0319 12:22:13.812170 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0c5e02e-a5c3-485f-8beb-eda8e58c0eba-cert\") pod 
\"frr-k8s-webhook-server-bcc4b6f68-pt6xg\" (UID: \"c0c5e02e-a5c3-485f-8beb-eda8e58c0eba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" Mar 19 12:22:13.847258 master-0 kubenswrapper[29612]: I0319 12:22:13.847191 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw86b\" (UniqueName: \"kubernetes.io/projected/c0c5e02e-a5c3-485f-8beb-eda8e58c0eba-kube-api-access-pw86b\") pod \"frr-k8s-webhook-server-bcc4b6f68-pt6xg\" (UID: \"c0c5e02e-a5c3-485f-8beb-eda8e58c0eba\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" Mar 19 12:22:13.848442 master-0 kubenswrapper[29612]: I0319 12:22:13.848385 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvxt4\" (UniqueName: \"kubernetes.io/projected/931bbab5-c031-4583-84f0-8b8e236df593-kube-api-access-kvxt4\") pod \"frr-k8s-klcpp\" (UID: \"931bbab5-c031-4583-84f0-8b8e236df593\") " pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:13.899406 master-0 kubenswrapper[29612]: I0319 12:22:13.899321 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8jwqz"] Mar 19 12:22:13.903187 master-0 kubenswrapper[29612]: I0319 12:22:13.903146 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-8jwqz" Mar 19 12:22:13.905613 master-0 kubenswrapper[29612]: I0319 12:22:13.905564 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-memberlist\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:13.905732 master-0 kubenswrapper[29612]: I0319 12:22:13.905616 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-metrics-certs\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:13.906031 master-0 kubenswrapper[29612]: I0319 12:22:13.905975 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx67f\" (UniqueName: \"kubernetes.io/projected/3a2d92a4-7840-44a5-964f-afaf6afa67fb-kube-api-access-kx67f\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:13.906123 master-0 kubenswrapper[29612]: I0319 12:22:13.906046 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3a2d92a4-7840-44a5-964f-afaf6afa67fb-metallb-excludel2\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:13.906715 master-0 kubenswrapper[29612]: I0319 12:22:13.906682 29612 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 19 12:22:13.907117 master-0 kubenswrapper[29612]: I0319 12:22:13.907062 29612 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/controller-7bb4cc7c98-ndv4z"] Mar 19 12:22:13.907604 master-0 kubenswrapper[29612]: I0319 12:22:13.907570 29612 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 19 12:22:13.908380 master-0 kubenswrapper[29612]: I0319 12:22:13.908347 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-ndv4z" Mar 19 12:22:13.909718 master-0 kubenswrapper[29612]: I0319 12:22:13.909677 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 19 12:22:13.910306 master-0 kubenswrapper[29612]: I0319 12:22:13.910274 29612 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 19 12:22:13.948488 master-0 kubenswrapper[29612]: I0319 12:22:13.948411 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" Mar 19 12:22:13.961552 master-0 kubenswrapper[29612]: I0319 12:22:13.961501 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-ndv4z"] Mar 19 12:22:13.988617 master-0 kubenswrapper[29612]: I0319 12:22:13.988568 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-klcpp" Mar 19 12:22:14.007456 master-0 kubenswrapper[29612]: I0319 12:22:14.007381 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42892640-180b-417d-bae0-e7823b91ec40-cert\") pod \"controller-7bb4cc7c98-ndv4z\" (UID: \"42892640-180b-417d-bae0-e7823b91ec40\") " pod="metallb-system/controller-7bb4cc7c98-ndv4z" Mar 19 12:22:14.007675 master-0 kubenswrapper[29612]: I0319 12:22:14.007475 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q46tl\" (UniqueName: \"kubernetes.io/projected/42892640-180b-417d-bae0-e7823b91ec40-kube-api-access-q46tl\") pod \"controller-7bb4cc7c98-ndv4z\" (UID: \"42892640-180b-417d-bae0-e7823b91ec40\") " pod="metallb-system/controller-7bb4cc7c98-ndv4z" Mar 19 12:22:14.007675 master-0 kubenswrapper[29612]: I0319 12:22:14.007517 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx67f\" (UniqueName: \"kubernetes.io/projected/3a2d92a4-7840-44a5-964f-afaf6afa67fb-kube-api-access-kx67f\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:14.007675 master-0 kubenswrapper[29612]: I0319 12:22:14.007546 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3a2d92a4-7840-44a5-964f-afaf6afa67fb-metallb-excludel2\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:14.007901 master-0 kubenswrapper[29612]: I0319 12:22:14.007744 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-memberlist\") pod \"speaker-8jwqz\" (UID: 
\"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:14.007901 master-0 kubenswrapper[29612]: I0319 12:22:14.007780 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-metrics-certs\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:14.007901 master-0 kubenswrapper[29612]: I0319 12:22:14.007870 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42892640-180b-417d-bae0-e7823b91ec40-metrics-certs\") pod \"controller-7bb4cc7c98-ndv4z\" (UID: \"42892640-180b-417d-bae0-e7823b91ec40\") " pod="metallb-system/controller-7bb4cc7c98-ndv4z" Mar 19 12:22:14.008318 master-0 kubenswrapper[29612]: E0319 12:22:14.008279 29612 secret.go:189] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 19 12:22:14.008422 master-0 kubenswrapper[29612]: E0319 12:22:14.008365 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-metrics-certs podName:3a2d92a4-7840-44a5-964f-afaf6afa67fb nodeName:}" failed. No retries permitted until 2026-03-19 12:22:14.50834612 +0000 UTC m=+741.822515141 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-metrics-certs") pod "speaker-8jwqz" (UID: "3a2d92a4-7840-44a5-964f-afaf6afa67fb") : secret "speaker-certs-secret" not found Mar 19 12:22:14.008422 master-0 kubenswrapper[29612]: E0319 12:22:14.008292 29612 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 12:22:14.008548 master-0 kubenswrapper[29612]: E0319 12:22:14.008442 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-memberlist podName:3a2d92a4-7840-44a5-964f-afaf6afa67fb nodeName:}" failed. No retries permitted until 2026-03-19 12:22:14.508421132 +0000 UTC m=+741.822590233 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-memberlist") pod "speaker-8jwqz" (UID: "3a2d92a4-7840-44a5-964f-afaf6afa67fb") : secret "metallb-memberlist" not found Mar 19 12:22:14.008945 master-0 kubenswrapper[29612]: I0319 12:22:14.008898 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3a2d92a4-7840-44a5-964f-afaf6afa67fb-metallb-excludel2\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:14.068539 master-0 kubenswrapper[29612]: I0319 12:22:14.068490 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx67f\" (UniqueName: \"kubernetes.io/projected/3a2d92a4-7840-44a5-964f-afaf6afa67fb-kube-api-access-kx67f\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:14.110928 master-0 kubenswrapper[29612]: I0319 12:22:14.109052 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q46tl\" (UniqueName: \"kubernetes.io/projected/42892640-180b-417d-bae0-e7823b91ec40-kube-api-access-q46tl\") pod \"controller-7bb4cc7c98-ndv4z\" (UID: \"42892640-180b-417d-bae0-e7823b91ec40\") " pod="metallb-system/controller-7bb4cc7c98-ndv4z" Mar 19 12:22:14.110928 master-0 kubenswrapper[29612]: I0319 12:22:14.109248 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42892640-180b-417d-bae0-e7823b91ec40-metrics-certs\") pod \"controller-7bb4cc7c98-ndv4z\" (UID: \"42892640-180b-417d-bae0-e7823b91ec40\") " pod="metallb-system/controller-7bb4cc7c98-ndv4z" Mar 19 12:22:14.110928 master-0 kubenswrapper[29612]: I0319 12:22:14.109292 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42892640-180b-417d-bae0-e7823b91ec40-cert\") pod \"controller-7bb4cc7c98-ndv4z\" (UID: \"42892640-180b-417d-bae0-e7823b91ec40\") " pod="metallb-system/controller-7bb4cc7c98-ndv4z" Mar 19 12:22:14.112621 master-0 kubenswrapper[29612]: I0319 12:22:14.112233 29612 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 12:22:14.120110 master-0 kubenswrapper[29612]: I0319 12:22:14.120058 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42892640-180b-417d-bae0-e7823b91ec40-metrics-certs\") pod \"controller-7bb4cc7c98-ndv4z\" (UID: \"42892640-180b-417d-bae0-e7823b91ec40\") " pod="metallb-system/controller-7bb4cc7c98-ndv4z" Mar 19 12:22:14.123068 master-0 kubenswrapper[29612]: I0319 12:22:14.123031 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42892640-180b-417d-bae0-e7823b91ec40-cert\") pod \"controller-7bb4cc7c98-ndv4z\" (UID: \"42892640-180b-417d-bae0-e7823b91ec40\") " pod="metallb-system/controller-7bb4cc7c98-ndv4z" 
Mar 19 12:22:14.133221 master-0 kubenswrapper[29612]: I0319 12:22:14.132744 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q46tl\" (UniqueName: \"kubernetes.io/projected/42892640-180b-417d-bae0-e7823b91ec40-kube-api-access-q46tl\") pod \"controller-7bb4cc7c98-ndv4z\" (UID: \"42892640-180b-417d-bae0-e7823b91ec40\") " pod="metallb-system/controller-7bb4cc7c98-ndv4z" Mar 19 12:22:14.227112 master-0 kubenswrapper[29612]: I0319 12:22:14.227054 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-ndv4z" Mar 19 12:22:14.413612 master-0 kubenswrapper[29612]: I0319 12:22:14.413376 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg"] Mar 19 12:22:14.521082 master-0 kubenswrapper[29612]: I0319 12:22:14.521013 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-memberlist\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:14.521294 master-0 kubenswrapper[29612]: I0319 12:22:14.521110 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-metrics-certs\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:14.521836 master-0 kubenswrapper[29612]: E0319 12:22:14.521563 29612 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 12:22:14.521836 master-0 kubenswrapper[29612]: E0319 12:22:14.521671 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-memberlist podName:3a2d92a4-7840-44a5-964f-afaf6afa67fb 
nodeName:}" failed. No retries permitted until 2026-03-19 12:22:15.521647767 +0000 UTC m=+742.835816808 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-memberlist") pod "speaker-8jwqz" (UID: "3a2d92a4-7840-44a5-964f-afaf6afa67fb") : secret "metallb-memberlist" not found Mar 19 12:22:14.524863 master-0 kubenswrapper[29612]: I0319 12:22:14.524834 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-metrics-certs\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:14.556716 master-0 kubenswrapper[29612]: I0319 12:22:14.556586 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klcpp" event={"ID":"931bbab5-c031-4583-84f0-8b8e236df593","Type":"ContainerStarted","Data":"4f3fe2d9caa0fe0b38f236b9d3d6589b923840bff39a57f3703d85a41b7ec91a"} Mar 19 12:22:14.558224 master-0 kubenswrapper[29612]: I0319 12:22:14.558166 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" event={"ID":"c0c5e02e-a5c3-485f-8beb-eda8e58c0eba","Type":"ContainerStarted","Data":"68c1f1074e86f933eb59b937858d29d90b984a7ec8b656c075d60f5fba864b00"} Mar 19 12:22:14.671185 master-0 kubenswrapper[29612]: I0319 12:22:14.671129 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-ndv4z"] Mar 19 12:22:14.673275 master-0 kubenswrapper[29612]: W0319 12:22:14.673218 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42892640_180b_417d_bae0_e7823b91ec40.slice/crio-26132327ca89d7353bf70e963a3e6b2046e9846e1229ba022d2ba0a3d5707ba2 WatchSource:0}: Error finding container 
26132327ca89d7353bf70e963a3e6b2046e9846e1229ba022d2ba0a3d5707ba2: Status 404 returned error can't find the container with id 26132327ca89d7353bf70e963a3e6b2046e9846e1229ba022d2ba0a3d5707ba2 Mar 19 12:22:15.156037 master-0 kubenswrapper[29612]: I0319 12:22:15.155936 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d"] Mar 19 12:22:15.158044 master-0 kubenswrapper[29612]: I0319 12:22:15.157983 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d" Mar 19 12:22:15.165482 master-0 kubenswrapper[29612]: I0319 12:22:15.165411 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w"] Mar 19 12:22:15.167187 master-0 kubenswrapper[29612]: I0319 12:22:15.167132 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" Mar 19 12:22:15.169361 master-0 kubenswrapper[29612]: I0319 12:22:15.169317 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 19 12:22:15.174197 master-0 kubenswrapper[29612]: I0319 12:22:15.174152 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d"] Mar 19 12:22:15.225503 master-0 kubenswrapper[29612]: I0319 12:22:15.225454 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w"] Mar 19 12:22:15.232361 master-0 kubenswrapper[29612]: I0319 12:22:15.232318 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxczw\" (UniqueName: \"kubernetes.io/projected/e94c395a-afd4-4b0e-b796-b09cb0eb6f51-kube-api-access-lxczw\") pod \"nmstate-webhook-5f558f5558-6qz5w\" (UID: \"e94c395a-afd4-4b0e-b796-b09cb0eb6f51\") " 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" Mar 19 12:22:15.232560 master-0 kubenswrapper[29612]: I0319 12:22:15.232395 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e94c395a-afd4-4b0e-b796-b09cb0eb6f51-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6qz5w\" (UID: \"e94c395a-afd4-4b0e-b796-b09cb0eb6f51\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" Mar 19 12:22:15.232560 master-0 kubenswrapper[29612]: I0319 12:22:15.232437 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjg5d\" (UniqueName: \"kubernetes.io/projected/ad53604e-2d6b-4169-9a4f-e4c878098a24-kube-api-access-zjg5d\") pod \"nmstate-metrics-9b8c8685d-v9s4d\" (UID: \"ad53604e-2d6b-4169-9a4f-e4c878098a24\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d" Mar 19 12:22:15.237062 master-0 kubenswrapper[29612]: I0319 12:22:15.236998 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-xjs2f"] Mar 19 12:22:15.238249 master-0 kubenswrapper[29612]: I0319 12:22:15.238217 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.334696 master-0 kubenswrapper[29612]: I0319 12:22:15.334608 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e94c395a-afd4-4b0e-b796-b09cb0eb6f51-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6qz5w\" (UID: \"e94c395a-afd4-4b0e-b796-b09cb0eb6f51\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" Mar 19 12:22:15.335039 master-0 kubenswrapper[29612]: I0319 12:22:15.335020 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c58c9b23-392b-4b14-9b28-89736d899d6d-nmstate-lock\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.335210 master-0 kubenswrapper[29612]: I0319 12:22:15.335185 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c58c9b23-392b-4b14-9b28-89736d899d6d-dbus-socket\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.335312 master-0 kubenswrapper[29612]: I0319 12:22:15.335298 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjg5d\" (UniqueName: \"kubernetes.io/projected/ad53604e-2d6b-4169-9a4f-e4c878098a24-kube-api-access-zjg5d\") pod \"nmstate-metrics-9b8c8685d-v9s4d\" (UID: \"ad53604e-2d6b-4169-9a4f-e4c878098a24\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d" Mar 19 12:22:15.335479 master-0 kubenswrapper[29612]: I0319 12:22:15.335466 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/c58c9b23-392b-4b14-9b28-89736d899d6d-ovs-socket\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.335674 master-0 kubenswrapper[29612]: I0319 12:22:15.335660 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxczw\" (UniqueName: \"kubernetes.io/projected/e94c395a-afd4-4b0e-b796-b09cb0eb6f51-kube-api-access-lxczw\") pod \"nmstate-webhook-5f558f5558-6qz5w\" (UID: \"e94c395a-afd4-4b0e-b796-b09cb0eb6f51\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" Mar 19 12:22:15.335769 master-0 kubenswrapper[29612]: I0319 12:22:15.335755 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsshm\" (UniqueName: \"kubernetes.io/projected/c58c9b23-392b-4b14-9b28-89736d899d6d-kube-api-access-wsshm\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.335951 master-0 kubenswrapper[29612]: E0319 12:22:15.334784 29612 secret.go:189] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 19 12:22:15.336078 master-0 kubenswrapper[29612]: E0319 12:22:15.336066 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e94c395a-afd4-4b0e-b796-b09cb0eb6f51-tls-key-pair podName:e94c395a-afd4-4b0e-b796-b09cb0eb6f51 nodeName:}" failed. No retries permitted until 2026-03-19 12:22:15.836047426 +0000 UTC m=+743.150216447 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/e94c395a-afd4-4b0e-b796-b09cb0eb6f51-tls-key-pair") pod "nmstate-webhook-5f558f5558-6qz5w" (UID: "e94c395a-afd4-4b0e-b796-b09cb0eb6f51") : secret "openshift-nmstate-webhook" not found Mar 19 12:22:15.360141 master-0 kubenswrapper[29612]: I0319 12:22:15.360100 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjg5d\" (UniqueName: \"kubernetes.io/projected/ad53604e-2d6b-4169-9a4f-e4c878098a24-kube-api-access-zjg5d\") pod \"nmstate-metrics-9b8c8685d-v9s4d\" (UID: \"ad53604e-2d6b-4169-9a4f-e4c878098a24\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d" Mar 19 12:22:15.361386 master-0 kubenswrapper[29612]: I0319 12:22:15.361319 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxczw\" (UniqueName: \"kubernetes.io/projected/e94c395a-afd4-4b0e-b796-b09cb0eb6f51-kube-api-access-lxczw\") pod \"nmstate-webhook-5f558f5558-6qz5w\" (UID: \"e94c395a-afd4-4b0e-b796-b09cb0eb6f51\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" Mar 19 12:22:15.392117 master-0 kubenswrapper[29612]: I0319 12:22:15.391896 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6"] Mar 19 12:22:15.393468 master-0 kubenswrapper[29612]: I0319 12:22:15.393251 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:15.409549 master-0 kubenswrapper[29612]: I0319 12:22:15.395761 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 19 12:22:15.409549 master-0 kubenswrapper[29612]: I0319 12:22:15.396651 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 19 12:22:15.420275 master-0 kubenswrapper[29612]: I0319 12:22:15.419114 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6"] Mar 19 12:22:15.440330 master-0 kubenswrapper[29612]: I0319 12:22:15.440284 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c58c9b23-392b-4b14-9b28-89736d899d6d-ovs-socket\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.440441 master-0 kubenswrapper[29612]: I0319 12:22:15.440354 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e33e37d7-1911-4149-8382-6b009a1b4bbe-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-7lbg6\" (UID: \"e33e37d7-1911-4149-8382-6b009a1b4bbe\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:15.440441 master-0 kubenswrapper[29612]: I0319 12:22:15.440397 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsshm\" (UniqueName: \"kubernetes.io/projected/c58c9b23-392b-4b14-9b28-89736d899d6d-kube-api-access-wsshm\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.440510 master-0 kubenswrapper[29612]: I0319 12:22:15.440446 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e33e37d7-1911-4149-8382-6b009a1b4bbe-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-7lbg6\" (UID: \"e33e37d7-1911-4149-8382-6b009a1b4bbe\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:15.440510 master-0 kubenswrapper[29612]: I0319 12:22:15.440501 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c58c9b23-392b-4b14-9b28-89736d899d6d-nmstate-lock\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.440573 master-0 kubenswrapper[29612]: I0319 12:22:15.440516 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c58c9b23-392b-4b14-9b28-89736d899d6d-dbus-socket\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.440573 master-0 kubenswrapper[29612]: I0319 12:22:15.440535 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4szl\" (UniqueName: \"kubernetes.io/projected/e33e37d7-1911-4149-8382-6b009a1b4bbe-kube-api-access-n4szl\") pod \"nmstate-console-plugin-86f58fcf4-7lbg6\" (UID: \"e33e37d7-1911-4149-8382-6b009a1b4bbe\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:15.440672 master-0 kubenswrapper[29612]: I0319 12:22:15.440627 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c58c9b23-392b-4b14-9b28-89736d899d6d-ovs-socket\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 
12:22:15.440878 master-0 kubenswrapper[29612]: I0319 12:22:15.440823 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c58c9b23-392b-4b14-9b28-89736d899d6d-nmstate-lock\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.441395 master-0 kubenswrapper[29612]: I0319 12:22:15.441363 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c58c9b23-392b-4b14-9b28-89736d899d6d-dbus-socket\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.473701 master-0 kubenswrapper[29612]: I0319 12:22:15.472337 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsshm\" (UniqueName: \"kubernetes.io/projected/c58c9b23-392b-4b14-9b28-89736d899d6d-kube-api-access-wsshm\") pod \"nmstate-handler-xjs2f\" (UID: \"c58c9b23-392b-4b14-9b28-89736d899d6d\") " pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.482102 master-0 kubenswrapper[29612]: I0319 12:22:15.480766 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d" Mar 19 12:22:15.542822 master-0 kubenswrapper[29612]: I0319 12:22:15.542763 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4szl\" (UniqueName: \"kubernetes.io/projected/e33e37d7-1911-4149-8382-6b009a1b4bbe-kube-api-access-n4szl\") pod \"nmstate-console-plugin-86f58fcf4-7lbg6\" (UID: \"e33e37d7-1911-4149-8382-6b009a1b4bbe\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:15.543085 master-0 kubenswrapper[29612]: I0319 12:22:15.542866 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-memberlist\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:15.543085 master-0 kubenswrapper[29612]: I0319 12:22:15.542899 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e33e37d7-1911-4149-8382-6b009a1b4bbe-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-7lbg6\" (UID: \"e33e37d7-1911-4149-8382-6b009a1b4bbe\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:15.543214 master-0 kubenswrapper[29612]: I0319 12:22:15.543107 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e33e37d7-1911-4149-8382-6b009a1b4bbe-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-7lbg6\" (UID: \"e33e37d7-1911-4149-8382-6b009a1b4bbe\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:15.543265 master-0 kubenswrapper[29612]: E0319 12:22:15.543236 29612 secret.go:189] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 19 12:22:15.543319 master-0 
kubenswrapper[29612]: E0319 12:22:15.543282 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e33e37d7-1911-4149-8382-6b009a1b4bbe-plugin-serving-cert podName:e33e37d7-1911-4149-8382-6b009a1b4bbe nodeName:}" failed. No retries permitted until 2026-03-19 12:22:16.043269011 +0000 UTC m=+743.357438032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e33e37d7-1911-4149-8382-6b009a1b4bbe-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-7lbg6" (UID: "e33e37d7-1911-4149-8382-6b009a1b4bbe") : secret "plugin-serving-cert" not found Mar 19 12:22:15.544677 master-0 kubenswrapper[29612]: I0319 12:22:15.543992 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e33e37d7-1911-4149-8382-6b009a1b4bbe-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-7lbg6\" (UID: \"e33e37d7-1911-4149-8382-6b009a1b4bbe\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:15.545801 master-0 kubenswrapper[29612]: I0319 12:22:15.545757 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3a2d92a4-7840-44a5-964f-afaf6afa67fb-memberlist\") pod \"speaker-8jwqz\" (UID: \"3a2d92a4-7840-44a5-964f-afaf6afa67fb\") " pod="metallb-system/speaker-8jwqz" Mar 19 12:22:15.566209 master-0 kubenswrapper[29612]: I0319 12:22:15.566140 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-ndv4z" event={"ID":"42892640-180b-417d-bae0-e7823b91ec40","Type":"ContainerStarted","Data":"5544c61718e8355cdaed52460147f8319c0d62d3af6ca63ecd697d143744f080"} Mar 19 12:22:15.566209 master-0 kubenswrapper[29612]: I0319 12:22:15.566199 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-ndv4z" 
event={"ID":"42892640-180b-417d-bae0-e7823b91ec40","Type":"ContainerStarted","Data":"26132327ca89d7353bf70e963a3e6b2046e9846e1229ba022d2ba0a3d5707ba2"} Mar 19 12:22:15.601208 master-0 kubenswrapper[29612]: I0319 12:22:15.595095 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-xjs2f" Mar 19 12:22:15.619954 master-0 kubenswrapper[29612]: W0319 12:22:15.619904 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc58c9b23_392b_4b14_9b28_89736d899d6d.slice/crio-67a29da384f9c1e83835375c65d2e843097f57eaaac0e4c58c88249ca6c44fd6 WatchSource:0}: Error finding container 67a29da384f9c1e83835375c65d2e843097f57eaaac0e4c58c88249ca6c44fd6: Status 404 returned error can't find the container with id 67a29da384f9c1e83835375c65d2e843097f57eaaac0e4c58c88249ca6c44fd6 Mar 19 12:22:15.658902 master-0 kubenswrapper[29612]: I0319 12:22:15.658852 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4szl\" (UniqueName: \"kubernetes.io/projected/e33e37d7-1911-4149-8382-6b009a1b4bbe-kube-api-access-n4szl\") pod \"nmstate-console-plugin-86f58fcf4-7lbg6\" (UID: \"e33e37d7-1911-4149-8382-6b009a1b4bbe\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:15.732030 master-0 kubenswrapper[29612]: I0319 12:22:15.725047 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8jwqz" Mar 19 12:22:15.809210 master-0 kubenswrapper[29612]: I0319 12:22:15.808215 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-677d4b594c-v62pd"] Mar 19 12:22:15.810082 master-0 kubenswrapper[29612]: I0319 12:22:15.809787 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.831785 master-0 kubenswrapper[29612]: I0319 12:22:15.831044 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-677d4b594c-v62pd"] Mar 19 12:22:15.851025 master-0 kubenswrapper[29612]: I0319 12:22:15.850952 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-trusted-ca-bundle\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.851025 master-0 kubenswrapper[29612]: I0319 12:22:15.851030 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-oauth-serving-cert\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.851303 master-0 kubenswrapper[29612]: I0319 12:22:15.851070 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/210a2ebd-8b1c-4db5-af09-46772a8af9de-console-serving-cert\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.851303 master-0 kubenswrapper[29612]: I0319 12:22:15.851125 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-console-config\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.851303 master-0 
kubenswrapper[29612]: I0319 12:22:15.851167 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/210a2ebd-8b1c-4db5-af09-46772a8af9de-console-oauth-config\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.851303 master-0 kubenswrapper[29612]: I0319 12:22:15.851212 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-service-ca\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.851303 master-0 kubenswrapper[29612]: I0319 12:22:15.851269 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e94c395a-afd4-4b0e-b796-b09cb0eb6f51-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6qz5w\" (UID: \"e94c395a-afd4-4b0e-b796-b09cb0eb6f51\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" Mar 19 12:22:15.851503 master-0 kubenswrapper[29612]: I0319 12:22:15.851314 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpsdz\" (UniqueName: \"kubernetes.io/projected/210a2ebd-8b1c-4db5-af09-46772a8af9de-kube-api-access-bpsdz\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.856129 master-0 kubenswrapper[29612]: I0319 12:22:15.856086 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e94c395a-afd4-4b0e-b796-b09cb0eb6f51-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-6qz5w\" (UID: 
\"e94c395a-afd4-4b0e-b796-b09cb0eb6f51\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" Mar 19 12:22:15.953165 master-0 kubenswrapper[29612]: I0319 12:22:15.953115 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/210a2ebd-8b1c-4db5-af09-46772a8af9de-console-serving-cert\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.953282 master-0 kubenswrapper[29612]: I0319 12:22:15.953203 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-console-config\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.953282 master-0 kubenswrapper[29612]: I0319 12:22:15.953249 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/210a2ebd-8b1c-4db5-af09-46772a8af9de-console-oauth-config\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.953431 master-0 kubenswrapper[29612]: I0319 12:22:15.953290 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-service-ca\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.953543 master-0 kubenswrapper[29612]: I0319 12:22:15.953430 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpsdz\" (UniqueName: 
\"kubernetes.io/projected/210a2ebd-8b1c-4db5-af09-46772a8af9de-kube-api-access-bpsdz\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.953543 master-0 kubenswrapper[29612]: I0319 12:22:15.953439 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d"] Mar 19 12:22:15.953543 master-0 kubenswrapper[29612]: I0319 12:22:15.953493 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-trusted-ca-bundle\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.957652 master-0 kubenswrapper[29612]: I0319 12:22:15.954467 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-console-config\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.957652 master-0 kubenswrapper[29612]: I0319 12:22:15.956100 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-service-ca\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.957652 master-0 kubenswrapper[29612]: I0319 12:22:15.956972 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-trusted-ca-bundle\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " 
pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.957652 master-0 kubenswrapper[29612]: I0319 12:22:15.957035 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-oauth-serving-cert\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.971348 master-0 kubenswrapper[29612]: I0319 12:22:15.958075 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/210a2ebd-8b1c-4db5-af09-46772a8af9de-oauth-serving-cert\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.971348 master-0 kubenswrapper[29612]: W0319 12:22:15.961395 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad53604e_2d6b_4169_9a4f_e4c878098a24.slice/crio-00d3cd13ea0c462b74989c3cfc8167e0b0ca2175ee61dccbc3a8e3b34b50fd58 WatchSource:0}: Error finding container 00d3cd13ea0c462b74989c3cfc8167e0b0ca2175ee61dccbc3a8e3b34b50fd58: Status 404 returned error can't find the container with id 00d3cd13ea0c462b74989c3cfc8167e0b0ca2175ee61dccbc3a8e3b34b50fd58 Mar 19 12:22:15.972740 master-0 kubenswrapper[29612]: I0319 12:22:15.972293 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/210a2ebd-8b1c-4db5-af09-46772a8af9de-console-oauth-config\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.973245 master-0 kubenswrapper[29612]: I0319 12:22:15.973193 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/210a2ebd-8b1c-4db5-af09-46772a8af9de-console-serving-cert\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:15.977136 master-0 kubenswrapper[29612]: I0319 12:22:15.977066 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpsdz\" (UniqueName: \"kubernetes.io/projected/210a2ebd-8b1c-4db5-af09-46772a8af9de-kube-api-access-bpsdz\") pod \"console-677d4b594c-v62pd\" (UID: \"210a2ebd-8b1c-4db5-af09-46772a8af9de\") " pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:16.059498 master-0 kubenswrapper[29612]: I0319 12:22:16.058720 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e33e37d7-1911-4149-8382-6b009a1b4bbe-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-7lbg6\" (UID: \"e33e37d7-1911-4149-8382-6b009a1b4bbe\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:16.062983 master-0 kubenswrapper[29612]: I0319 12:22:16.062929 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e33e37d7-1911-4149-8382-6b009a1b4bbe-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-7lbg6\" (UID: \"e33e37d7-1911-4149-8382-6b009a1b4bbe\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:16.091596 master-0 kubenswrapper[29612]: I0319 12:22:16.091489 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" Mar 19 12:22:16.223784 master-0 kubenswrapper[29612]: I0319 12:22:16.223723 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-677d4b594c-v62pd" Mar 19 12:22:16.334393 master-0 kubenswrapper[29612]: I0319 12:22:16.334343 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" Mar 19 12:22:16.578893 master-0 kubenswrapper[29612]: I0319 12:22:16.578843 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w"] Mar 19 12:22:16.582653 master-0 kubenswrapper[29612]: I0319 12:22:16.582590 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xjs2f" event={"ID":"c58c9b23-392b-4b14-9b28-89736d899d6d","Type":"ContainerStarted","Data":"67a29da384f9c1e83835375c65d2e843097f57eaaac0e4c58c88249ca6c44fd6"} Mar 19 12:22:16.584032 master-0 kubenswrapper[29612]: I0319 12:22:16.584004 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d" event={"ID":"ad53604e-2d6b-4169-9a4f-e4c878098a24","Type":"ContainerStarted","Data":"00d3cd13ea0c462b74989c3cfc8167e0b0ca2175ee61dccbc3a8e3b34b50fd58"} Mar 19 12:22:16.584300 master-0 kubenswrapper[29612]: W0319 12:22:16.584260 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode94c395a_afd4_4b0e_b796_b09cb0eb6f51.slice/crio-bf3737dd4f6f1ba9903a80a8661226e2de6bd62c5cd2f5b52d219f9e47b4fc6f WatchSource:0}: Error finding container bf3737dd4f6f1ba9903a80a8661226e2de6bd62c5cd2f5b52d219f9e47b4fc6f: Status 404 returned error can't find the container with id bf3737dd4f6f1ba9903a80a8661226e2de6bd62c5cd2f5b52d219f9e47b4fc6f Mar 19 12:22:16.586707 master-0 kubenswrapper[29612]: I0319 12:22:16.586679 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8jwqz" 
event={"ID":"3a2d92a4-7840-44a5-964f-afaf6afa67fb","Type":"ContainerStarted","Data":"446eeff0f4e32644f95b9ab02c82cf07259920a9b984ba88963c08c845b2e012"} Mar 19 12:22:16.586707 master-0 kubenswrapper[29612]: I0319 12:22:16.586705 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8jwqz" event={"ID":"3a2d92a4-7840-44a5-964f-afaf6afa67fb","Type":"ContainerStarted","Data":"37eb2b3884b063e67f9385f778214d786e0cfdd2b22ca490c796de368891efe4"} Mar 19 12:22:16.723688 master-0 kubenswrapper[29612]: I0319 12:22:16.723615 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-677d4b594c-v62pd"] Mar 19 12:22:16.865027 master-0 kubenswrapper[29612]: W0319 12:22:16.864534 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode33e37d7_1911_4149_8382_6b009a1b4bbe.slice/crio-bcc9ae8e8087d9f3021b921e687f3dcff3d7ca6ef95f8956f711f84370d21b32 WatchSource:0}: Error finding container bcc9ae8e8087d9f3021b921e687f3dcff3d7ca6ef95f8956f711f84370d21b32: Status 404 returned error can't find the container with id bcc9ae8e8087d9f3021b921e687f3dcff3d7ca6ef95f8956f711f84370d21b32 Mar 19 12:22:16.869903 master-0 kubenswrapper[29612]: I0319 12:22:16.869844 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6"] Mar 19 12:22:17.601954 master-0 kubenswrapper[29612]: I0319 12:22:17.601895 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-677d4b594c-v62pd" event={"ID":"210a2ebd-8b1c-4db5-af09-46772a8af9de","Type":"ContainerStarted","Data":"9dca3fc58b36316956aa08d9b9d85b8f4491fb52c936cc215d665fb101a4dfe3"} Mar 19 12:22:17.601954 master-0 kubenswrapper[29612]: I0319 12:22:17.601956 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-677d4b594c-v62pd" 
event={"ID":"210a2ebd-8b1c-4db5-af09-46772a8af9de","Type":"ContainerStarted","Data":"69ac53722d1853b87e171a7060dc29c5139a7694826d6c39483ab22559ec1cd6"}
Mar 19 12:22:17.603699 master-0 kubenswrapper[29612]: I0319 12:22:17.603666 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" event={"ID":"e94c395a-afd4-4b0e-b796-b09cb0eb6f51","Type":"ContainerStarted","Data":"bf3737dd4f6f1ba9903a80a8661226e2de6bd62c5cd2f5b52d219f9e47b4fc6f"}
Mar 19 12:22:17.604796 master-0 kubenswrapper[29612]: I0319 12:22:17.604737 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" event={"ID":"e33e37d7-1911-4149-8382-6b009a1b4bbe","Type":"ContainerStarted","Data":"bcc9ae8e8087d9f3021b921e687f3dcff3d7ca6ef95f8956f711f84370d21b32"}
Mar 19 12:22:17.607433 master-0 kubenswrapper[29612]: I0319 12:22:17.607399 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-ndv4z" event={"ID":"42892640-180b-417d-bae0-e7823b91ec40","Type":"ContainerStarted","Data":"e88454597c691968b1606167a7376f8ffd0e41e2a06df30a5d7bc14cb04a2920"}
Mar 19 12:22:17.607512 master-0 kubenswrapper[29612]: I0319 12:22:17.607434 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-ndv4z"
Mar 19 12:22:17.657992 master-0 kubenswrapper[29612]: I0319 12:22:17.657868 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-677d4b594c-v62pd" podStartSLOduration=2.657813242 podStartE2EDuration="2.657813242s" podCreationTimestamp="2026-03-19 12:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:22:17.651873983 +0000 UTC m=+744.966043044" watchObservedRunningTime="2026-03-19 12:22:17.657813242 +0000 UTC m=+744.971982303"
Mar 19 12:22:17.744502 master-0 kubenswrapper[29612]: I0319 12:22:17.744394 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-ndv4z" podStartSLOduration=2.860034636 podStartE2EDuration="4.744337639s" podCreationTimestamp="2026-03-19 12:22:13 +0000 UTC" firstStartedPulling="2026-03-19 12:22:14.783695749 +0000 UTC m=+742.097864770" lastFinishedPulling="2026-03-19 12:22:16.667998752 +0000 UTC m=+743.982167773" observedRunningTime="2026-03-19 12:22:17.741113178 +0000 UTC m=+745.055282199" watchObservedRunningTime="2026-03-19 12:22:17.744337639 +0000 UTC m=+745.058506660"
Mar 19 12:22:24.231597 master-0 kubenswrapper[29612]: I0319 12:22:24.231465 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-ndv4z"
Mar 19 12:22:24.683558 master-0 kubenswrapper[29612]: I0319 12:22:24.683492 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" event={"ID":"e94c395a-afd4-4b0e-b796-b09cb0eb6f51","Type":"ContainerStarted","Data":"6ea6456fecf909137a0185f23373156f7c816d6f2a32370ea097c9c0da0ed81d"}
Mar 19 12:22:24.683828 master-0 kubenswrapper[29612]: I0319 12:22:24.683656 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w"
Mar 19 12:22:24.685320 master-0 kubenswrapper[29612]: I0319 12:22:24.685257 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8jwqz" event={"ID":"3a2d92a4-7840-44a5-964f-afaf6afa67fb","Type":"ContainerStarted","Data":"96fb41cede3f15cd7e1eb0374a0a1d8da11a96e4fa55fcb69c393b2e79c0dcb8"}
Mar 19 12:22:24.685625 master-0 kubenswrapper[29612]: I0319 12:22:24.685602 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8jwqz"
Mar 19 12:22:24.687566 master-0 kubenswrapper[29612]: I0319 12:22:24.687497 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" event={"ID":"c0c5e02e-a5c3-485f-8beb-eda8e58c0eba","Type":"ContainerStarted","Data":"a692c78a897123c645255debf6b52a0a805c317dc68c8d16dc1aeee9120abdb4"}
Mar 19 12:22:24.687720 master-0 kubenswrapper[29612]: I0319 12:22:24.687667 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg"
Mar 19 12:22:24.691774 master-0 kubenswrapper[29612]: I0319 12:22:24.691039 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" event={"ID":"e33e37d7-1911-4149-8382-6b009a1b4bbe","Type":"ContainerStarted","Data":"f8ba77947c87125f77bdd7ed941e837f41dedaaa6483749946e726eb5de2fb54"}
Mar 19 12:22:24.692788 master-0 kubenswrapper[29612]: I0319 12:22:24.692746 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-xjs2f" event={"ID":"c58c9b23-392b-4b14-9b28-89736d899d6d","Type":"ContainerStarted","Data":"267a1fdfd2deca1ca840cda9d4db57ccfcc191ac9fc5462859f170694e4853f2"}
Mar 19 12:22:24.692928 master-0 kubenswrapper[29612]: I0319 12:22:24.692900 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-xjs2f"
Mar 19 12:22:24.694590 master-0 kubenswrapper[29612]: I0319 12:22:24.694494 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d" event={"ID":"ad53604e-2d6b-4169-9a4f-e4c878098a24","Type":"ContainerStarted","Data":"60f057dc34b434c2edee3558b5507558e9604cc1499d903c36566535ff40a26f"}
Mar 19 12:22:24.696081 master-0 kubenswrapper[29612]: I0319 12:22:24.696045 29612 generic.go:334] "Generic (PLEG): container finished" podID="931bbab5-c031-4583-84f0-8b8e236df593" containerID="fe0d12bc08fa72efeb43beda3fc69de962efa9491b53858250ead7a7a0261bed" exitCode=0
Mar 19 12:22:24.696858 master-0 kubenswrapper[29612]: I0319 12:22:24.696818 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d" event={"ID":"ad53604e-2d6b-4169-9a4f-e4c878098a24","Type":"ContainerStarted","Data":"2581b20ae7b8720ce9778a2febdd72b5615e6a902f64a6b3c1c67bd72e5ef5f3"}
Mar 19 12:22:24.696965 master-0 kubenswrapper[29612]: I0319 12:22:24.696864 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klcpp" event={"ID":"931bbab5-c031-4583-84f0-8b8e236df593","Type":"ContainerDied","Data":"fe0d12bc08fa72efeb43beda3fc69de962efa9491b53858250ead7a7a0261bed"}
Mar 19 12:22:24.712781 master-0 kubenswrapper[29612]: I0319 12:22:24.712712 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w" podStartSLOduration=2.748680992 podStartE2EDuration="9.712691706s" podCreationTimestamp="2026-03-19 12:22:15 +0000 UTC" firstStartedPulling="2026-03-19 12:22:16.585982192 +0000 UTC m=+743.900151213" lastFinishedPulling="2026-03-19 12:22:23.549992896 +0000 UTC m=+750.864161927" observedRunningTime="2026-03-19 12:22:24.704188924 +0000 UTC m=+752.018357965" watchObservedRunningTime="2026-03-19 12:22:24.712691706 +0000 UTC m=+752.026860727"
Mar 19 12:22:24.732344 master-0 kubenswrapper[29612]: I0319 12:22:24.732258 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-xjs2f" podStartSLOduration=1.844430373 podStartE2EDuration="9.732239401s" podCreationTimestamp="2026-03-19 12:22:15 +0000 UTC" firstStartedPulling="2026-03-19 12:22:15.621839032 +0000 UTC m=+742.936008053" lastFinishedPulling="2026-03-19 12:22:23.50964806 +0000 UTC m=+750.823817081" observedRunningTime="2026-03-19 12:22:24.724153421 +0000 UTC m=+752.038322452" watchObservedRunningTime="2026-03-19 12:22:24.732239401 +0000 UTC m=+752.046408422"
Mar 19 12:22:24.818315 master-0 kubenswrapper[29612]: I0319 12:22:24.810653 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-v9s4d" podStartSLOduration=2.286411735 podStartE2EDuration="9.810606797s" podCreationTimestamp="2026-03-19 12:22:15 +0000 UTC" firstStartedPulling="2026-03-19 12:22:15.963360421 +0000 UTC m=+743.277529442" lastFinishedPulling="2026-03-19 12:22:23.487555483 +0000 UTC m=+750.801724504" observedRunningTime="2026-03-19 12:22:24.777145126 +0000 UTC m=+752.091314157" watchObservedRunningTime="2026-03-19 12:22:24.810606797 +0000 UTC m=+752.124775818"
Mar 19 12:22:24.820655 master-0 kubenswrapper[29612]: I0319 12:22:24.818924 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7lbg6" podStartSLOduration=3.177226303 podStartE2EDuration="9.818884492s" podCreationTimestamp="2026-03-19 12:22:15 +0000 UTC" firstStartedPulling="2026-03-19 12:22:16.868575638 +0000 UTC m=+744.182744659" lastFinishedPulling="2026-03-19 12:22:23.510233827 +0000 UTC m=+750.824402848" observedRunningTime="2026-03-19 12:22:24.798669848 +0000 UTC m=+752.112838879" watchObservedRunningTime="2026-03-19 12:22:24.818884492 +0000 UTC m=+752.133053513"
Mar 19 12:22:24.855872 master-0 kubenswrapper[29612]: I0319 12:22:24.852594 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8jwqz" podStartSLOduration=5.248611601 podStartE2EDuration="11.852575449s" podCreationTimestamp="2026-03-19 12:22:13 +0000 UTC" firstStartedPulling="2026-03-19 12:22:16.11049367 +0000 UTC m=+743.424662701" lastFinishedPulling="2026-03-19 12:22:22.714457528 +0000 UTC m=+750.028626549" observedRunningTime="2026-03-19 12:22:24.826689783 +0000 UTC m=+752.140858804" watchObservedRunningTime="2026-03-19 12:22:24.852575449 +0000 UTC m=+752.166744470"
Mar 19 12:22:24.867057 master-0 kubenswrapper[29612]: I0319 12:22:24.866750 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg" podStartSLOduration=2.777765141 podStartE2EDuration="11.866670989s" podCreationTimestamp="2026-03-19 12:22:13 +0000 UTC" firstStartedPulling="2026-03-19 12:22:14.420390792 +0000 UTC m=+741.734559813" lastFinishedPulling="2026-03-19 12:22:23.50929662 +0000 UTC m=+750.823465661" observedRunningTime="2026-03-19 12:22:24.848581895 +0000 UTC m=+752.162750916" watchObservedRunningTime="2026-03-19 12:22:24.866670989 +0000 UTC m=+752.180840010"
Mar 19 12:22:25.706818 master-0 kubenswrapper[29612]: I0319 12:22:25.706765 29612 generic.go:334] "Generic (PLEG): container finished" podID="931bbab5-c031-4583-84f0-8b8e236df593" containerID="4f1f0fd8ad7e96a11b30aceff945022dc693202295ba06ef9b6f2cacb454e076" exitCode=0
Mar 19 12:22:25.706818 master-0 kubenswrapper[29612]: I0319 12:22:25.706820 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klcpp" event={"ID":"931bbab5-c031-4583-84f0-8b8e236df593","Type":"ContainerDied","Data":"4f1f0fd8ad7e96a11b30aceff945022dc693202295ba06ef9b6f2cacb454e076"}
Mar 19 12:22:26.224790 master-0 kubenswrapper[29612]: I0319 12:22:26.224322 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-677d4b594c-v62pd"
Mar 19 12:22:26.224790 master-0 kubenswrapper[29612]: I0319 12:22:26.224395 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-677d4b594c-v62pd"
Mar 19 12:22:26.229143 master-0 kubenswrapper[29612]: I0319 12:22:26.229095 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-677d4b594c-v62pd"
Mar 19 12:22:26.716338 master-0 kubenswrapper[29612]: I0319 12:22:26.716282 29612 generic.go:334] "Generic (PLEG): container finished" podID="931bbab5-c031-4583-84f0-8b8e236df593" containerID="dee8234993f9536462e3830229e996cec9c8fb519353fb640fbbe2f4be5d2b86" exitCode=0
Mar 19 12:22:26.717416 master-0 kubenswrapper[29612]: I0319 12:22:26.717381 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klcpp" event={"ID":"931bbab5-c031-4583-84f0-8b8e236df593","Type":"ContainerDied","Data":"dee8234993f9536462e3830229e996cec9c8fb519353fb640fbbe2f4be5d2b86"}
Mar 19 12:22:26.727700 master-0 kubenswrapper[29612]: I0319 12:22:26.726574 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-677d4b594c-v62pd"
Mar 19 12:22:26.992579 master-0 kubenswrapper[29612]: I0319 12:22:26.991611 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66759f68dd-6zgw8"]
Mar 19 12:22:27.726116 master-0 kubenswrapper[29612]: I0319 12:22:27.726065 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klcpp" event={"ID":"931bbab5-c031-4583-84f0-8b8e236df593","Type":"ContainerStarted","Data":"973606bce51dd4442fcbfa3e1539afe047842338167ae0efe146fd2f8223f7a7"}
Mar 19 12:22:27.726116 master-0 kubenswrapper[29612]: I0319 12:22:27.726113 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klcpp" event={"ID":"931bbab5-c031-4583-84f0-8b8e236df593","Type":"ContainerStarted","Data":"7b0326b84ff5fbf045eb916e34dc5aae45445b0a271045007a9585f8b7d5c899"}
Mar 19 12:22:27.726116 master-0 kubenswrapper[29612]: I0319 12:22:27.726125 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klcpp" event={"ID":"931bbab5-c031-4583-84f0-8b8e236df593","Type":"ContainerStarted","Data":"dc6dcb8b85ff03fc4b15b2bc12ac64ec0ae5e52f06f60c62c07ea77897579f77"}
Mar 19 12:22:28.740992 master-0 kubenswrapper[29612]: I0319 12:22:28.740931 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klcpp" event={"ID":"931bbab5-c031-4583-84f0-8b8e236df593","Type":"ContainerStarted","Data":"aed3363d1d85c624e9962af8f332dd2c1d6dc3b2e959c874a47b6261732cca81"}
Mar 19 12:22:28.741547 master-0 kubenswrapper[29612]: I0319 12:22:28.741523 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klcpp" event={"ID":"931bbab5-c031-4583-84f0-8b8e236df593","Type":"ContainerStarted","Data":"23aa9a6f727db8ea907ef51b27b9700af6dd111850d1521b28da54f19a5232d6"}
Mar 19 12:22:29.765763 master-0 kubenswrapper[29612]: I0319 12:22:29.765624 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-klcpp" event={"ID":"931bbab5-c031-4583-84f0-8b8e236df593","Type":"ContainerStarted","Data":"935699df79ba336003d3918918c8ae983d0c5cf0743136baedd131b1e4326413"}
Mar 19 12:22:29.766510 master-0 kubenswrapper[29612]: I0319 12:22:29.765954 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-klcpp"
Mar 19 12:22:29.805747 master-0 kubenswrapper[29612]: I0319 12:22:29.805578 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-klcpp" podStartSLOduration=7.503061066 podStartE2EDuration="16.80555298s" podCreationTimestamp="2026-03-19 12:22:13 +0000 UTC" firstStartedPulling="2026-03-19 12:22:14.154362047 +0000 UTC m=+741.468531068" lastFinishedPulling="2026-03-19 12:22:23.456853921 +0000 UTC m=+750.771022982" observedRunningTime="2026-03-19 12:22:29.801204676 +0000 UTC m=+757.115373697" watchObservedRunningTime="2026-03-19 12:22:29.80555298 +0000 UTC m=+757.119722001"
Mar 19 12:22:30.624097 master-0 kubenswrapper[29612]: I0319 12:22:30.624039 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-xjs2f"
Mar 19 12:22:33.959120 master-0 kubenswrapper[29612]: I0319 12:22:33.959031 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pt6xg"
Mar 19 12:22:33.989294 master-0 kubenswrapper[29612]: I0319 12:22:33.989160 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-klcpp"
Mar 19 12:22:34.067122 master-0 kubenswrapper[29612]: I0319 12:22:34.064139 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-klcpp"
Mar 19 12:22:35.729207 master-0 kubenswrapper[29612]: I0319 12:22:35.729157 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8jwqz"
Mar 19 12:22:36.102110 master-0 kubenswrapper[29612]: I0319 12:22:36.101934 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-6qz5w"
Mar 19 12:22:42.178975 master-0 kubenswrapper[29612]: I0319 12:22:42.178915 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-kfcf9"]
Mar 19 12:22:42.179922 master-0 kubenswrapper[29612]: I0319 12:22:42.179896 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.182038 master-0 kubenswrapper[29612]: I0319 12:22:42.182008 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert"
Mar 19 12:22:42.264656 master-0 kubenswrapper[29612]: I0319 12:22:42.256611 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-kfcf9"]
Mar 19 12:22:42.301864 master-0 kubenswrapper[29612]: I0319 12:22:42.301286 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-registration-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.301864 master-0 kubenswrapper[29612]: I0319 12:22:42.301353 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-pod-volumes-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.301864 master-0 kubenswrapper[29612]: I0319 12:22:42.301379 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-file-lock-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.301864 master-0 kubenswrapper[29612]: I0319 12:22:42.301703 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-sys\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.301864 master-0 kubenswrapper[29612]: I0319 12:22:42.301837 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-node-plugin-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.301864 master-0 kubenswrapper[29612]: I0319 12:22:42.301865 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/55655fe5-7372-4a61-b696-9b65c18148e6-metrics-cert\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.302537 master-0 kubenswrapper[29612]: I0319 12:22:42.302079 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-csi-plugin-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.302537 master-0 kubenswrapper[29612]: I0319 12:22:42.302168 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-lvmd-config\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.302537 master-0 kubenswrapper[29612]: I0319 12:22:42.302257 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptz8r\" (UniqueName: \"kubernetes.io/projected/55655fe5-7372-4a61-b696-9b65c18148e6-kube-api-access-ptz8r\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.302537 master-0 kubenswrapper[29612]: I0319 12:22:42.302301 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-run-udev\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.302537 master-0 kubenswrapper[29612]: I0319 12:22:42.302331 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-device-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403601 master-0 kubenswrapper[29612]: I0319 12:22:42.403522 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-sys\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403601 master-0 kubenswrapper[29612]: I0319 12:22:42.403576 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-node-plugin-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403912 master-0 kubenswrapper[29612]: I0319 12:22:42.403626 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-sys\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403912 master-0 kubenswrapper[29612]: I0319 12:22:42.403686 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/55655fe5-7372-4a61-b696-9b65c18148e6-metrics-cert\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403912 master-0 kubenswrapper[29612]: I0319 12:22:42.403741 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-csi-plugin-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403912 master-0 kubenswrapper[29612]: I0319 12:22:42.403788 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-lvmd-config\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403912 master-0 kubenswrapper[29612]: I0319 12:22:42.403822 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptz8r\" (UniqueName: \"kubernetes.io/projected/55655fe5-7372-4a61-b696-9b65c18148e6-kube-api-access-ptz8r\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403912 master-0 kubenswrapper[29612]: I0319 12:22:42.403839 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-run-udev\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403912 master-0 kubenswrapper[29612]: I0319 12:22:42.403855 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-device-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403912 master-0 kubenswrapper[29612]: I0319 12:22:42.403892 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-registration-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.403912 master-0 kubenswrapper[29612]: I0319 12:22:42.403915 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-pod-volumes-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.404345 master-0 kubenswrapper[29612]: I0319 12:22:42.403932 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-file-lock-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.404345 master-0 kubenswrapper[29612]: I0319 12:22:42.404248 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-file-lock-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.404345 master-0 kubenswrapper[29612]: I0319 12:22:42.404279 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-run-udev\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.404345 master-0 kubenswrapper[29612]: I0319 12:22:42.404341 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-device-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.404604 master-0 kubenswrapper[29612]: I0319 12:22:42.404390 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-registration-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.404604 master-0 kubenswrapper[29612]: I0319 12:22:42.404420 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-pod-volumes-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.405091 master-0 kubenswrapper[29612]: I0319 12:22:42.405049 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-lvmd-config\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.405227 master-0 kubenswrapper[29612]: I0319 12:22:42.405109 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-node-plugin-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.406955 master-0 kubenswrapper[29612]: I0319 12:22:42.406921 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/55655fe5-7372-4a61-b696-9b65c18148e6-csi-plugin-dir\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.429736 master-0 kubenswrapper[29612]: I0319 12:22:42.429051 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptz8r\" (UniqueName: \"kubernetes.io/projected/55655fe5-7372-4a61-b696-9b65c18148e6-kube-api-access-ptz8r\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.429736 master-0 kubenswrapper[29612]: I0319 12:22:42.429320 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/55655fe5-7372-4a61-b696-9b65c18148e6-metrics-cert\") pod \"vg-manager-kfcf9\" (UID: \"55655fe5-7372-4a61-b696-9b65c18148e6\") " pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:42.497057 master-0 kubenswrapper[29612]: I0319 12:22:42.496936 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:43.392917 master-0 kubenswrapper[29612]: I0319 12:22:43.392856 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-kfcf9"]
Mar 19 12:22:43.885739 master-0 kubenswrapper[29612]: I0319 12:22:43.885679 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-kfcf9" event={"ID":"55655fe5-7372-4a61-b696-9b65c18148e6","Type":"ContainerStarted","Data":"f2ac278b020f0963ef1eaca9c44f5e501e1000d7bec13c8744a0c12a0a1811ac"}
Mar 19 12:22:43.885739 master-0 kubenswrapper[29612]: I0319 12:22:43.885734 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-kfcf9" event={"ID":"55655fe5-7372-4a61-b696-9b65c18148e6","Type":"ContainerStarted","Data":"081b3b5960f7d03dda51a79acdab3206e1a3383ced5bc90b7e2f4dce6909ddef"}
Mar 19 12:22:43.930280 master-0 kubenswrapper[29612]: I0319 12:22:43.930175 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-kfcf9" podStartSLOduration=1.930154809 podStartE2EDuration="1.930154809s" podCreationTimestamp="2026-03-19 12:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:22:43.921995188 +0000 UTC m=+771.236164219" watchObservedRunningTime="2026-03-19 12:22:43.930154809 +0000 UTC m=+771.244323830"
Mar 19 12:22:43.991588 master-0 kubenswrapper[29612]: I0319 12:22:43.991533 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-klcpp"
Mar 19 12:22:46.917489 master-0 kubenswrapper[29612]: I0319 12:22:46.917421 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-kfcf9_55655fe5-7372-4a61-b696-9b65c18148e6/vg-manager/0.log"
Mar 19 12:22:46.918083 master-0 kubenswrapper[29612]: I0319 12:22:46.917506 29612 generic.go:334] "Generic (PLEG): container finished" podID="55655fe5-7372-4a61-b696-9b65c18148e6" containerID="f2ac278b020f0963ef1eaca9c44f5e501e1000d7bec13c8744a0c12a0a1811ac" exitCode=1
Mar 19 12:22:46.918381 master-0 kubenswrapper[29612]: I0319 12:22:46.918331 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-kfcf9" event={"ID":"55655fe5-7372-4a61-b696-9b65c18148e6","Type":"ContainerDied","Data":"f2ac278b020f0963ef1eaca9c44f5e501e1000d7bec13c8744a0c12a0a1811ac"}
Mar 19 12:22:46.919088 master-0 kubenswrapper[29612]: I0319 12:22:46.919060 29612 scope.go:117] "RemoveContainer" containerID="f2ac278b020f0963ef1eaca9c44f5e501e1000d7bec13c8744a0c12a0a1811ac"
Mar 19 12:22:47.267559 master-0 kubenswrapper[29612]: I0319 12:22:47.267413 29612 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock"
Mar 19 12:22:47.320311 master-0 kubenswrapper[29612]: I0319 12:22:47.320172 29612 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-19T12:22:47.267476418Z","Handler":null,"Name":""}
Mar 19 12:22:47.325319 master-0 kubenswrapper[29612]: I0319 12:22:47.325272 29612 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0
Mar 19 12:22:47.325319 master-0 kubenswrapper[29612]: I0319 12:22:47.325325 29612 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock
Mar 19 12:22:47.946664 master-0 kubenswrapper[29612]: I0319 12:22:47.942347 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-kfcf9_55655fe5-7372-4a61-b696-9b65c18148e6/vg-manager/0.log"
Mar 19 12:22:47.946664 master-0 kubenswrapper[29612]: I0319 12:22:47.942417 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-kfcf9" event={"ID":"55655fe5-7372-4a61-b696-9b65c18148e6","Type":"ContainerStarted","Data":"5482d79ed1208f51df2286638453130ae8d37fa9931f27106c433629c99c1294"}
Mar 19 12:22:49.929737 master-0 kubenswrapper[29612]: I0319 12:22:49.929668 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mlqm8"]
Mar 19 12:22:49.931127 master-0 kubenswrapper[29612]: I0319 12:22:49.931088 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mlqm8"
Mar 19 12:22:49.933361 master-0 kubenswrapper[29612]: I0319 12:22:49.933328 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 19 12:22:49.933936 master-0 kubenswrapper[29612]: I0319 12:22:49.933903 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 19 12:22:49.940205 master-0 kubenswrapper[29612]: I0319 12:22:49.940146 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mlqm8"]
Mar 19 12:22:49.994264 master-0 kubenswrapper[29612]: I0319 12:22:49.993661 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lprw\" (UniqueName: \"kubernetes.io/projected/d325beaa-9017-46c5-a52a-84ecf54e1c0d-kube-api-access-5lprw\") pod \"openstack-operator-index-mlqm8\" (UID: \"d325beaa-9017-46c5-a52a-84ecf54e1c0d\") " pod="openstack-operators/openstack-operator-index-mlqm8"
Mar 19 12:22:50.095097 master-0 kubenswrapper[29612]: I0319 12:22:50.095047 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lprw\" (UniqueName: \"kubernetes.io/projected/d325beaa-9017-46c5-a52a-84ecf54e1c0d-kube-api-access-5lprw\") pod \"openstack-operator-index-mlqm8\" (UID: \"d325beaa-9017-46c5-a52a-84ecf54e1c0d\") " pod="openstack-operators/openstack-operator-index-mlqm8"
Mar 19 12:22:50.118208 master-0 kubenswrapper[29612]: I0319 12:22:50.118171 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lprw\" (UniqueName: \"kubernetes.io/projected/d325beaa-9017-46c5-a52a-84ecf54e1c0d-kube-api-access-5lprw\") pod \"openstack-operator-index-mlqm8\" (UID: \"d325beaa-9017-46c5-a52a-84ecf54e1c0d\") " pod="openstack-operators/openstack-operator-index-mlqm8"
Mar 19 12:22:50.249993 master-0 kubenswrapper[29612]: I0319 12:22:50.249881 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mlqm8"
Mar 19 12:22:51.229400 master-0 kubenswrapper[29612]: I0319 12:22:51.228548 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mlqm8"]
Mar 19 12:22:51.987707 master-0 kubenswrapper[29612]: I0319 12:22:51.987500 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mlqm8" event={"ID":"d325beaa-9017-46c5-a52a-84ecf54e1c0d","Type":"ContainerStarted","Data":"dcf8bd064ed411a03ff3722308eb0430eac134c499514db5e6747c7d771b80d2"}
Mar 19 12:22:52.037393 master-0 kubenswrapper[29612]: I0319 12:22:52.037240 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-66759f68dd-6zgw8" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" containerID="cri-o://bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78" gracePeriod=15
Mar 19 12:22:52.502814 master-0 kubenswrapper[29612]: I0319 12:22:52.497887 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:52.505346 master-0 kubenswrapper[29612]: I0319 12:22:52.503072 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-kfcf9"
Mar 19 12:22:52.654787 master-0 kubenswrapper[29612]: I0319 12:22:52.654619 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66759f68dd-6zgw8_50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf/console/0.log"
Mar 19 12:22:52.655181 master-0 kubenswrapper[29612]: I0319 12:22:52.655162 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66759f68dd-6zgw8"
Mar 19 12:22:52.742702 master-0 kubenswrapper[29612]: I0319 12:22:52.742612 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg7mt\" (UniqueName: \"kubernetes.io/projected/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-kube-api-access-vg7mt\") pod \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") "
Mar 19 12:22:52.743007 master-0 kubenswrapper[29612]: I0319 12:22:52.742776 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-service-ca\") pod \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") "
Mar 19 12:22:52.743007 master-0 kubenswrapper[29612]: I0319 12:22:52.742878 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-oauth-serving-cert\") pod \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") "
Mar 19 12:22:52.743390 master-0 kubenswrapper[29612]: I0319 12:22:52.743326 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-service-ca" (OuterVolumeSpecName: "service-ca") pod "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" (UID: "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:22:52.743451 master-0 kubenswrapper[29612]: I0319 12:22:52.743397 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" (UID: "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:22:52.743793 master-0 kubenswrapper[29612]: I0319 12:22:52.743763 29612 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 19 12:22:52.743793 master-0 kubenswrapper[29612]: I0319 12:22:52.743785 29612 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 19 12:22:52.745614 master-0 kubenswrapper[29612]: I0319 12:22:52.745541 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-kube-api-access-vg7mt" (OuterVolumeSpecName: "kube-api-access-vg7mt") pod "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" (UID: "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf"). InnerVolumeSpecName "kube-api-access-vg7mt".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:22:52.844307 master-0 kubenswrapper[29612]: I0319 12:22:52.844246 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-config\") pod \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " Mar 19 12:22:52.844307 master-0 kubenswrapper[29612]: I0319 12:22:52.844315 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-oauth-config\") pod \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " Mar 19 12:22:52.844613 master-0 kubenswrapper[29612]: I0319 12:22:52.844357 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-serving-cert\") pod \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " Mar 19 12:22:52.844613 master-0 kubenswrapper[29612]: I0319 12:22:52.844381 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-trusted-ca-bundle\") pod \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\" (UID: \"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf\") " Mar 19 12:22:52.844718 master-0 kubenswrapper[29612]: I0319 12:22:52.844628 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg7mt\" (UniqueName: \"kubernetes.io/projected/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-kube-api-access-vg7mt\") on node \"master-0\" DevicePath \"\"" Mar 19 12:22:52.845100 master-0 kubenswrapper[29612]: I0319 12:22:52.845055 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" (UID: "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:22:52.845461 master-0 kubenswrapper[29612]: I0319 12:22:52.845422 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-config" (OuterVolumeSpecName: "console-config") pod "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" (UID: "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:22:52.847752 master-0 kubenswrapper[29612]: I0319 12:22:52.847676 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" (UID: "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:22:52.848547 master-0 kubenswrapper[29612]: I0319 12:22:52.848465 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" (UID: "50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:22:52.946371 master-0 kubenswrapper[29612]: I0319 12:22:52.946292 29612 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:22:52.946371 master-0 kubenswrapper[29612]: I0319 12:22:52.946349 29612 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:22:52.946371 master-0 kubenswrapper[29612]: I0319 12:22:52.946367 29612 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 19 12:22:52.946371 master-0 kubenswrapper[29612]: I0319 12:22:52.946380 29612 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:22:52.996992 master-0 kubenswrapper[29612]: I0319 12:22:52.996940 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66759f68dd-6zgw8_50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf/console/0.log" Mar 19 12:22:52.997223 master-0 kubenswrapper[29612]: I0319 12:22:52.997017 29612 generic.go:334] "Generic (PLEG): container finished" podID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerID="bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78" exitCode=2 Mar 19 12:22:52.997223 master-0 kubenswrapper[29612]: I0319 12:22:52.997095 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66759f68dd-6zgw8" Mar 19 12:22:52.997788 master-0 kubenswrapper[29612]: I0319 12:22:52.997097 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66759f68dd-6zgw8" event={"ID":"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf","Type":"ContainerDied","Data":"bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78"} Mar 19 12:22:52.997872 master-0 kubenswrapper[29612]: I0319 12:22:52.997832 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66759f68dd-6zgw8" event={"ID":"50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf","Type":"ContainerDied","Data":"b6ba7390d02fd46b2c6cf5b9563e7080f84f40575dd156c3fc6fc240152e723f"} Mar 19 12:22:52.997924 master-0 kubenswrapper[29612]: I0319 12:22:52.997889 29612 scope.go:117] "RemoveContainer" containerID="bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78" Mar 19 12:22:53.000803 master-0 kubenswrapper[29612]: I0319 12:22:53.000757 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mlqm8" event={"ID":"d325beaa-9017-46c5-a52a-84ecf54e1c0d","Type":"ContainerStarted","Data":"5ffe8f1d04707ff09eaa9600bbb50b56cfd7448b1b5ba67e65688ce69de171f3"} Mar 19 12:22:53.001020 master-0 kubenswrapper[29612]: I0319 12:22:53.000983 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-kfcf9" Mar 19 12:22:53.001859 master-0 kubenswrapper[29612]: I0319 12:22:53.001820 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-kfcf9" Mar 19 12:22:53.021268 master-0 kubenswrapper[29612]: I0319 12:22:53.021142 29612 scope.go:117] "RemoveContainer" containerID="bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78" Mar 19 12:22:53.021495 master-0 kubenswrapper[29612]: E0319 12:22:53.021450 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78\": container with ID starting with bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78 not found: ID does not exist" containerID="bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78" Mar 19 12:22:53.021556 master-0 kubenswrapper[29612]: I0319 12:22:53.021497 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78"} err="failed to get container status \"bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78\": rpc error: code = NotFound desc = could not find container \"bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78\": container with ID starting with bfd5b2a043eb7b73d61bf37f6781bb138c076f72318a0f131d2489c1dfda1b78 not found: ID does not exist" Mar 19 12:22:53.057347 master-0 kubenswrapper[29612]: I0319 12:22:53.057269 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66759f68dd-6zgw8"] Mar 19 12:22:53.073278 master-0 kubenswrapper[29612]: I0319 12:22:53.073094 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66759f68dd-6zgw8"] Mar 19 12:22:53.698372 master-0 kubenswrapper[29612]: I0319 12:22:53.698250 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mlqm8" podStartSLOduration=3.6960010150000002 podStartE2EDuration="4.698220156s" podCreationTimestamp="2026-03-19 12:22:49 +0000 UTC" firstStartedPulling="2026-03-19 12:22:51.244143443 +0000 UTC m=+778.558312464" lastFinishedPulling="2026-03-19 12:22:52.246362584 +0000 UTC m=+779.560531605" observedRunningTime="2026-03-19 12:22:53.690165598 +0000 UTC m=+781.004334679" watchObservedRunningTime="2026-03-19 12:22:53.698220156 +0000 UTC m=+781.012389217" Mar 19 12:22:54.921537 master-0 kubenswrapper[29612]: 
I0319 12:22:54.921452 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" path="/var/lib/kubelet/pods/50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf/volumes" Mar 19 12:23:00.251450 master-0 kubenswrapper[29612]: I0319 12:23:00.251361 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-mlqm8" Mar 19 12:23:00.251450 master-0 kubenswrapper[29612]: I0319 12:23:00.251461 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-mlqm8" Mar 19 12:23:00.282546 master-0 kubenswrapper[29612]: I0319 12:23:00.282477 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-mlqm8" Mar 19 12:23:01.121564 master-0 kubenswrapper[29612]: I0319 12:23:01.121489 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-mlqm8" Mar 19 12:23:09.302518 master-0 kubenswrapper[29612]: I0319 12:23:09.302444 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z"] Mar 19 12:23:09.303293 master-0 kubenswrapper[29612]: E0319 12:23:09.303171 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" Mar 19 12:23:09.303293 master-0 kubenswrapper[29612]: I0319 12:23:09.303199 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" Mar 19 12:23:09.303624 master-0 kubenswrapper[29612]: I0319 12:23:09.303591 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bfdf1d-36b9-47a9-a65f-1a5bc1a89dcf" containerName="console" Mar 19 12:23:09.306295 master-0 kubenswrapper[29612]: I0319 12:23:09.306236 29612 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:09.316462 master-0 kubenswrapper[29612]: I0319 12:23:09.316399 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z"] Mar 19 12:23:09.320772 master-0 kubenswrapper[29612]: I0319 12:23:09.320706 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vcnk\" (UniqueName: \"kubernetes.io/projected/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-kube-api-access-6vcnk\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:09.320906 master-0 kubenswrapper[29612]: I0319 12:23:09.320775 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:09.320906 master-0 kubenswrapper[29612]: I0319 12:23:09.320849 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:09.422083 master-0 kubenswrapper[29612]: I0319 12:23:09.422023 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:09.422321 master-0 kubenswrapper[29612]: I0319 12:23:09.422165 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vcnk\" (UniqueName: \"kubernetes.io/projected/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-kube-api-access-6vcnk\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:09.422419 master-0 kubenswrapper[29612]: I0319 12:23:09.422361 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-util\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:09.422651 master-0 kubenswrapper[29612]: I0319 12:23:09.422581 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-bundle\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:09.422870 master-0 kubenswrapper[29612]: I0319 12:23:09.422838 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-util\") pod 
\"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:09.439834 master-0 kubenswrapper[29612]: I0319 12:23:09.439763 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vcnk\" (UniqueName: \"kubernetes.io/projected/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-kube-api-access-6vcnk\") pod \"7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:09.624128 master-0 kubenswrapper[29612]: I0319 12:23:09.623968 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:10.136158 master-0 kubenswrapper[29612]: I0319 12:23:10.136103 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z"] Mar 19 12:23:10.139927 master-0 kubenswrapper[29612]: W0319 12:23:10.139880 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaa0dcba_00b3_44d9_8188_1f3eead4f1e9.slice/crio-734900f601c45408e67c224a253539820276305e05f95d766b3578c6c572910e WatchSource:0}: Error finding container 734900f601c45408e67c224a253539820276305e05f95d766b3578c6c572910e: Status 404 returned error can't find the container with id 734900f601c45408e67c224a253539820276305e05f95d766b3578c6c572910e Mar 19 12:23:10.179876 master-0 kubenswrapper[29612]: I0319 12:23:10.179769 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" 
event={"ID":"faa0dcba-00b3-44d9-8188-1f3eead4f1e9","Type":"ContainerStarted","Data":"734900f601c45408e67c224a253539820276305e05f95d766b3578c6c572910e"} Mar 19 12:23:11.196563 master-0 kubenswrapper[29612]: I0319 12:23:11.196454 29612 generic.go:334] "Generic (PLEG): container finished" podID="faa0dcba-00b3-44d9-8188-1f3eead4f1e9" containerID="3ef4f91d2b586df269ad2529aab454bd4569f95de0f6fa8f3f445a4f506c599d" exitCode=0 Mar 19 12:23:11.197710 master-0 kubenswrapper[29612]: I0319 12:23:11.196561 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" event={"ID":"faa0dcba-00b3-44d9-8188-1f3eead4f1e9","Type":"ContainerDied","Data":"3ef4f91d2b586df269ad2529aab454bd4569f95de0f6fa8f3f445a4f506c599d"} Mar 19 12:23:11.198886 master-0 kubenswrapper[29612]: I0319 12:23:11.198554 29612 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 12:23:12.208618 master-0 kubenswrapper[29612]: I0319 12:23:12.208466 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" event={"ID":"faa0dcba-00b3-44d9-8188-1f3eead4f1e9","Type":"ContainerStarted","Data":"81a082bda6c4cac8dc8dde610761f7ff6590e24ac9f6bec773920a386b009285"} Mar 19 12:23:13.225168 master-0 kubenswrapper[29612]: I0319 12:23:13.225057 29612 generic.go:334] "Generic (PLEG): container finished" podID="faa0dcba-00b3-44d9-8188-1f3eead4f1e9" containerID="81a082bda6c4cac8dc8dde610761f7ff6590e24ac9f6bec773920a386b009285" exitCode=0 Mar 19 12:23:13.225168 master-0 kubenswrapper[29612]: I0319 12:23:13.225151 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" event={"ID":"faa0dcba-00b3-44d9-8188-1f3eead4f1e9","Type":"ContainerDied","Data":"81a082bda6c4cac8dc8dde610761f7ff6590e24ac9f6bec773920a386b009285"} Mar 19 
12:23:14.239057 master-0 kubenswrapper[29612]: I0319 12:23:14.238989 29612 generic.go:334] "Generic (PLEG): container finished" podID="faa0dcba-00b3-44d9-8188-1f3eead4f1e9" containerID="a979d59c31584f3740e5df7d0e649703af6da75a5700b6226423135bf6b7b90a" exitCode=0 Mar 19 12:23:14.239057 master-0 kubenswrapper[29612]: I0319 12:23:14.239060 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" event={"ID":"faa0dcba-00b3-44d9-8188-1f3eead4f1e9","Type":"ContainerDied","Data":"a979d59c31584f3740e5df7d0e649703af6da75a5700b6226423135bf6b7b90a"} Mar 19 12:23:15.625377 master-0 kubenswrapper[29612]: I0319 12:23:15.625312 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:15.740156 master-0 kubenswrapper[29612]: I0319 12:23:15.740073 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-util\") pod \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " Mar 19 12:23:15.740507 master-0 kubenswrapper[29612]: I0319 12:23:15.740203 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vcnk\" (UniqueName: \"kubernetes.io/projected/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-kube-api-access-6vcnk\") pod \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " Mar 19 12:23:15.740507 master-0 kubenswrapper[29612]: I0319 12:23:15.740280 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-bundle\") pod \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\" (UID: \"faa0dcba-00b3-44d9-8188-1f3eead4f1e9\") " Mar 19 12:23:15.741449 master-0 
kubenswrapper[29612]: I0319 12:23:15.741393 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-bundle" (OuterVolumeSpecName: "bundle") pod "faa0dcba-00b3-44d9-8188-1f3eead4f1e9" (UID: "faa0dcba-00b3-44d9-8188-1f3eead4f1e9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:23:15.744712 master-0 kubenswrapper[29612]: I0319 12:23:15.744650 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-kube-api-access-6vcnk" (OuterVolumeSpecName: "kube-api-access-6vcnk") pod "faa0dcba-00b3-44d9-8188-1f3eead4f1e9" (UID: "faa0dcba-00b3-44d9-8188-1f3eead4f1e9"). InnerVolumeSpecName "kube-api-access-6vcnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:23:15.756284 master-0 kubenswrapper[29612]: I0319 12:23:15.756163 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-util" (OuterVolumeSpecName: "util") pod "faa0dcba-00b3-44d9-8188-1f3eead4f1e9" (UID: "faa0dcba-00b3-44d9-8188-1f3eead4f1e9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:23:15.842890 master-0 kubenswrapper[29612]: I0319 12:23:15.842495 29612 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-util\") on node \"master-0\" DevicePath \"\"" Mar 19 12:23:15.842890 master-0 kubenswrapper[29612]: I0319 12:23:15.842538 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vcnk\" (UniqueName: \"kubernetes.io/projected/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-kube-api-access-6vcnk\") on node \"master-0\" DevicePath \"\"" Mar 19 12:23:15.842890 master-0 kubenswrapper[29612]: I0319 12:23:15.842550 29612 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/faa0dcba-00b3-44d9-8188-1f3eead4f1e9-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:23:16.258882 master-0 kubenswrapper[29612]: I0319 12:23:16.258755 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" event={"ID":"faa0dcba-00b3-44d9-8188-1f3eead4f1e9","Type":"ContainerDied","Data":"734900f601c45408e67c224a253539820276305e05f95d766b3578c6c572910e"} Mar 19 12:23:16.259142 master-0 kubenswrapper[29612]: I0319 12:23:16.258882 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="734900f601c45408e67c224a253539820276305e05f95d766b3578c6c572910e" Mar 19 12:23:16.259142 master-0 kubenswrapper[29612]: I0319 12:23:16.258837 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7c80869988bfa7821a7e3d4d9e7801b12993e99d05df1815488a38514c4tk5z" Mar 19 12:23:21.637937 master-0 kubenswrapper[29612]: I0319 12:23:21.637880 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn"] Mar 19 12:23:21.638601 master-0 kubenswrapper[29612]: E0319 12:23:21.638253 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa0dcba-00b3-44d9-8188-1f3eead4f1e9" containerName="util" Mar 19 12:23:21.638601 master-0 kubenswrapper[29612]: I0319 12:23:21.638268 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa0dcba-00b3-44d9-8188-1f3eead4f1e9" containerName="util" Mar 19 12:23:21.638601 master-0 kubenswrapper[29612]: E0319 12:23:21.638307 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa0dcba-00b3-44d9-8188-1f3eead4f1e9" containerName="pull" Mar 19 12:23:21.638601 master-0 kubenswrapper[29612]: I0319 12:23:21.638313 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa0dcba-00b3-44d9-8188-1f3eead4f1e9" containerName="pull" Mar 19 12:23:21.638601 master-0 kubenswrapper[29612]: E0319 12:23:21.638324 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa0dcba-00b3-44d9-8188-1f3eead4f1e9" containerName="extract" Mar 19 12:23:21.638601 master-0 kubenswrapper[29612]: I0319 12:23:21.638331 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa0dcba-00b3-44d9-8188-1f3eead4f1e9" containerName="extract" Mar 19 12:23:21.638601 master-0 kubenswrapper[29612]: I0319 12:23:21.638585 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa0dcba-00b3-44d9-8188-1f3eead4f1e9" containerName="extract" Mar 19 12:23:21.639182 master-0 kubenswrapper[29612]: I0319 12:23:21.639152 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn" Mar 19 12:23:21.667543 master-0 kubenswrapper[29612]: I0319 12:23:21.667483 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn"] Mar 19 12:23:21.766972 master-0 kubenswrapper[29612]: I0319 12:23:21.766379 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbv7l\" (UniqueName: \"kubernetes.io/projected/25780c84-04af-4a73-bc66-21e7284dc307-kube-api-access-dbv7l\") pod \"openstack-operator-controller-init-b85c4d696-7k9pn\" (UID: \"25780c84-04af-4a73-bc66-21e7284dc307\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn" Mar 19 12:23:21.868129 master-0 kubenswrapper[29612]: I0319 12:23:21.868077 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbv7l\" (UniqueName: \"kubernetes.io/projected/25780c84-04af-4a73-bc66-21e7284dc307-kube-api-access-dbv7l\") pod \"openstack-operator-controller-init-b85c4d696-7k9pn\" (UID: \"25780c84-04af-4a73-bc66-21e7284dc307\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn" Mar 19 12:23:21.885534 master-0 kubenswrapper[29612]: I0319 12:23:21.885491 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbv7l\" (UniqueName: \"kubernetes.io/projected/25780c84-04af-4a73-bc66-21e7284dc307-kube-api-access-dbv7l\") pod \"openstack-operator-controller-init-b85c4d696-7k9pn\" (UID: \"25780c84-04af-4a73-bc66-21e7284dc307\") " pod="openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn" Mar 19 12:23:21.964983 master-0 kubenswrapper[29612]: I0319 12:23:21.964931 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn"
Mar 19 12:23:22.584843 master-0 kubenswrapper[29612]: I0319 12:23:22.582707 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn"]
Mar 19 12:23:23.327600 master-0 kubenswrapper[29612]: I0319 12:23:23.327543 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn" event={"ID":"25780c84-04af-4a73-bc66-21e7284dc307","Type":"ContainerStarted","Data":"780a525893ed8a61c5a4ea6a841de499bf4fcac2192cf816a84aa69ac7cbe4e1"}
Mar 19 12:23:28.374754 master-0 kubenswrapper[29612]: I0319 12:23:28.374667 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn" event={"ID":"25780c84-04af-4a73-bc66-21e7284dc307","Type":"ContainerStarted","Data":"b9d5c8f14150d737ea72938367a963c9f43713eea536b969aa8da45b3ba1648b"}
Mar 19 12:23:28.375727 master-0 kubenswrapper[29612]: I0319 12:23:28.374929 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn"
Mar 19 12:23:28.430478 master-0 kubenswrapper[29612]: I0319 12:23:28.430379 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn" podStartSLOduration=2.419807033 podStartE2EDuration="7.430355026s" podCreationTimestamp="2026-03-19 12:23:21 +0000 UTC" firstStartedPulling="2026-03-19 12:23:22.594717737 +0000 UTC m=+809.908886758" lastFinishedPulling="2026-03-19 12:23:27.60526572 +0000 UTC m=+814.919434751" observedRunningTime="2026-03-19 12:23:28.426606079 +0000 UTC m=+815.740775150" watchObservedRunningTime="2026-03-19 12:23:28.430355026 +0000 UTC m=+815.744524057"
Mar 19 12:23:41.969415 master-0 kubenswrapper[29612]: I0319 12:23:41.969343 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-b85c4d696-7k9pn"
Mar 19 12:24:04.357858 master-0 kubenswrapper[29612]: I0319 12:24:04.353709 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq"]
Mar 19 12:24:04.357858 master-0 kubenswrapper[29612]: I0319 12:24:04.355274 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq"
Mar 19 12:24:04.363743 master-0 kubenswrapper[29612]: I0319 12:24:04.363315 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx"]
Mar 19 12:24:04.367934 master-0 kubenswrapper[29612]: I0319 12:24:04.364623 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx"
Mar 19 12:24:04.405307 master-0 kubenswrapper[29612]: I0319 12:24:04.400078 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq"]
Mar 19 12:24:04.405664 master-0 kubenswrapper[29612]: I0319 12:24:04.405250 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8"]
Mar 19 12:24:04.406943 master-0 kubenswrapper[29612]: I0319 12:24:04.406894 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8"
Mar 19 12:24:04.421395 master-0 kubenswrapper[29612]: I0319 12:24:04.419463 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx"]
Mar 19 12:24:04.467678 master-0 kubenswrapper[29612]: I0319 12:24:04.459269 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn"]
Mar 19 12:24:04.467678 master-0 kubenswrapper[29612]: I0319 12:24:04.460316 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn"
Mar 19 12:24:04.467678 master-0 kubenswrapper[29612]: I0319 12:24:04.462098 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9g7\" (UniqueName: \"kubernetes.io/projected/fa2bb2e9-5b88-4c4d-8abe-2837c40e5880-kube-api-access-ms9g7\") pod \"barbican-operator-controller-manager-59bc569d95-8sdwq\" (UID: \"fa2bb2e9-5b88-4c4d-8abe-2837c40e5880\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq"
Mar 19 12:24:04.467678 master-0 kubenswrapper[29612]: I0319 12:24:04.462283 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7shsb\" (UniqueName: \"kubernetes.io/projected/f77e9b9d-e1fe-411c-a005-be21205ebc29-kube-api-access-7shsb\") pod \"cinder-operator-controller-manager-8d58dc466-q4ldx\" (UID: \"f77e9b9d-e1fe-411c-a005-be21205ebc29\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx"
Mar 19 12:24:04.476392 master-0 kubenswrapper[29612]: I0319 12:24:04.476343 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8"]
Mar 19 12:24:04.521582 master-0 kubenswrapper[29612]: I0319 12:24:04.518715 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn"]
Mar 19 12:24:04.566370 master-0 kubenswrapper[29612]: I0319 12:24:04.563882 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52h48\" (UniqueName: \"kubernetes.io/projected/e6648cf7-2ce1-4994-9edd-037386b885e7-kube-api-access-52h48\") pod \"designate-operator-controller-manager-588d4d986b-p8kq8\" (UID: \"e6648cf7-2ce1-4994-9edd-037386b885e7\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8"
Mar 19 12:24:04.566370 master-0 kubenswrapper[29612]: I0319 12:24:04.563943 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9g7\" (UniqueName: \"kubernetes.io/projected/fa2bb2e9-5b88-4c4d-8abe-2837c40e5880-kube-api-access-ms9g7\") pod \"barbican-operator-controller-manager-59bc569d95-8sdwq\" (UID: \"fa2bb2e9-5b88-4c4d-8abe-2837c40e5880\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq"
Mar 19 12:24:04.566370 master-0 kubenswrapper[29612]: I0319 12:24:04.563999 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8nth\" (UniqueName: \"kubernetes.io/projected/35b6c38e-a5ea-47ad-ac78-dcb138979d96-kube-api-access-h8nth\") pod \"glance-operator-controller-manager-79df6bcc97-b2rhn\" (UID: \"35b6c38e-a5ea-47ad-ac78-dcb138979d96\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn"
Mar 19 12:24:04.566370 master-0 kubenswrapper[29612]: I0319 12:24:04.564034 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7shsb\" (UniqueName: \"kubernetes.io/projected/f77e9b9d-e1fe-411c-a005-be21205ebc29-kube-api-access-7shsb\") pod \"cinder-operator-controller-manager-8d58dc466-q4ldx\" (UID: \"f77e9b9d-e1fe-411c-a005-be21205ebc29\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx"
Mar 19 12:24:04.575747 master-0 kubenswrapper[29612]: I0319 12:24:04.575602 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd"]
Mar 19 12:24:04.577552 master-0 kubenswrapper[29612]: I0319 12:24:04.577024 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd"
Mar 19 12:24:04.602780 master-0 kubenswrapper[29612]: I0319 12:24:04.602719 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd"]
Mar 19 12:24:04.610976 master-0 kubenswrapper[29612]: I0319 12:24:04.610895 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7shsb\" (UniqueName: \"kubernetes.io/projected/f77e9b9d-e1fe-411c-a005-be21205ebc29-kube-api-access-7shsb\") pod \"cinder-operator-controller-manager-8d58dc466-q4ldx\" (UID: \"f77e9b9d-e1fe-411c-a005-be21205ebc29\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx"
Mar 19 12:24:04.613366 master-0 kubenswrapper[29612]: I0319 12:24:04.613342 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9g7\" (UniqueName: \"kubernetes.io/projected/fa2bb2e9-5b88-4c4d-8abe-2837c40e5880-kube-api-access-ms9g7\") pod \"barbican-operator-controller-manager-59bc569d95-8sdwq\" (UID: \"fa2bb2e9-5b88-4c4d-8abe-2837c40e5880\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq"
Mar 19 12:24:04.622686 master-0 kubenswrapper[29612]: I0319 12:24:04.622454 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29"]
Mar 19 12:24:04.623993 master-0 kubenswrapper[29612]: I0319 12:24:04.623960 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29"
Mar 19 12:24:04.632333 master-0 kubenswrapper[29612]: I0319 12:24:04.632282 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29"]
Mar 19 12:24:04.641773 master-0 kubenswrapper[29612]: I0319 12:24:04.641716 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq"]
Mar 19 12:24:04.642968 master-0 kubenswrapper[29612]: I0319 12:24:04.642940 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq"
Mar 19 12:24:04.646808 master-0 kubenswrapper[29612]: I0319 12:24:04.646772 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 19 12:24:04.665005 master-0 kubenswrapper[29612]: I0319 12:24:04.663135 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq"]
Mar 19 12:24:04.680675 master-0 kubenswrapper[29612]: I0319 12:24:04.679613 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52h48\" (UniqueName: \"kubernetes.io/projected/e6648cf7-2ce1-4994-9edd-037386b885e7-kube-api-access-52h48\") pod \"designate-operator-controller-manager-588d4d986b-p8kq8\" (UID: \"e6648cf7-2ce1-4994-9edd-037386b885e7\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8"
Mar 19 12:24:04.686738 master-0 kubenswrapper[29612]: I0319 12:24:04.686135 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8nth\" (UniqueName: \"kubernetes.io/projected/35b6c38e-a5ea-47ad-ac78-dcb138979d96-kube-api-access-h8nth\") pod \"glance-operator-controller-manager-79df6bcc97-b2rhn\" (UID: \"35b6c38e-a5ea-47ad-ac78-dcb138979d96\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn"
Mar 19 12:24:04.686738 master-0 kubenswrapper[29612]: I0319 12:24:04.686356 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnfqz\" (UniqueName: \"kubernetes.io/projected/903e6a5b-d8ec-4a23-bc47-8428e998422f-kube-api-access-pnfqz\") pod \"heat-operator-controller-manager-67dd5f86f5-txwhd\" (UID: \"903e6a5b-d8ec-4a23-bc47-8428e998422f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd"
Mar 19 12:24:04.702152 master-0 kubenswrapper[29612]: I0319 12:24:04.702089 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv"]
Mar 19 12:24:04.703535 master-0 kubenswrapper[29612]: I0319 12:24:04.703252 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv"
Mar 19 12:24:04.709239 master-0 kubenswrapper[29612]: I0319 12:24:04.709171 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv"]
Mar 19 12:24:04.718012 master-0 kubenswrapper[29612]: I0319 12:24:04.717373 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs"]
Mar 19 12:24:04.718680 master-0 kubenswrapper[29612]: I0319 12:24:04.718660 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs"
Mar 19 12:24:04.739836 master-0 kubenswrapper[29612]: I0319 12:24:04.739279 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq"
Mar 19 12:24:04.777771 master-0 kubenswrapper[29612]: I0319 12:24:04.777702 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs"]
Mar 19 12:24:04.778315 master-0 kubenswrapper[29612]: I0319 12:24:04.778271 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx"
Mar 19 12:24:04.788746 master-0 kubenswrapper[29612]: I0319 12:24:04.788701 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlrbz\" (UniqueName: \"kubernetes.io/projected/0f3d31e4-66ef-4e51-a1d3-c2a3528f533c-kube-api-access-rlrbz\") pod \"horizon-operator-controller-manager-8464cc45fb-d6r29\" (UID: \"0f3d31e4-66ef-4e51-a1d3-c2a3528f533c\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29"
Mar 19 12:24:04.789044 master-0 kubenswrapper[29612]: I0319 12:24:04.789020 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnfqz\" (UniqueName: \"kubernetes.io/projected/903e6a5b-d8ec-4a23-bc47-8428e998422f-kube-api-access-pnfqz\") pod \"heat-operator-controller-manager-67dd5f86f5-txwhd\" (UID: \"903e6a5b-d8ec-4a23-bc47-8428e998422f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd"
Mar 19 12:24:04.789505 master-0 kubenswrapper[29612]: I0319 12:24:04.789149 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq"
Mar 19 12:24:04.789856 master-0 kubenswrapper[29612]: I0319 12:24:04.789835 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zht66\" (UniqueName: \"kubernetes.io/projected/7398e347-55cd-481c-9240-27e7e6a8e8ba-kube-api-access-zht66\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq"
Mar 19 12:24:04.789965 master-0 kubenswrapper[29612]: I0319 12:24:04.789929 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52h48\" (UniqueName: \"kubernetes.io/projected/e6648cf7-2ce1-4994-9edd-037386b885e7-kube-api-access-52h48\") pod \"designate-operator-controller-manager-588d4d986b-p8kq8\" (UID: \"e6648cf7-2ce1-4994-9edd-037386b885e7\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8"
Mar 19 12:24:04.796577 master-0 kubenswrapper[29612]: I0319 12:24:04.795221 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8"
Mar 19 12:24:04.809271 master-0 kubenswrapper[29612]: I0319 12:24:04.809224 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8nth\" (UniqueName: \"kubernetes.io/projected/35b6c38e-a5ea-47ad-ac78-dcb138979d96-kube-api-access-h8nth\") pod \"glance-operator-controller-manager-79df6bcc97-b2rhn\" (UID: \"35b6c38e-a5ea-47ad-ac78-dcb138979d96\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn"
Mar 19 12:24:04.820694 master-0 kubenswrapper[29612]: I0319 12:24:04.820624 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnfqz\" (UniqueName: \"kubernetes.io/projected/903e6a5b-d8ec-4a23-bc47-8428e998422f-kube-api-access-pnfqz\") pod \"heat-operator-controller-manager-67dd5f86f5-txwhd\" (UID: \"903e6a5b-d8ec-4a23-bc47-8428e998422f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd"
Mar 19 12:24:04.833452 master-0 kubenswrapper[29612]: I0319 12:24:04.830876 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-6chjf"]
Mar 19 12:24:04.833452 master-0 kubenswrapper[29612]: I0319 12:24:04.832034 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6chjf"
Mar 19 12:24:04.845338 master-0 kubenswrapper[29612]: I0319 12:24:04.845297 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn"
Mar 19 12:24:04.894750 master-0 kubenswrapper[29612]: I0319 12:24:04.892739 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk6pw\" (UniqueName: \"kubernetes.io/projected/8b552910-e16b-4d0f-a672-e102464441ee-kube-api-access-mk6pw\") pod \"keystone-operator-controller-manager-768b96df4c-8b6qs\" (UID: \"8b552910-e16b-4d0f-a672-e102464441ee\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs"
Mar 19 12:24:04.894750 master-0 kubenswrapper[29612]: I0319 12:24:04.892805 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zht66\" (UniqueName: \"kubernetes.io/projected/7398e347-55cd-481c-9240-27e7e6a8e8ba-kube-api-access-zht66\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq"
Mar 19 12:24:04.894750 master-0 kubenswrapper[29612]: I0319 12:24:04.892836 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlrbz\" (UniqueName: \"kubernetes.io/projected/0f3d31e4-66ef-4e51-a1d3-c2a3528f533c-kube-api-access-rlrbz\") pod \"horizon-operator-controller-manager-8464cc45fb-d6r29\" (UID: \"0f3d31e4-66ef-4e51-a1d3-c2a3528f533c\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29"
Mar 19 12:24:04.894750 master-0 kubenswrapper[29612]: I0319 12:24:04.892856 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4flvb\" (UniqueName: \"kubernetes.io/projected/df072e70-d36d-4a5b-b289-7a8f6b3587cf-kube-api-access-4flvb\") pod \"ironic-operator-controller-manager-6f787dddc9-8pjtv\" (UID: \"df072e70-d36d-4a5b-b289-7a8f6b3587cf\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv"
Mar 19 12:24:04.894750 master-0 kubenswrapper[29612]: I0319 12:24:04.892884 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq"
Mar 19 12:24:04.894750 master-0 kubenswrapper[29612]: E0319 12:24:04.893046 29612 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 19 12:24:04.894750 master-0 kubenswrapper[29612]: E0319 12:24:04.893096 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert podName:7398e347-55cd-481c-9240-27e7e6a8e8ba nodeName:}" failed. No retries permitted until 2026-03-19 12:24:05.393080503 +0000 UTC m=+852.707249524 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert") pod "infra-operator-controller-manager-7dd6bb94c9-hvsqq" (UID: "7398e347-55cd-481c-9240-27e7e6a8e8ba") : secret "infra-operator-webhook-server-cert" not found
Mar 19 12:24:04.899761 master-0 kubenswrapper[29612]: I0319 12:24:04.896840 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt"]
Mar 19 12:24:04.899761 master-0 kubenswrapper[29612]: I0319 12:24:04.898044 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt"
Mar 19 12:24:04.959647 master-0 kubenswrapper[29612]: I0319 12:24:04.956191 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zht66\" (UniqueName: \"kubernetes.io/projected/7398e347-55cd-481c-9240-27e7e6a8e8ba-kube-api-access-zht66\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq"
Mar 19 12:24:04.960333 master-0 kubenswrapper[29612]: I0319 12:24:04.960303 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd"
Mar 19 12:24:04.992241 master-0 kubenswrapper[29612]: I0319 12:24:04.992184 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-6chjf"]
Mar 19 12:24:04.994146 master-0 kubenswrapper[29612]: I0319 12:24:04.993519 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlrbz\" (UniqueName: \"kubernetes.io/projected/0f3d31e4-66ef-4e51-a1d3-c2a3528f533c-kube-api-access-rlrbz\") pod \"horizon-operator-controller-manager-8464cc45fb-d6r29\" (UID: \"0f3d31e4-66ef-4e51-a1d3-c2a3528f533c\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29"
Mar 19 12:24:04.994390 master-0 kubenswrapper[29612]: I0319 12:24:04.994351 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk6pw\" (UniqueName: \"kubernetes.io/projected/8b552910-e16b-4d0f-a672-e102464441ee-kube-api-access-mk6pw\") pod \"keystone-operator-controller-manager-768b96df4c-8b6qs\" (UID: \"8b552910-e16b-4d0f-a672-e102464441ee\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs"
Mar 19 12:24:04.994473 master-0 kubenswrapper[29612]: I0319 12:24:04.994400 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjql6\" (UniqueName: \"kubernetes.io/projected/d54e5e4f-98ea-4896-848f-108d2f56e796-kube-api-access-hjql6\") pod \"mariadb-operator-controller-manager-67ccfc9778-tm2wt\" (UID: \"d54e5e4f-98ea-4896-848f-108d2f56e796\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt"
Mar 19 12:24:04.994473 master-0 kubenswrapper[29612]: I0319 12:24:04.994437 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdzz\" (UniqueName: \"kubernetes.io/projected/c3727fb0-72dd-4b34-bf14-5a972fe68e65-kube-api-access-4tdzz\") pod \"manila-operator-controller-manager-55f864c847-6chjf\" (UID: \"c3727fb0-72dd-4b34-bf14-5a972fe68e65\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-6chjf"
Mar 19 12:24:04.994473 master-0 kubenswrapper[29612]: I0319 12:24:04.994458 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4flvb\" (UniqueName: \"kubernetes.io/projected/df072e70-d36d-4a5b-b289-7a8f6b3587cf-kube-api-access-4flvb\") pod \"ironic-operator-controller-manager-6f787dddc9-8pjtv\" (UID: \"df072e70-d36d-4a5b-b289-7a8f6b3587cf\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv"
Mar 19 12:24:05.010495 master-0 kubenswrapper[29612]: I0319 12:24:05.010445 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29"
Mar 19 12:24:05.037925 master-0 kubenswrapper[29612]: I0319 12:24:05.037159 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk6pw\" (UniqueName: \"kubernetes.io/projected/8b552910-e16b-4d0f-a672-e102464441ee-kube-api-access-mk6pw\") pod \"keystone-operator-controller-manager-768b96df4c-8b6qs\" (UID: \"8b552910-e16b-4d0f-a672-e102464441ee\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs"
Mar 19 12:24:05.067395 master-0 kubenswrapper[29612]: I0319 12:24:05.065922 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4flvb\" (UniqueName: \"kubernetes.io/projected/df072e70-d36d-4a5b-b289-7a8f6b3587cf-kube-api-access-4flvb\") pod \"ironic-operator-controller-manager-6f787dddc9-8pjtv\" (UID: \"df072e70-d36d-4a5b-b289-7a8f6b3587cf\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv"
Mar 19 12:24:05.099049 master-0 kubenswrapper[29612]: I0319 12:24:05.088880 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt"]
Mar 19 12:24:05.099049 master-0 kubenswrapper[29612]: I0319 12:24:05.095578 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjql6\" (UniqueName: \"kubernetes.io/projected/d54e5e4f-98ea-4896-848f-108d2f56e796-kube-api-access-hjql6\") pod \"mariadb-operator-controller-manager-67ccfc9778-tm2wt\" (UID: \"d54e5e4f-98ea-4896-848f-108d2f56e796\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt"
Mar 19 12:24:05.099049 master-0 kubenswrapper[29612]: I0319 12:24:05.095657 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdzz\" (UniqueName: \"kubernetes.io/projected/c3727fb0-72dd-4b34-bf14-5a972fe68e65-kube-api-access-4tdzz\") pod \"manila-operator-controller-manager-55f864c847-6chjf\" (UID: \"c3727fb0-72dd-4b34-bf14-5a972fe68e65\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-6chjf"
Mar 19 12:24:05.102410 master-0 kubenswrapper[29612]: I0319 12:24:05.101795 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-k45mg"]
Mar 19 12:24:05.105589 master-0 kubenswrapper[29612]: I0319 12:24:05.105388 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k45mg"
Mar 19 12:24:05.167599 master-0 kubenswrapper[29612]: I0319 12:24:05.167388 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdzz\" (UniqueName: \"kubernetes.io/projected/c3727fb0-72dd-4b34-bf14-5a972fe68e65-kube-api-access-4tdzz\") pod \"manila-operator-controller-manager-55f864c847-6chjf\" (UID: \"c3727fb0-72dd-4b34-bf14-5a972fe68e65\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-6chjf"
Mar 19 12:24:05.171672 master-0 kubenswrapper[29612]: I0319 12:24:05.170512 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs"
Mar 19 12:24:05.173490 master-0 kubenswrapper[29612]: I0319 12:24:05.172736 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjql6\" (UniqueName: \"kubernetes.io/projected/d54e5e4f-98ea-4896-848f-108d2f56e796-kube-api-access-hjql6\") pod \"mariadb-operator-controller-manager-67ccfc9778-tm2wt\" (UID: \"d54e5e4f-98ea-4896-848f-108d2f56e796\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt"
Mar 19 12:24:05.186033 master-0 kubenswrapper[29612]: I0319 12:24:05.185738 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-k45mg"]
Mar 19 12:24:05.239687 master-0 kubenswrapper[29612]: I0319 12:24:05.228773 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pklp\" (UniqueName: \"kubernetes.io/projected/52482a1e-f47f-4516-800f-b3b839699cb8-kube-api-access-4pklp\") pod \"neutron-operator-controller-manager-767865f676-k45mg\" (UID: \"52482a1e-f47f-4516-800f-b3b839699cb8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-k45mg"
Mar 19 12:24:05.267895 master-0 kubenswrapper[29612]: I0319 12:24:05.264891 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg"]
Mar 19 12:24:05.267895 master-0 kubenswrapper[29612]: I0319 12:24:05.266176 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg"
Mar 19 12:24:05.280932 master-0 kubenswrapper[29612]: I0319 12:24:05.279245 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6chjf"
Mar 19 12:24:05.295788 master-0 kubenswrapper[29612]: I0319 12:24:05.295525 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg"]
Mar 19 12:24:05.312670 master-0 kubenswrapper[29612]: I0319 12:24:05.307788 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7"]
Mar 19 12:24:05.312670 master-0 kubenswrapper[29612]: I0319 12:24:05.308970 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7"
Mar 19 12:24:05.324884 master-0 kubenswrapper[29612]: I0319 12:24:05.323814 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt"
Mar 19 12:24:05.340190 master-0 kubenswrapper[29612]: I0319 12:24:05.336690 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pklp\" (UniqueName: \"kubernetes.io/projected/52482a1e-f47f-4516-800f-b3b839699cb8-kube-api-access-4pklp\") pod \"neutron-operator-controller-manager-767865f676-k45mg\" (UID: \"52482a1e-f47f-4516-800f-b3b839699cb8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-k45mg"
Mar 19 12:24:05.340190 master-0 kubenswrapper[29612]: I0319 12:24:05.337116 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv"
Mar 19 12:24:05.340190 master-0 kubenswrapper[29612]: I0319 12:24:05.340144 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7"]
Mar 19 12:24:05.354427 master-0 kubenswrapper[29612]: I0319 12:24:05.353046 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2"]
Mar 19 12:24:05.360687 master-0 kubenswrapper[29612]: I0319 12:24:05.354650 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2"
Mar 19 12:24:05.384379 master-0 kubenswrapper[29612]: I0319 12:24:05.382459 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 19 12:24:05.390818 master-0 kubenswrapper[29612]: I0319 12:24:05.390374 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2"]
Mar 19 12:24:05.394282 master-0 kubenswrapper[29612]: I0319 12:24:05.394189 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pklp\" (UniqueName: \"kubernetes.io/projected/52482a1e-f47f-4516-800f-b3b839699cb8-kube-api-access-4pklp\") pod \"neutron-operator-controller-manager-767865f676-k45mg\" (UID: \"52482a1e-f47f-4516-800f-b3b839699cb8\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-k45mg"
Mar 19 12:24:05.408062 master-0 kubenswrapper[29612]: I0319 12:24:05.405455 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-csxnc"]
Mar 19 12:24:05.408062 master-0 kubenswrapper[29612]: I0319 12:24:05.407228 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-csxnc"
Mar 19 12:24:05.431817 master-0 kubenswrapper[29612]: I0319 12:24:05.431750 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4564q"]
Mar 19 12:24:05.440682 master-0 kubenswrapper[29612]: I0319 12:24:05.432953 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4564q"
Mar 19 12:24:05.440682 master-0 kubenswrapper[29612]: I0319 12:24:05.439217 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-csxnc"]
Mar 19 12:24:05.440682 master-0 kubenswrapper[29612]: I0319 12:24:05.440405 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfgqz\" (UniqueName: \"kubernetes.io/projected/52c88b85-7e59-4c34-aaa6-47cbbb966eea-kube-api-access-kfgqz\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2"
Mar 19 12:24:05.440682 master-0 kubenswrapper[29612]: I0319 12:24:05.440495 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg9qn\" (UniqueName: \"kubernetes.io/projected/7b6191f9-f638-49fc-9336-21a795a51dd8-kube-api-access-tg9qn\") pod \"nova-operator-controller-manager-5d488d59fb-rsrkg\" (UID: \"7b6191f9-f638-49fc-9336-21a795a51dd8\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg"
Mar 19 12:24:05.440682 master-0 kubenswrapper[29612]: I0319 12:24:05.440524 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4tzv\" (UniqueName: \"kubernetes.io/projected/28006f2f-ee6b-46fe-a29e-7d9f140a2d1c-kube-api-access-m4tzv\") pod \"octavia-operator-controller-manager-5b9f45d989-4f7d7\" (UID: \"28006f2f-ee6b-46fe-a29e-7d9f140a2d1c\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7"
Mar 19 12:24:05.440682 master-0 kubenswrapper[29612]: I0319 12:24:05.440589 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2"
Mar 19 12:24:05.441515 master-0 kubenswrapper[29612]: I0319 12:24:05.440743 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq"
Mar 19 12:24:05.441515 master-0 kubenswrapper[29612]: E0319 12:24:05.441219 29612 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 19 12:24:05.441515 master-0 kubenswrapper[29612]: E0319 12:24:05.441263 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert podName:7398e347-55cd-481c-9240-27e7e6a8e8ba nodeName:}" failed. No retries permitted until 2026-03-19 12:24:06.441249681 +0000 UTC m=+853.755418702 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert") pod "infra-operator-controller-manager-7dd6bb94c9-hvsqq" (UID: "7398e347-55cd-481c-9240-27e7e6a8e8ba") : secret "infra-operator-webhook-server-cert" not found Mar 19 12:24:05.449870 master-0 kubenswrapper[29612]: I0319 12:24:05.449736 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4564q"] Mar 19 12:24:05.466383 master-0 kubenswrapper[29612]: I0319 12:24:05.465891 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-8llrg"] Mar 19 12:24:05.467351 master-0 kubenswrapper[29612]: I0319 12:24:05.467311 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-8llrg" Mar 19 12:24:05.486165 master-0 kubenswrapper[29612]: I0319 12:24:05.484530 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-8llrg"] Mar 19 12:24:05.532679 master-0 kubenswrapper[29612]: I0319 12:24:05.528589 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k45mg" Mar 19 12:24:05.544673 master-0 kubenswrapper[29612]: I0319 12:24:05.543821 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44"] Mar 19 12:24:05.544859 master-0 kubenswrapper[29612]: I0319 12:24:05.544769 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt5zz\" (UniqueName: \"kubernetes.io/projected/85abdea2-0fc7-4cc3-a74f-0282cdc5d5d2-kube-api-access-zt5zz\") pod \"ovn-operator-controller-manager-884679f54-csxnc\" (UID: \"85abdea2-0fc7-4cc3-a74f-0282cdc5d5d2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-csxnc" Mar 19 12:24:05.545562 master-0 kubenswrapper[29612]: I0319 12:24:05.545334 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" Mar 19 12:24:05.548665 master-0 kubenswrapper[29612]: I0319 12:24:05.546122 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfgqz\" (UniqueName: \"kubernetes.io/projected/52c88b85-7e59-4c34-aaa6-47cbbb966eea-kube-api-access-kfgqz\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:05.548665 master-0 kubenswrapper[29612]: I0319 12:24:05.546313 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg9qn\" (UniqueName: \"kubernetes.io/projected/7b6191f9-f638-49fc-9336-21a795a51dd8-kube-api-access-tg9qn\") pod \"nova-operator-controller-manager-5d488d59fb-rsrkg\" (UID: \"7b6191f9-f638-49fc-9336-21a795a51dd8\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg" Mar 
19 12:24:05.548665 master-0 kubenswrapper[29612]: I0319 12:24:05.546347 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4tzv\" (UniqueName: \"kubernetes.io/projected/28006f2f-ee6b-46fe-a29e-7d9f140a2d1c-kube-api-access-m4tzv\") pod \"octavia-operator-controller-manager-5b9f45d989-4f7d7\" (UID: \"28006f2f-ee6b-46fe-a29e-7d9f140a2d1c\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7" Mar 19 12:24:05.548665 master-0 kubenswrapper[29612]: I0319 12:24:05.546376 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:05.548665 master-0 kubenswrapper[29612]: I0319 12:24:05.546406 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbnpk\" (UniqueName: \"kubernetes.io/projected/9e3c1457-2f62-4ea9-8782-2cd197683d38-kube-api-access-kbnpk\") pod \"placement-operator-controller-manager-5784578c99-4564q\" (UID: \"9e3c1457-2f62-4ea9-8782-2cd197683d38\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4564q" Mar 19 12:24:05.548665 master-0 kubenswrapper[29612]: E0319 12:24:05.546894 29612 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:05.548665 master-0 kubenswrapper[29612]: E0319 12:24:05.546953 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert podName:52c88b85-7e59-4c34-aaa6-47cbbb966eea nodeName:}" failed. 
No retries permitted until 2026-03-19 12:24:06.046935864 +0000 UTC m=+853.361104885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mz9v2" (UID: "52c88b85-7e59-4c34-aaa6-47cbbb966eea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:05.555441 master-0 kubenswrapper[29612]: I0319 12:24:05.552475 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44"] Mar 19 12:24:05.577311 master-0 kubenswrapper[29612]: I0319 12:24:05.577002 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p"] Mar 19 12:24:05.578715 master-0 kubenswrapper[29612]: I0319 12:24:05.578169 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p" Mar 19 12:24:05.593660 master-0 kubenswrapper[29612]: I0319 12:24:05.593286 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p"] Mar 19 12:24:05.598899 master-0 kubenswrapper[29612]: I0319 12:24:05.598399 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4tzv\" (UniqueName: \"kubernetes.io/projected/28006f2f-ee6b-46fe-a29e-7d9f140a2d1c-kube-api-access-m4tzv\") pod \"octavia-operator-controller-manager-5b9f45d989-4f7d7\" (UID: \"28006f2f-ee6b-46fe-a29e-7d9f140a2d1c\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7" Mar 19 12:24:05.606022 master-0 kubenswrapper[29612]: I0319 12:24:05.603152 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j"] Mar 19 12:24:05.606022 master-0 
kubenswrapper[29612]: I0319 12:24:05.604799 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" Mar 19 12:24:05.611757 master-0 kubenswrapper[29612]: I0319 12:24:05.611402 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfgqz\" (UniqueName: \"kubernetes.io/projected/52c88b85-7e59-4c34-aaa6-47cbbb966eea-kube-api-access-kfgqz\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:05.612093 master-0 kubenswrapper[29612]: I0319 12:24:05.612066 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg9qn\" (UniqueName: \"kubernetes.io/projected/7b6191f9-f638-49fc-9336-21a795a51dd8-kube-api-access-tg9qn\") pod \"nova-operator-controller-manager-5d488d59fb-rsrkg\" (UID: \"7b6191f9-f638-49fc-9336-21a795a51dd8\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg" Mar 19 12:24:05.614572 master-0 kubenswrapper[29612]: I0319 12:24:05.614507 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j"] Mar 19 12:24:05.633445 master-0 kubenswrapper[29612]: I0319 12:24:05.632815 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm"] Mar 19 12:24:05.634501 master-0 kubenswrapper[29612]: I0319 12:24:05.634482 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:05.637191 master-0 kubenswrapper[29612]: I0319 12:24:05.637166 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 19 12:24:05.638080 master-0 kubenswrapper[29612]: I0319 12:24:05.637406 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 19 12:24:05.650328 master-0 kubenswrapper[29612]: I0319 12:24:05.650164 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt5zz\" (UniqueName: \"kubernetes.io/projected/85abdea2-0fc7-4cc3-a74f-0282cdc5d5d2-kube-api-access-zt5zz\") pod \"ovn-operator-controller-manager-884679f54-csxnc\" (UID: \"85abdea2-0fc7-4cc3-a74f-0282cdc5d5d2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-csxnc" Mar 19 12:24:05.650328 master-0 kubenswrapper[29612]: I0319 12:24:05.650268 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcs49\" (UniqueName: \"kubernetes.io/projected/aaef999b-7d65-4ae6-9009-0006983f05b9-kube-api-access-jcs49\") pod \"telemetry-operator-controller-manager-d6b694c5-5gc44\" (UID: \"aaef999b-7d65-4ae6-9009-0006983f05b9\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" Mar 19 12:24:05.650328 master-0 kubenswrapper[29612]: I0319 12:24:05.650305 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jlqn\" (UniqueName: \"kubernetes.io/projected/a63e044e-9f48-42a5-8bec-91ba8d69a724-kube-api-access-7jlqn\") pod \"swift-operator-controller-manager-c674c5965-8llrg\" (UID: \"a63e044e-9f48-42a5-8bec-91ba8d69a724\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-8llrg" Mar 19 12:24:05.650475 master-0 kubenswrapper[29612]: I0319 
12:24:05.650362 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbnpk\" (UniqueName: \"kubernetes.io/projected/9e3c1457-2f62-4ea9-8782-2cd197683d38-kube-api-access-kbnpk\") pod \"placement-operator-controller-manager-5784578c99-4564q\" (UID: \"9e3c1457-2f62-4ea9-8782-2cd197683d38\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4564q" Mar 19 12:24:05.658823 master-0 kubenswrapper[29612]: I0319 12:24:05.658320 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm"] Mar 19 12:24:05.691956 master-0 kubenswrapper[29612]: I0319 12:24:05.691874 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg" Mar 19 12:24:05.717544 master-0 kubenswrapper[29612]: I0319 12:24:05.715015 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt5zz\" (UniqueName: \"kubernetes.io/projected/85abdea2-0fc7-4cc3-a74f-0282cdc5d5d2-kube-api-access-zt5zz\") pod \"ovn-operator-controller-manager-884679f54-csxnc\" (UID: \"85abdea2-0fc7-4cc3-a74f-0282cdc5d5d2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-csxnc" Mar 19 12:24:05.718534 master-0 kubenswrapper[29612]: I0319 12:24:05.718494 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbnpk\" (UniqueName: \"kubernetes.io/projected/9e3c1457-2f62-4ea9-8782-2cd197683d38-kube-api-access-kbnpk\") pod \"placement-operator-controller-manager-5784578c99-4564q\" (UID: \"9e3c1457-2f62-4ea9-8782-2cd197683d38\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-4564q" Mar 19 12:24:05.740191 master-0 kubenswrapper[29612]: I0319 12:24:05.738257 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm"] 
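[Editor's note] The repeated `MountVolume.SetUp failed ... secret "..." not found` errors in this stretch are expected while whichever controller owns these webhook/metrics cert Secrets has not yet created them; the kubelet simply retries each failing volume with exponential backoff, which is what the `No retries permitted until ... (durationBeforeRetry 500ms)` / `(durationBeforeRetry 1s)` messages record. A minimal sketch of that doubling policy, assuming the 500ms initial delay visible in the log and a 2m2s cap taken from kubelet's exponentialbackoff package (the cap is an assumption, not shown in this log):

```python
# Sketch of the retry backoff behind the "No retries permitted until ..."
# errors above: after each MountVolume.SetUp failure the kubelet doubles
# durationBeforeRetry for that volume (the log shows 500ms, then 1s).
# MAX_DELAY_S is an assumed cap (2m2s), not taken from this log.

INITIAL_DELAY_S = 0.5   # first retry after 500ms, as seen in the log
MAX_DELAY_S = 122.0     # assumed cap of 2m2s

def next_delay(previous_delay_s=None):
    """Delay before the next retry of a failing volume operation."""
    if previous_delay_s is None:
        return INITIAL_DELAY_S          # first failure: wait 500ms
    return min(previous_delay_s * 2.0, MAX_DELAY_S)
```

This matches the progression visible for the `cert` and `webhook-certs` volumes: a 500ms wait after the first failure at 12:24:05, then a 1s wait on the next attempt at 12:24:06.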
Mar 19 12:24:05.740191 master-0 kubenswrapper[29612]: I0319 12:24:05.739384 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm" Mar 19 12:24:05.745502 master-0 kubenswrapper[29612]: I0319 12:24:05.745092 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7" Mar 19 12:24:05.751725 master-0 kubenswrapper[29612]: I0319 12:24:05.751661 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcs49\" (UniqueName: \"kubernetes.io/projected/aaef999b-7d65-4ae6-9009-0006983f05b9-kube-api-access-jcs49\") pod \"telemetry-operator-controller-manager-d6b694c5-5gc44\" (UID: \"aaef999b-7d65-4ae6-9009-0006983f05b9\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" Mar 19 12:24:05.752137 master-0 kubenswrapper[29612]: I0319 12:24:05.752085 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jlqn\" (UniqueName: \"kubernetes.io/projected/a63e044e-9f48-42a5-8bec-91ba8d69a724-kube-api-access-7jlqn\") pod \"swift-operator-controller-manager-c674c5965-8llrg\" (UID: \"a63e044e-9f48-42a5-8bec-91ba8d69a724\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-8llrg" Mar 19 12:24:05.752358 master-0 kubenswrapper[29612]: I0319 12:24:05.752139 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:05.752358 master-0 kubenswrapper[29612]: I0319 12:24:05.752205 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sqwf\" (UniqueName: \"kubernetes.io/projected/f95c7d97-fc0c-47a4-aeab-32258cb1ec8e-kube-api-access-4sqwf\") pod \"test-operator-controller-manager-5c5cb9c4d7-r6x2p\" (UID: \"f95c7d97-fc0c-47a4-aeab-32258cb1ec8e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p" Mar 19 12:24:05.752358 master-0 kubenswrapper[29612]: I0319 12:24:05.752299 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:05.752481 master-0 kubenswrapper[29612]: I0319 12:24:05.752366 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdm8t\" (UniqueName: \"kubernetes.io/projected/c2716611-c098-41f2-8dd3-72e2bbac6dc9-kube-api-access-kdm8t\") pod \"watcher-operator-controller-manager-6c4d75f7f9-qgb4j\" (UID: \"c2716611-c098-41f2-8dd3-72e2bbac6dc9\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" Mar 19 12:24:05.752481 master-0 kubenswrapper[29612]: I0319 12:24:05.752411 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxx6r\" (UniqueName: \"kubernetes.io/projected/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-kube-api-access-hxx6r\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:05.756937 master-0 kubenswrapper[29612]: I0319 12:24:05.755731 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm"] Mar 19 12:24:05.777521 master-0 kubenswrapper[29612]: I0319 12:24:05.774331 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcs49\" (UniqueName: \"kubernetes.io/projected/aaef999b-7d65-4ae6-9009-0006983f05b9-kube-api-access-jcs49\") pod \"telemetry-operator-controller-manager-d6b694c5-5gc44\" (UID: \"aaef999b-7d65-4ae6-9009-0006983f05b9\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" Mar 19 12:24:05.783061 master-0 kubenswrapper[29612]: I0319 12:24:05.782998 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4564q" Mar 19 12:24:05.799117 master-0 kubenswrapper[29612]: I0319 12:24:05.797734 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-csxnc" Mar 19 12:24:05.799117 master-0 kubenswrapper[29612]: I0319 12:24:05.798805 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx" event={"ID":"f77e9b9d-e1fe-411c-a005-be21205ebc29","Type":"ContainerStarted","Data":"6c535834cce7d1d6f49b6a6e51e7113a4098e0f7ee95b8798c8aad77917b66fb"} Mar 19 12:24:05.808150 master-0 kubenswrapper[29612]: I0319 12:24:05.806811 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jlqn\" (UniqueName: \"kubernetes.io/projected/a63e044e-9f48-42a5-8bec-91ba8d69a724-kube-api-access-7jlqn\") pod \"swift-operator-controller-manager-c674c5965-8llrg\" (UID: \"a63e044e-9f48-42a5-8bec-91ba8d69a724\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-8llrg" Mar 19 12:24:05.837531 master-0 kubenswrapper[29612]: I0319 12:24:05.837478 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx"] Mar 19 12:24:05.859980 master-0 kubenswrapper[29612]: I0319 12:24:05.855820 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sqwf\" (UniqueName: \"kubernetes.io/projected/f95c7d97-fc0c-47a4-aeab-32258cb1ec8e-kube-api-access-4sqwf\") pod \"test-operator-controller-manager-5c5cb9c4d7-r6x2p\" (UID: \"f95c7d97-fc0c-47a4-aeab-32258cb1ec8e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p" Mar 19 12:24:05.859980 master-0 kubenswrapper[29612]: I0319 12:24:05.855992 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccrc4\" (UniqueName: \"kubernetes.io/projected/6e25fc95-0475-4395-952b-59db3cd91ad7-kube-api-access-ccrc4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jhqlm\" (UID: \"6e25fc95-0475-4395-952b-59db3cd91ad7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm" Mar 19 12:24:05.859980 master-0 kubenswrapper[29612]: I0319 12:24:05.856027 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:05.859980 master-0 kubenswrapper[29612]: I0319 12:24:05.856081 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdm8t\" (UniqueName: \"kubernetes.io/projected/c2716611-c098-41f2-8dd3-72e2bbac6dc9-kube-api-access-kdm8t\") pod \"watcher-operator-controller-manager-6c4d75f7f9-qgb4j\" (UID: \"c2716611-c098-41f2-8dd3-72e2bbac6dc9\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" Mar 19 12:24:05.859980 
master-0 kubenswrapper[29612]: E0319 12:24:05.856440 29612 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 12:24:05.859980 master-0 kubenswrapper[29612]: E0319 12:24:05.856508 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. No retries permitted until 2026-03-19 12:24:06.35649031 +0000 UTC m=+853.670659331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "webhook-server-cert" not found Mar 19 12:24:05.859980 master-0 kubenswrapper[29612]: I0319 12:24:05.856505 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxx6r\" (UniqueName: \"kubernetes.io/projected/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-kube-api-access-hxx6r\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:05.859980 master-0 kubenswrapper[29612]: I0319 12:24:05.856912 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:05.859980 master-0 kubenswrapper[29612]: E0319 12:24:05.857049 29612 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 
19 12:24:05.859980 master-0 kubenswrapper[29612]: E0319 12:24:05.857271 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. No retries permitted until 2026-03-19 12:24:06.357089857 +0000 UTC m=+853.671258888 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "metrics-server-cert" not found Mar 19 12:24:05.887236 master-0 kubenswrapper[29612]: I0319 12:24:05.875702 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" Mar 19 12:24:05.892181 master-0 kubenswrapper[29612]: I0319 12:24:05.892121 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdm8t\" (UniqueName: \"kubernetes.io/projected/c2716611-c098-41f2-8dd3-72e2bbac6dc9-kube-api-access-kdm8t\") pod \"watcher-operator-controller-manager-6c4d75f7f9-qgb4j\" (UID: \"c2716611-c098-41f2-8dd3-72e2bbac6dc9\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" Mar 19 12:24:05.894592 master-0 kubenswrapper[29612]: I0319 12:24:05.894503 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxx6r\" (UniqueName: \"kubernetes.io/projected/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-kube-api-access-hxx6r\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:05.910684 master-0 kubenswrapper[29612]: I0319 12:24:05.907300 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-4sqwf\" (UniqueName: \"kubernetes.io/projected/f95c7d97-fc0c-47a4-aeab-32258cb1ec8e-kube-api-access-4sqwf\") pod \"test-operator-controller-manager-5c5cb9c4d7-r6x2p\" (UID: \"f95c7d97-fc0c-47a4-aeab-32258cb1ec8e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p" Mar 19 12:24:05.987724 master-0 kubenswrapper[29612]: I0319 12:24:05.978955 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccrc4\" (UniqueName: \"kubernetes.io/projected/6e25fc95-0475-4395-952b-59db3cd91ad7-kube-api-access-ccrc4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jhqlm\" (UID: \"6e25fc95-0475-4395-952b-59db3cd91ad7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm" Mar 19 12:24:06.008682 master-0 kubenswrapper[29612]: I0319 12:24:06.006805 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p" Mar 19 12:24:06.008682 master-0 kubenswrapper[29612]: I0319 12:24:06.007251 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccrc4\" (UniqueName: \"kubernetes.io/projected/6e25fc95-0475-4395-952b-59db3cd91ad7-kube-api-access-ccrc4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-jhqlm\" (UID: \"6e25fc95-0475-4395-952b-59db3cd91ad7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm" Mar 19 12:24:06.089202 master-0 kubenswrapper[29612]: I0319 12:24:06.088736 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:06.092359 master-0 kubenswrapper[29612]: 
E0319 12:24:06.092290 29612 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:06.092507 master-0 kubenswrapper[29612]: E0319 12:24:06.092379 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert podName:52c88b85-7e59-4c34-aaa6-47cbbb966eea nodeName:}" failed. No retries permitted until 2026-03-19 12:24:07.092357273 +0000 UTC m=+854.406526294 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mz9v2" (UID: "52c88b85-7e59-4c34-aaa6-47cbbb966eea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:06.099801 master-0 kubenswrapper[29612]: I0319 12:24:06.099763 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-8llrg" Mar 19 12:24:06.165513 master-0 kubenswrapper[29612]: I0319 12:24:06.162543 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" Mar 19 12:24:06.165513 master-0 kubenswrapper[29612]: I0319 12:24:06.163110 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs"] Mar 19 12:24:06.197482 master-0 kubenswrapper[29612]: I0319 12:24:06.197062 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm" Mar 19 12:24:06.236146 master-0 kubenswrapper[29612]: I0319 12:24:06.230783 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8"] Mar 19 12:24:06.286829 master-0 kubenswrapper[29612]: I0319 12:24:06.285830 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq"] Mar 19 12:24:06.421456 master-0 kubenswrapper[29612]: I0319 12:24:06.392852 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:06.421456 master-0 kubenswrapper[29612]: I0319 12:24:06.392963 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:06.421456 master-0 kubenswrapper[29612]: E0319 12:24:06.395034 29612 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 12:24:06.421456 master-0 kubenswrapper[29612]: E0319 12:24:06.395119 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. 
No retries permitted until 2026-03-19 12:24:07.395101106 +0000 UTC m=+854.709270187 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "metrics-server-cert" not found Mar 19 12:24:06.421456 master-0 kubenswrapper[29612]: E0319 12:24:06.393130 29612 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 12:24:06.421456 master-0 kubenswrapper[29612]: E0319 12:24:06.403048 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. No retries permitted until 2026-03-19 12:24:07.402651261 +0000 UTC m=+854.716820282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "webhook-server-cert" not found Mar 19 12:24:06.496065 master-0 kubenswrapper[29612]: I0319 12:24:06.495891 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" Mar 19 12:24:06.496272 master-0 kubenswrapper[29612]: E0319 12:24:06.496114 29612 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 12:24:06.496272 master-0 kubenswrapper[29612]: 
E0319 12:24:06.496202 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert podName:7398e347-55cd-481c-9240-27e7e6a8e8ba nodeName:}" failed. No retries permitted until 2026-03-19 12:24:08.496183739 +0000 UTC m=+855.810352760 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert") pod "infra-operator-controller-manager-7dd6bb94c9-hvsqq" (UID: "7398e347-55cd-481c-9240-27e7e6a8e8ba") : secret "infra-operator-webhook-server-cert" not found Mar 19 12:24:06.687747 master-0 kubenswrapper[29612]: I0319 12:24:06.684127 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv"] Mar 19 12:24:06.711580 master-0 kubenswrapper[29612]: I0319 12:24:06.711522 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd"] Mar 19 12:24:06.725150 master-0 kubenswrapper[29612]: I0319 12:24:06.725066 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt"] Mar 19 12:24:06.734266 master-0 kubenswrapper[29612]: I0319 12:24:06.734187 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-6chjf"] Mar 19 12:24:06.736053 master-0 kubenswrapper[29612]: W0319 12:24:06.735963 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd54e5e4f_98ea_4896_848f_108d2f56e796.slice/crio-32a3017bee8e89c6a22edd742fdacd3508c02d0f03138c600071d345e21e11fa WatchSource:0}: Error finding container 32a3017bee8e89c6a22edd742fdacd3508c02d0f03138c600071d345e21e11fa: Status 404 returned error can't find the container with id 
32a3017bee8e89c6a22edd742fdacd3508c02d0f03138c600071d345e21e11fa Mar 19 12:24:06.740692 master-0 kubenswrapper[29612]: W0319 12:24:06.740617 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod903e6a5b_d8ec_4a23_bc47_8428e998422f.slice/crio-e5c3750675566327911273b68ca4bd0b19e5020d5b31e9e15608376f1eeb7f62 WatchSource:0}: Error finding container e5c3750675566327911273b68ca4bd0b19e5020d5b31e9e15608376f1eeb7f62: Status 404 returned error can't find the container with id e5c3750675566327911273b68ca4bd0b19e5020d5b31e9e15608376f1eeb7f62 Mar 19 12:24:06.742954 master-0 kubenswrapper[29612]: I0319 12:24:06.742923 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29"] Mar 19 12:24:06.757403 master-0 kubenswrapper[29612]: W0319 12:24:06.757361 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f3d31e4_66ef_4e51_a1d3_c2a3528f533c.slice/crio-7db18a2bbee73165000fdd00aaafc50e62d24b9c94c2ca6d000bb8270e561e7b WatchSource:0}: Error finding container 7db18a2bbee73165000fdd00aaafc50e62d24b9c94c2ca6d000bb8270e561e7b: Status 404 returned error can't find the container with id 7db18a2bbee73165000fdd00aaafc50e62d24b9c94c2ca6d000bb8270e561e7b Mar 19 12:24:06.758754 master-0 kubenswrapper[29612]: W0319 12:24:06.758676 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35b6c38e_a5ea_47ad_ac78_dcb138979d96.slice/crio-929aedfbfe742b9f4bd20b6d38105c88e2f0168827bef458f9f990a55426425b WatchSource:0}: Error finding container 929aedfbfe742b9f4bd20b6d38105c88e2f0168827bef458f9f990a55426425b: Status 404 returned error can't find the container with id 929aedfbfe742b9f4bd20b6d38105c88e2f0168827bef458f9f990a55426425b Mar 19 12:24:06.759351 master-0 kubenswrapper[29612]: I0319 
12:24:06.759320 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn"] Mar 19 12:24:06.762142 master-0 kubenswrapper[29612]: W0319 12:24:06.761017 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3727fb0_72dd_4b34_bf14_5a972fe68e65.slice/crio-2e5a5da251a60fc6995eb47fcf09ba2f8b515eaff7d33e4e756aa12696c0be51 WatchSource:0}: Error finding container 2e5a5da251a60fc6995eb47fcf09ba2f8b515eaff7d33e4e756aa12696c0be51: Status 404 returned error can't find the container with id 2e5a5da251a60fc6995eb47fcf09ba2f8b515eaff7d33e4e756aa12696c0be51 Mar 19 12:24:06.814693 master-0 kubenswrapper[29612]: I0319 12:24:06.814487 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv" event={"ID":"df072e70-d36d-4a5b-b289-7a8f6b3587cf","Type":"ContainerStarted","Data":"9752160aa9adae21fa9cbc509b6b159585a3f63940f65cf92507491063a1318d"} Mar 19 12:24:06.816700 master-0 kubenswrapper[29612]: I0319 12:24:06.816676 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq" event={"ID":"fa2bb2e9-5b88-4c4d-8abe-2837c40e5880","Type":"ContainerStarted","Data":"4d52ddb231c21481cada424d35d6bc206be3396f4fc70ea9a195a90f34339a80"} Mar 19 12:24:06.821288 master-0 kubenswrapper[29612]: I0319 12:24:06.821215 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt" event={"ID":"d54e5e4f-98ea-4896-848f-108d2f56e796","Type":"ContainerStarted","Data":"32a3017bee8e89c6a22edd742fdacd3508c02d0f03138c600071d345e21e11fa"} Mar 19 12:24:06.822691 master-0 kubenswrapper[29612]: I0319 12:24:06.822650 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs" event={"ID":"8b552910-e16b-4d0f-a672-e102464441ee","Type":"ContainerStarted","Data":"fd74f34b5d889fb2a165b8de3d4f3b8348626422754bb8e21fab07e3e6708d72"} Mar 19 12:24:06.824519 master-0 kubenswrapper[29612]: I0319 12:24:06.824331 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd" event={"ID":"903e6a5b-d8ec-4a23-bc47-8428e998422f","Type":"ContainerStarted","Data":"e5c3750675566327911273b68ca4bd0b19e5020d5b31e9e15608376f1eeb7f62"} Mar 19 12:24:06.826194 master-0 kubenswrapper[29612]: I0319 12:24:06.826152 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6chjf" event={"ID":"c3727fb0-72dd-4b34-bf14-5a972fe68e65","Type":"ContainerStarted","Data":"2e5a5da251a60fc6995eb47fcf09ba2f8b515eaff7d33e4e756aa12696c0be51"} Mar 19 12:24:06.842522 master-0 kubenswrapper[29612]: I0319 12:24:06.841687 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8" event={"ID":"e6648cf7-2ce1-4994-9edd-037386b885e7","Type":"ContainerStarted","Data":"6defec43026a11de35ebdd3a6513020ede7cdc21621176cad2a9ac64ece24a78"} Mar 19 12:24:06.851804 master-0 kubenswrapper[29612]: I0319 12:24:06.851741 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn" event={"ID":"35b6c38e-a5ea-47ad-ac78-dcb138979d96","Type":"ContainerStarted","Data":"929aedfbfe742b9f4bd20b6d38105c88e2f0168827bef458f9f990a55426425b"} Mar 19 12:24:06.852951 master-0 kubenswrapper[29612]: I0319 12:24:06.852926 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29" 
event={"ID":"0f3d31e4-66ef-4e51-a1d3-c2a3528f533c","Type":"ContainerStarted","Data":"7db18a2bbee73165000fdd00aaafc50e62d24b9c94c2ca6d000bb8270e561e7b"} Mar 19 12:24:06.958997 master-0 kubenswrapper[29612]: W0319 12:24:06.956356 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52482a1e_f47f_4516_800f_b3b839699cb8.slice/crio-89ee3030b469ed27b2deba24811400880c78d2bae32210071c3ede37004b5965 WatchSource:0}: Error finding container 89ee3030b469ed27b2deba24811400880c78d2bae32210071c3ede37004b5965: Status 404 returned error can't find the container with id 89ee3030b469ed27b2deba24811400880c78d2bae32210071c3ede37004b5965 Mar 19 12:24:06.973366 master-0 kubenswrapper[29612]: I0319 12:24:06.973264 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg"] Mar 19 12:24:06.985886 master-0 kubenswrapper[29612]: I0319 12:24:06.985824 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-k45mg"] Mar 19 12:24:07.135090 master-0 kubenswrapper[29612]: I0319 12:24:07.114551 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:07.135090 master-0 kubenswrapper[29612]: E0319 12:24:07.114808 29612 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:07.135090 master-0 kubenswrapper[29612]: E0319 12:24:07.114861 29612 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert podName:52c88b85-7e59-4c34-aaa6-47cbbb966eea nodeName:}" failed. No retries permitted until 2026-03-19 12:24:09.114846238 +0000 UTC m=+856.429015259 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mz9v2" (UID: "52c88b85-7e59-4c34-aaa6-47cbbb966eea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:07.420087 master-0 kubenswrapper[29612]: I0319 12:24:07.420034 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:07.420343 master-0 kubenswrapper[29612]: I0319 12:24:07.420183 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:07.420343 master-0 kubenswrapper[29612]: E0319 12:24:07.420217 29612 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 12:24:07.420343 master-0 kubenswrapper[29612]: E0319 12:24:07.420297 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. 
No retries permitted until 2026-03-19 12:24:09.420274327 +0000 UTC m=+856.734443398 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "webhook-server-cert" not found Mar 19 12:24:07.420745 master-0 kubenswrapper[29612]: E0319 12:24:07.420388 29612 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 12:24:07.420745 master-0 kubenswrapper[29612]: E0319 12:24:07.420446 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. No retries permitted until 2026-03-19 12:24:09.420432662 +0000 UTC m=+856.734601683 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "metrics-server-cert" not found Mar 19 12:24:07.672479 master-0 kubenswrapper[29612]: I0319 12:24:07.672263 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-4564q"] Mar 19 12:24:07.685185 master-0 kubenswrapper[29612]: I0319 12:24:07.685125 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7"] Mar 19 12:24:07.702678 master-0 kubenswrapper[29612]: W0319 12:24:07.702595 29612 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28006f2f_ee6b_46fe_a29e_7d9f140a2d1c.slice/crio-83233ae1714a6f218030bd31c469cfbbfa8497cb7a71acea8853d72856551a29 WatchSource:0}: Error finding container 83233ae1714a6f218030bd31c469cfbbfa8497cb7a71acea8853d72856551a29: Status 404 returned error can't find the container with id 83233ae1714a6f218030bd31c469cfbbfa8497cb7a71acea8853d72856551a29 Mar 19 12:24:07.711756 master-0 kubenswrapper[29612]: W0319 12:24:07.710378 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda63e044e_9f48_42a5_8bec_91ba8d69a724.slice/crio-0236707695b1e9c49a0d5c965aef3df88f7d91a19f2c941ccaecf218e97e3b16 WatchSource:0}: Error finding container 0236707695b1e9c49a0d5c965aef3df88f7d91a19f2c941ccaecf218e97e3b16: Status 404 returned error can't find the container with id 0236707695b1e9c49a0d5c965aef3df88f7d91a19f2c941ccaecf218e97e3b16 Mar 19 12:24:07.723361 master-0 kubenswrapper[29612]: W0319 12:24:07.722353 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85abdea2_0fc7_4cc3_a74f_0282cdc5d5d2.slice/crio-d3d73c3478ed0cb37482eedac6126dab708eccc76bca70107d2d7b62c45940be WatchSource:0}: Error finding container d3d73c3478ed0cb37482eedac6126dab708eccc76bca70107d2d7b62c45940be: Status 404 returned error can't find the container with id d3d73c3478ed0cb37482eedac6126dab708eccc76bca70107d2d7b62c45940be Mar 19 12:24:07.723361 master-0 kubenswrapper[29612]: W0319 12:24:07.722733 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e25fc95_0475_4395_952b_59db3cd91ad7.slice/crio-bfdcba96bfc4e0564227898e246e69107405849dd33a213c9cb0ed47caa19158 WatchSource:0}: Error finding container bfdcba96bfc4e0564227898e246e69107405849dd33a213c9cb0ed47caa19158: Status 404 returned error can't find the container 
with id bfdcba96bfc4e0564227898e246e69107405849dd33a213c9cb0ed47caa19158 Mar 19 12:24:07.748947 master-0 kubenswrapper[29612]: E0319 12:24:07.747987 29612 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jcs49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-5gc44_openstack-operators(aaef999b-7d65-4ae6-9009-0006983f05b9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 12:24:07.750156 master-0 kubenswrapper[29612]: E0319 12:24:07.749347 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" podUID="aaef999b-7d65-4ae6-9009-0006983f05b9" Mar 19 12:24:07.750156 master-0 kubenswrapper[29612]: E0319 12:24:07.749891 29612 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdm8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-qgb4j_openstack-operators(c2716611-c098-41f2-8dd3-72e2bbac6dc9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 12:24:07.755508 master-0 kubenswrapper[29612]: E0319 12:24:07.754346 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" podUID="c2716611-c098-41f2-8dd3-72e2bbac6dc9" Mar 19 12:24:07.769928 master-0 kubenswrapper[29612]: I0319 12:24:07.769830 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p"] Mar 19 12:24:07.807587 master-0 kubenswrapper[29612]: I0319 12:24:07.807508 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-8llrg"] Mar 19 12:24:07.834257 master-0 kubenswrapper[29612]: I0319 12:24:07.833479 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44"] Mar 19 12:24:07.873738 master-0 
kubenswrapper[29612]: I0319 12:24:07.873659 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-csxnc"] Mar 19 12:24:07.874462 master-0 kubenswrapper[29612]: I0319 12:24:07.874415 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" event={"ID":"aaef999b-7d65-4ae6-9009-0006983f05b9","Type":"ContainerStarted","Data":"f807307de69e792907fada2d4a2a65b698db53bbc1b41f496948129d6946409e"} Mar 19 12:24:07.876515 master-0 kubenswrapper[29612]: E0319 12:24:07.876227 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" podUID="aaef999b-7d65-4ae6-9009-0006983f05b9" Mar 19 12:24:07.880436 master-0 kubenswrapper[29612]: I0319 12:24:07.880342 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm" event={"ID":"6e25fc95-0475-4395-952b-59db3cd91ad7","Type":"ContainerStarted","Data":"bfdcba96bfc4e0564227898e246e69107405849dd33a213c9cb0ed47caa19158"} Mar 19 12:24:07.882995 master-0 kubenswrapper[29612]: I0319 12:24:07.882942 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p" event={"ID":"f95c7d97-fc0c-47a4-aeab-32258cb1ec8e","Type":"ContainerStarted","Data":"17b6eedb235355e5c02ed968d371efea729c054df43d609409ec8448deae178c"} Mar 19 12:24:07.888708 master-0 kubenswrapper[29612]: I0319 12:24:07.887675 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" 
event={"ID":"c2716611-c098-41f2-8dd3-72e2bbac6dc9","Type":"ContainerStarted","Data":"b11508235c224ffc1de89e0342845797057b6063137545e5bd7b6e1c85ec1542"} Mar 19 12:24:07.890577 master-0 kubenswrapper[29612]: E0319 12:24:07.890210 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" podUID="c2716611-c098-41f2-8dd3-72e2bbac6dc9" Mar 19 12:24:07.908371 master-0 kubenswrapper[29612]: I0319 12:24:07.906225 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k45mg" event={"ID":"52482a1e-f47f-4516-800f-b3b839699cb8","Type":"ContainerStarted","Data":"89ee3030b469ed27b2deba24811400880c78d2bae32210071c3ede37004b5965"} Mar 19 12:24:07.924767 master-0 kubenswrapper[29612]: I0319 12:24:07.924264 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm"] Mar 19 12:24:07.928894 master-0 kubenswrapper[29612]: I0319 12:24:07.928760 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-csxnc" event={"ID":"85abdea2-0fc7-4cc3-a74f-0282cdc5d5d2","Type":"ContainerStarted","Data":"d3d73c3478ed0cb37482eedac6126dab708eccc76bca70107d2d7b62c45940be"} Mar 19 12:24:07.933848 master-0 kubenswrapper[29612]: I0319 12:24:07.933752 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-8llrg" event={"ID":"a63e044e-9f48-42a5-8bec-91ba8d69a724","Type":"ContainerStarted","Data":"0236707695b1e9c49a0d5c965aef3df88f7d91a19f2c941ccaecf218e97e3b16"} Mar 19 12:24:07.937069 master-0 kubenswrapper[29612]: I0319 
12:24:07.936998 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4564q" event={"ID":"9e3c1457-2f62-4ea9-8782-2cd197683d38","Type":"ContainerStarted","Data":"d3eeba995a778ae85de17454c0aaad2cb6e8711f7dbabeb4d62c121237e5b875"} Mar 19 12:24:07.944432 master-0 kubenswrapper[29612]: I0319 12:24:07.942297 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7" event={"ID":"28006f2f-ee6b-46fe-a29e-7d9f140a2d1c","Type":"ContainerStarted","Data":"83233ae1714a6f218030bd31c469cfbbfa8497cb7a71acea8853d72856551a29"} Mar 19 12:24:07.944672 master-0 kubenswrapper[29612]: I0319 12:24:07.944437 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg" event={"ID":"7b6191f9-f638-49fc-9336-21a795a51dd8","Type":"ContainerStarted","Data":"48f3a493307068a77ff5bedc9e8e48dd2d9789985eb8fcc51273542f4a2f657a"} Mar 19 12:24:07.957147 master-0 kubenswrapper[29612]: I0319 12:24:07.955871 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j"] Mar 19 12:24:08.563035 master-0 kubenswrapper[29612]: I0319 12:24:08.562979 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" Mar 19 12:24:08.563898 master-0 kubenswrapper[29612]: E0319 12:24:08.563289 29612 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 12:24:08.564050 master-0 kubenswrapper[29612]: E0319 12:24:08.564034 29612 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert podName:7398e347-55cd-481c-9240-27e7e6a8e8ba nodeName:}" failed. No retries permitted until 2026-03-19 12:24:12.564008739 +0000 UTC m=+859.878177760 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert") pod "infra-operator-controller-manager-7dd6bb94c9-hvsqq" (UID: "7398e347-55cd-481c-9240-27e7e6a8e8ba") : secret "infra-operator-webhook-server-cert" not found Mar 19 12:24:08.965718 master-0 kubenswrapper[29612]: E0319 12:24:08.965674 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" podUID="c2716611-c098-41f2-8dd3-72e2bbac6dc9" Mar 19 12:24:08.969112 master-0 kubenswrapper[29612]: E0319 12:24:08.969047 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" podUID="aaef999b-7d65-4ae6-9009-0006983f05b9" Mar 19 12:24:09.189527 master-0 kubenswrapper[29612]: I0319 12:24:09.189468 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:09.189868 master-0 kubenswrapper[29612]: E0319 12:24:09.189675 29612 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:09.189868 master-0 kubenswrapper[29612]: E0319 12:24:09.189765 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert podName:52c88b85-7e59-4c34-aaa6-47cbbb966eea nodeName:}" failed. No retries permitted until 2026-03-19 12:24:13.18974694 +0000 UTC m=+860.503915961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mz9v2" (UID: "52c88b85-7e59-4c34-aaa6-47cbbb966eea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:09.500438 master-0 kubenswrapper[29612]: I0319 12:24:09.500387 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:09.500689 master-0 kubenswrapper[29612]: E0319 12:24:09.500583 29612 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 12:24:09.500689 master-0 kubenswrapper[29612]: I0319 12:24:09.500665 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs\") pod 
\"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:09.500784 master-0 kubenswrapper[29612]: E0319 12:24:09.500690 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. No retries permitted until 2026-03-19 12:24:13.500668055 +0000 UTC m=+860.814837196 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "metrics-server-cert" not found Mar 19 12:24:09.500838 master-0 kubenswrapper[29612]: E0319 12:24:09.500821 29612 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 12:24:09.500879 master-0 kubenswrapper[29612]: E0319 12:24:09.500858 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. No retries permitted until 2026-03-19 12:24:13.5008459 +0000 UTC m=+860.815015011 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "webhook-server-cert" not found Mar 19 12:24:09.974851 master-0 kubenswrapper[29612]: E0319 12:24:09.974620 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" podUID="c2716611-c098-41f2-8dd3-72e2bbac6dc9" Mar 19 12:24:12.602558 master-0 kubenswrapper[29612]: I0319 12:24:12.602497 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" Mar 19 12:24:12.603522 master-0 kubenswrapper[29612]: E0319 12:24:12.603016 29612 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 12:24:12.603600 master-0 kubenswrapper[29612]: E0319 12:24:12.603581 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert podName:7398e347-55cd-481c-9240-27e7e6a8e8ba nodeName:}" failed. No retries permitted until 2026-03-19 12:24:20.603560739 +0000 UTC m=+867.917729780 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert") pod "infra-operator-controller-manager-7dd6bb94c9-hvsqq" (UID: "7398e347-55cd-481c-9240-27e7e6a8e8ba") : secret "infra-operator-webhook-server-cert" not found Mar 19 12:24:13.213794 master-0 kubenswrapper[29612]: I0319 12:24:13.213691 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:13.214029 master-0 kubenswrapper[29612]: E0319 12:24:13.213936 29612 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:13.214082 master-0 kubenswrapper[29612]: E0319 12:24:13.214064 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert podName:52c88b85-7e59-4c34-aaa6-47cbbb966eea nodeName:}" failed. No retries permitted until 2026-03-19 12:24:21.214012716 +0000 UTC m=+868.528181737 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mz9v2" (UID: "52c88b85-7e59-4c34-aaa6-47cbbb966eea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:13.518783 master-0 kubenswrapper[29612]: I0319 12:24:13.518525 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:13.519091 master-0 kubenswrapper[29612]: I0319 12:24:13.518812 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:13.519091 master-0 kubenswrapper[29612]: E0319 12:24:13.518891 29612 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 12:24:13.519091 master-0 kubenswrapper[29612]: E0319 12:24:13.519025 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. No retries permitted until 2026-03-19 12:24:21.518994242 +0000 UTC m=+868.833163313 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "metrics-server-cert" not found Mar 19 12:24:13.519352 master-0 kubenswrapper[29612]: E0319 12:24:13.519130 29612 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 12:24:13.519352 master-0 kubenswrapper[29612]: E0319 12:24:13.519308 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. No retries permitted until 2026-03-19 12:24:21.51927837 +0000 UTC m=+868.833447421 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "webhook-server-cert" not found Mar 19 12:24:20.683681 master-0 kubenswrapper[29612]: I0319 12:24:20.683607 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" Mar 19 12:24:20.688457 master-0 kubenswrapper[29612]: I0319 12:24:20.688406 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7398e347-55cd-481c-9240-27e7e6a8e8ba-cert\") pod \"infra-operator-controller-manager-7dd6bb94c9-hvsqq\" (UID: \"7398e347-55cd-481c-9240-27e7e6a8e8ba\") " 
pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" Mar 19 12:24:20.924118 master-0 kubenswrapper[29612]: I0319 12:24:20.924055 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" Mar 19 12:24:21.294915 master-0 kubenswrapper[29612]: I0319 12:24:21.294870 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:21.295120 master-0 kubenswrapper[29612]: E0319 12:24:21.295064 29612 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:21.295174 master-0 kubenswrapper[29612]: E0319 12:24:21.295122 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert podName:52c88b85-7e59-4c34-aaa6-47cbbb966eea nodeName:}" failed. No retries permitted until 2026-03-19 12:24:37.295106943 +0000 UTC m=+884.609275964 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert") pod "openstack-baremetal-operator-controller-manager-74c4796899mz9v2" (UID: "52c88b85-7e59-4c34-aaa6-47cbbb966eea") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 12:24:21.603999 master-0 kubenswrapper[29612]: I0319 12:24:21.603704 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:21.603999 master-0 kubenswrapper[29612]: I0319 12:24:21.603812 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:21.603999 master-0 kubenswrapper[29612]: E0319 12:24:21.603996 29612 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 12:24:21.604286 master-0 kubenswrapper[29612]: E0319 12:24:21.604044 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. No retries permitted until 2026-03-19 12:24:37.604031221 +0000 UTC m=+884.918200242 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "webhook-server-cert" not found Mar 19 12:24:21.604286 master-0 kubenswrapper[29612]: E0319 12:24:21.604182 29612 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 12:24:21.604286 master-0 kubenswrapper[29612]: E0319 12:24:21.604243 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs podName:b7def2b1-f191-4b0c-b4dd-2f9a03efad96 nodeName:}" failed. No retries permitted until 2026-03-19 12:24:37.604216606 +0000 UTC m=+884.918385617 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs") pod "openstack-operator-controller-manager-86bd8996f6-7gvcm" (UID: "b7def2b1-f191-4b0c-b4dd-2f9a03efad96") : secret "metrics-server-cert" not found Mar 19 12:24:27.651091 master-0 kubenswrapper[29612]: I0319 12:24:27.651039 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq"] Mar 19 12:24:28.195113 master-0 kubenswrapper[29612]: I0319 12:24:28.193574 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4564q" event={"ID":"9e3c1457-2f62-4ea9-8782-2cd197683d38","Type":"ContainerStarted","Data":"e3608ebeaab187e7e23159d6eab9e92046f0a09f597318c9495b47aaccc13e6e"} Mar 19 12:24:28.195113 master-0 kubenswrapper[29612]: I0319 12:24:28.194888 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4564q" Mar 19 
12:24:28.206008 master-0 kubenswrapper[29612]: I0319 12:24:28.205949 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt" event={"ID":"d54e5e4f-98ea-4896-848f-108d2f56e796","Type":"ContainerStarted","Data":"5b834302394959548565dc5dc527d6e5708da3dfa2cb89a33828d88db91855c6"} Mar 19 12:24:28.206856 master-0 kubenswrapper[29612]: I0319 12:24:28.206829 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt" Mar 19 12:24:28.212939 master-0 kubenswrapper[29612]: I0319 12:24:28.211958 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" event={"ID":"7398e347-55cd-481c-9240-27e7e6a8e8ba","Type":"ContainerStarted","Data":"35f5c8e7b43e053c7ecef8ac787a05ccdedfbb0043e021985f16f5169555f982"} Mar 19 12:24:28.218546 master-0 kubenswrapper[29612]: I0319 12:24:28.218469 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv" event={"ID":"df072e70-d36d-4a5b-b289-7a8f6b3587cf","Type":"ContainerStarted","Data":"ccb9904bd6464069de598a1c22c93339dd1d05708708b52f0892433226da284b"} Mar 19 12:24:28.220232 master-0 kubenswrapper[29612]: I0319 12:24:28.219587 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv" Mar 19 12:24:28.235503 master-0 kubenswrapper[29612]: I0319 12:24:28.235447 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm" event={"ID":"6e25fc95-0475-4395-952b-59db3cd91ad7","Type":"ContainerStarted","Data":"7038e669102df06950245bc568861226427d56419a5f0c390b8858a1fd5d0e3b"} Mar 19 12:24:28.237886 master-0 kubenswrapper[29612]: I0319 12:24:28.237841 29612 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq" event={"ID":"fa2bb2e9-5b88-4c4d-8abe-2837c40e5880","Type":"ContainerStarted","Data":"608bd6fd438b095a722e210cfec58f989a0c90213cf7bccab70a7e8eaa8c2820"} Mar 19 12:24:28.238805 master-0 kubenswrapper[29612]: I0319 12:24:28.238752 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq" Mar 19 12:24:28.240227 master-0 kubenswrapper[29612]: I0319 12:24:28.240201 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p" event={"ID":"f95c7d97-fc0c-47a4-aeab-32258cb1ec8e","Type":"ContainerStarted","Data":"7aa94e142fa09b8a7194fc5fa7b8825e735c5cf864344d2df09912634b28d216"} Mar 19 12:24:28.240860 master-0 kubenswrapper[29612]: I0319 12:24:28.240826 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p" Mar 19 12:24:28.243766 master-0 kubenswrapper[29612]: I0319 12:24:28.243670 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6chjf" event={"ID":"c3727fb0-72dd-4b34-bf14-5a972fe68e65","Type":"ContainerStarted","Data":"1f05152770816c5cb9f4784080facf7a7bfc447ee1097446cc0be5af25aa50cc"} Mar 19 12:24:28.243853 master-0 kubenswrapper[29612]: I0319 12:24:28.243784 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6chjf" Mar 19 12:24:28.260673 master-0 kubenswrapper[29612]: I0319 12:24:28.246778 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8" 
event={"ID":"e6648cf7-2ce1-4994-9edd-037386b885e7","Type":"ContainerStarted","Data":"68c062a5cd353101a92f59b71a55c09eac26e5f4f89b860cf3882b2eec47aabc"} Mar 19 12:24:28.260673 master-0 kubenswrapper[29612]: I0319 12:24:28.246884 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8" Mar 19 12:24:28.260673 master-0 kubenswrapper[29612]: I0319 12:24:28.253365 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-csxnc" event={"ID":"85abdea2-0fc7-4cc3-a74f-0282cdc5d5d2","Type":"ContainerStarted","Data":"a7b9e0e6d445079ebb4e14966d69a3da3dbd8be3eba8451a9f95622bae39e24c"} Mar 19 12:24:28.260673 master-0 kubenswrapper[29612]: I0319 12:24:28.255216 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-csxnc" Mar 19 12:24:28.260673 master-0 kubenswrapper[29612]: I0319 12:24:28.258274 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn" event={"ID":"35b6c38e-a5ea-47ad-ac78-dcb138979d96","Type":"ContainerStarted","Data":"3bfb40353958071cd9499d6d82cb45c3415e3325dcce5450c8e0fce97ee827e6"} Mar 19 12:24:28.260673 master-0 kubenswrapper[29612]: I0319 12:24:28.259138 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn" Mar 19 12:24:28.260673 master-0 kubenswrapper[29612]: I0319 12:24:28.260557 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx" event={"ID":"f77e9b9d-e1fe-411c-a005-be21205ebc29","Type":"ContainerStarted","Data":"d991f4084ba2295bfed62004d879d21ed2e3833d79cf55acc727efe6b40f50b5"} Mar 19 12:24:28.261189 master-0 kubenswrapper[29612]: I0319 12:24:28.261131 29612 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx" Mar 19 12:24:28.277724 master-0 kubenswrapper[29612]: I0319 12:24:28.273889 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs" event={"ID":"8b552910-e16b-4d0f-a672-e102464441ee","Type":"ContainerStarted","Data":"c0aab446b785eeb0d66fd82a946c4018a7e469caabec24550621f2151ed4588f"} Mar 19 12:24:28.277724 master-0 kubenswrapper[29612]: I0319 12:24:28.275432 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs" Mar 19 12:24:29.282920 master-0 kubenswrapper[29612]: I0319 12:24:29.282866 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd" event={"ID":"903e6a5b-d8ec-4a23-bc47-8428e998422f","Type":"ContainerStarted","Data":"990a12dbdf73f27bc711c8bbe46b7cfb78051b2bee20149bf1d44cf44ad9e498"} Mar 19 12:24:30.301565 master-0 kubenswrapper[29612]: I0319 12:24:30.301477 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7" event={"ID":"28006f2f-ee6b-46fe-a29e-7d9f140a2d1c","Type":"ContainerStarted","Data":"c20897bcc12b91120db89f5262f3b19fd8d6bdbbe26253fa94ace1391c66e493"} Mar 19 12:24:30.302196 master-0 kubenswrapper[29612]: I0319 12:24:30.301578 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7" Mar 19 12:24:30.304464 master-0 kubenswrapper[29612]: I0319 12:24:30.304386 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg" 
event={"ID":"7b6191f9-f638-49fc-9336-21a795a51dd8","Type":"ContainerStarted","Data":"5cfb96ea93429ea640b34db208e153fe493d721e216f6e3f32d9e7fc1c90d105"} Mar 19 12:24:30.304764 master-0 kubenswrapper[29612]: I0319 12:24:30.304618 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg" Mar 19 12:24:30.308667 master-0 kubenswrapper[29612]: I0319 12:24:30.306256 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k45mg" event={"ID":"52482a1e-f47f-4516-800f-b3b839699cb8","Type":"ContainerStarted","Data":"ff16419b696264cf0419db5ee32b40cc04cb33f459d650df2fae3cb3ce4086b6"} Mar 19 12:24:30.308667 master-0 kubenswrapper[29612]: I0319 12:24:30.306760 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k45mg" Mar 19 12:24:30.309455 master-0 kubenswrapper[29612]: I0319 12:24:30.309407 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-8llrg" event={"ID":"a63e044e-9f48-42a5-8bec-91ba8d69a724","Type":"ContainerStarted","Data":"8088095a6550e21c2dcfcda2b17c6cf88cedf7b902087be59e67908015da8339"} Mar 19 12:24:30.310246 master-0 kubenswrapper[29612]: I0319 12:24:30.310224 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-8llrg" Mar 19 12:24:30.312794 master-0 kubenswrapper[29612]: I0319 12:24:30.312741 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29" event={"ID":"0f3d31e4-66ef-4e51-a1d3-c2a3528f533c","Type":"ContainerStarted","Data":"0581baaed46e80ef96031080ae7eb95b301631898ede41f315f02ba374eb6431"} Mar 19 12:24:30.312883 master-0 kubenswrapper[29612]: I0319 12:24:30.312798 29612 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd" Mar 19 12:24:30.312883 master-0 kubenswrapper[29612]: I0319 12:24:30.312817 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29" Mar 19 12:24:31.194667 master-0 kubenswrapper[29612]: I0319 12:24:31.189698 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4564q" podStartSLOduration=7.847514317 podStartE2EDuration="27.189678596s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:07.674007878 +0000 UTC m=+854.988176899" lastFinishedPulling="2026-03-19 12:24:27.016172147 +0000 UTC m=+874.330341178" observedRunningTime="2026-03-19 12:24:30.645500432 +0000 UTC m=+877.959669453" watchObservedRunningTime="2026-03-19 12:24:31.189678596 +0000 UTC m=+878.503847617" Mar 19 12:24:34.743367 master-0 kubenswrapper[29612]: I0319 12:24:34.743321 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq" Mar 19 12:24:34.781201 master-0 kubenswrapper[29612]: I0319 12:24:34.781116 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx" Mar 19 12:24:34.805136 master-0 kubenswrapper[29612]: I0319 12:24:34.805052 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8" Mar 19 12:24:34.849600 master-0 kubenswrapper[29612]: I0319 12:24:34.849520 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn" Mar 19 12:24:34.966671 master-0 
kubenswrapper[29612]: I0319 12:24:34.966262 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd" Mar 19 12:24:35.013830 master-0 kubenswrapper[29612]: I0319 12:24:35.013628 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29" Mar 19 12:24:35.175154 master-0 kubenswrapper[29612]: I0319 12:24:35.175098 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs" Mar 19 12:24:35.285123 master-0 kubenswrapper[29612]: I0319 12:24:35.284987 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6chjf" Mar 19 12:24:35.327595 master-0 kubenswrapper[29612]: I0319 12:24:35.327490 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt" Mar 19 12:24:35.340783 master-0 kubenswrapper[29612]: I0319 12:24:35.340733 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv" Mar 19 12:24:35.798971 master-0 kubenswrapper[29612]: I0319 12:24:35.798917 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg" Mar 19 12:24:35.802244 master-0 kubenswrapper[29612]: I0319 12:24:35.802201 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-4564q" Mar 19 12:24:35.804374 master-0 kubenswrapper[29612]: I0319 12:24:35.804328 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-csxnc" Mar 19 12:24:35.807021 master-0 kubenswrapper[29612]: I0319 12:24:35.806984 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7" Mar 19 12:24:35.810908 master-0 kubenswrapper[29612]: I0319 12:24:35.810855 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k45mg" Mar 19 12:24:36.010453 master-0 kubenswrapper[29612]: I0319 12:24:36.010408 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p" Mar 19 12:24:36.103244 master-0 kubenswrapper[29612]: I0319 12:24:36.103068 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-8llrg" Mar 19 12:24:36.530677 master-0 kubenswrapper[29612]: I0319 12:24:36.530578 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8sdwq" podStartSLOduration=11.725554077 podStartE2EDuration="32.530555955s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:06.209872802 +0000 UTC m=+853.524041813" lastFinishedPulling="2026-03-19 12:24:27.01487466 +0000 UTC m=+874.329043691" observedRunningTime="2026-03-19 12:24:36.526632154 +0000 UTC m=+883.840801205" watchObservedRunningTime="2026-03-19 12:24:36.530555955 +0000 UTC m=+883.844724996" Mar 19 12:24:36.533222 master-0 kubenswrapper[29612]: I0319 12:24:36.533109 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d6r29" podStartSLOduration=12.269081003 podStartE2EDuration="32.533081487s" podCreationTimestamp="2026-03-19 12:24:04 
+0000 UTC" firstStartedPulling="2026-03-19 12:24:06.760582142 +0000 UTC m=+854.074751163" lastFinishedPulling="2026-03-19 12:24:27.024582626 +0000 UTC m=+874.338751647" observedRunningTime="2026-03-19 12:24:35.792121071 +0000 UTC m=+883.106290092" watchObservedRunningTime="2026-03-19 12:24:36.533081487 +0000 UTC m=+883.847250598" Mar 19 12:24:36.595888 master-0 kubenswrapper[29612]: I0319 12:24:36.595797 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8pjtv" podStartSLOduration=12.31818123 podStartE2EDuration="32.595779599s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:06.703351216 +0000 UTC m=+854.017520237" lastFinishedPulling="2026-03-19 12:24:26.980949585 +0000 UTC m=+874.295118606" observedRunningTime="2026-03-19 12:24:36.58630502 +0000 UTC m=+883.900474051" watchObservedRunningTime="2026-03-19 12:24:36.595779599 +0000 UTC m=+883.909948620" Mar 19 12:24:36.634869 master-0 kubenswrapper[29612]: I0319 12:24:36.634746 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-tm2wt" podStartSLOduration=12.363004862 podStartE2EDuration="32.634721596s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:06.743816195 +0000 UTC m=+854.057985216" lastFinishedPulling="2026-03-19 12:24:27.015532929 +0000 UTC m=+874.329701950" observedRunningTime="2026-03-19 12:24:36.628436957 +0000 UTC m=+883.942605968" watchObservedRunningTime="2026-03-19 12:24:36.634721596 +0000 UTC m=+883.948890647" Mar 19 12:24:36.662918 master-0 kubenswrapper[29612]: I0319 12:24:36.662430 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-6chjf" podStartSLOduration=12.408906416 podStartE2EDuration="32.662410352s" 
podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:06.763158775 +0000 UTC m=+854.077327796" lastFinishedPulling="2026-03-19 12:24:27.016662711 +0000 UTC m=+874.330831732" observedRunningTime="2026-03-19 12:24:36.657175554 +0000 UTC m=+883.971344595" watchObservedRunningTime="2026-03-19 12:24:36.662410352 +0000 UTC m=+883.976579373" Mar 19 12:24:36.681382 master-0 kubenswrapper[29612]: I0319 12:24:36.680995 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-k45mg" podStartSLOduration=12.628733454 podStartE2EDuration="32.68097234s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:06.963618722 +0000 UTC m=+854.277787733" lastFinishedPulling="2026-03-19 12:24:27.015857598 +0000 UTC m=+874.330026619" observedRunningTime="2026-03-19 12:24:36.674721252 +0000 UTC m=+883.988890273" watchObservedRunningTime="2026-03-19 12:24:36.68097234 +0000 UTC m=+883.995141361" Mar 19 12:24:36.705016 master-0 kubenswrapper[29612]: I0319 12:24:36.704892 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-8b6qs" podStartSLOduration=12.020650385 podStartE2EDuration="32.704870799s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:06.197670226 +0000 UTC m=+853.511839247" lastFinishedPulling="2026-03-19 12:24:26.88189064 +0000 UTC m=+874.196059661" observedRunningTime="2026-03-19 12:24:36.696870632 +0000 UTC m=+884.011039653" watchObservedRunningTime="2026-03-19 12:24:36.704870799 +0000 UTC m=+884.019039820" Mar 19 12:24:36.734065 master-0 kubenswrapper[29612]: I0319 12:24:36.733956 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4f7d7" podStartSLOduration=13.418945588 
podStartE2EDuration="32.733932405s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:07.709734893 +0000 UTC m=+855.023903914" lastFinishedPulling="2026-03-19 12:24:27.02472171 +0000 UTC m=+874.338890731" observedRunningTime="2026-03-19 12:24:36.724203528 +0000 UTC m=+884.038372549" watchObservedRunningTime="2026-03-19 12:24:36.733932405 +0000 UTC m=+884.048101426" Mar 19 12:24:36.746688 master-0 kubenswrapper[29612]: I0319 12:24:36.746603 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-rsrkg" podStartSLOduration=12.667827654 podStartE2EDuration="32.746586064s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:06.937651454 +0000 UTC m=+854.251820475" lastFinishedPulling="2026-03-19 12:24:27.016409864 +0000 UTC m=+874.330578885" observedRunningTime="2026-03-19 12:24:36.740367808 +0000 UTC m=+884.054536829" watchObservedRunningTime="2026-03-19 12:24:36.746586064 +0000 UTC m=+884.060755085" Mar 19 12:24:36.772968 master-0 kubenswrapper[29612]: I0319 12:24:36.772335 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-b2rhn" podStartSLOduration=12.507824547 podStartE2EDuration="32.772311905s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:06.760279773 +0000 UTC m=+854.074448794" lastFinishedPulling="2026-03-19 12:24:27.024767131 +0000 UTC m=+874.338936152" observedRunningTime="2026-03-19 12:24:36.764938126 +0000 UTC m=+884.079107147" watchObservedRunningTime="2026-03-19 12:24:36.772311905 +0000 UTC m=+884.086480926" Mar 19 12:24:36.801837 master-0 kubenswrapper[29612]: I0319 12:24:36.801162 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-jhqlm" 
podStartSLOduration=12.502517173 podStartE2EDuration="31.801145135s" podCreationTimestamp="2026-03-19 12:24:05 +0000 UTC" firstStartedPulling="2026-03-19 12:24:07.72652724 +0000 UTC m=+855.040696261" lastFinishedPulling="2026-03-19 12:24:27.025155202 +0000 UTC m=+874.339324223" observedRunningTime="2026-03-19 12:24:36.783663708 +0000 UTC m=+884.097832729" watchObservedRunningTime="2026-03-19 12:24:36.801145135 +0000 UTC m=+884.115314156" Mar 19 12:24:36.815462 master-0 kubenswrapper[29612]: I0319 12:24:36.813808 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-txwhd" podStartSLOduration=12.540113275 podStartE2EDuration="32.813781894s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:06.743160347 +0000 UTC m=+854.057329368" lastFinishedPulling="2026-03-19 12:24:27.016828956 +0000 UTC m=+874.330997987" observedRunningTime="2026-03-19 12:24:36.799680543 +0000 UTC m=+884.113849564" watchObservedRunningTime="2026-03-19 12:24:36.813781894 +0000 UTC m=+884.127950955" Mar 19 12:24:36.823869 master-0 kubenswrapper[29612]: I0319 12:24:36.823221 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-r6x2p" podStartSLOduration=12.519276878 podStartE2EDuration="31.823199921s" podCreationTimestamp="2026-03-19 12:24:05 +0000 UTC" firstStartedPulling="2026-03-19 12:24:07.721362383 +0000 UTC m=+855.035531404" lastFinishedPulling="2026-03-19 12:24:27.025285426 +0000 UTC m=+874.339454447" observedRunningTime="2026-03-19 12:24:36.822543523 +0000 UTC m=+884.136712544" watchObservedRunningTime="2026-03-19 12:24:36.823199921 +0000 UTC m=+884.137368942" Mar 19 12:24:36.867851 master-0 kubenswrapper[29612]: I0319 12:24:36.866309 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-8llrg" podStartSLOduration=13.556437794 podStartE2EDuration="32.866289646s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:07.714194549 +0000 UTC m=+855.028363570" lastFinishedPulling="2026-03-19 12:24:27.024046401 +0000 UTC m=+874.338215422" observedRunningTime="2026-03-19 12:24:36.843142718 +0000 UTC m=+884.157311739" watchObservedRunningTime="2026-03-19 12:24:36.866289646 +0000 UTC m=+884.180458657" Mar 19 12:24:36.897290 master-0 kubenswrapper[29612]: I0319 12:24:36.897126 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-q4ldx" podStartSLOduration=18.245781932 podStartE2EDuration="32.897105402s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:05.687354034 +0000 UTC m=+853.001523055" lastFinishedPulling="2026-03-19 12:24:20.338677504 +0000 UTC m=+867.652846525" observedRunningTime="2026-03-19 12:24:36.86819644 +0000 UTC m=+884.182365461" watchObservedRunningTime="2026-03-19 12:24:36.897105402 +0000 UTC m=+884.211274433" Mar 19 12:24:36.900699 master-0 kubenswrapper[29612]: I0319 12:24:36.900606 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-p8kq8" podStartSLOduration=12.122965981 podStartE2EDuration="32.900588581s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:06.246980287 +0000 UTC m=+853.561149318" lastFinishedPulling="2026-03-19 12:24:27.024602897 +0000 UTC m=+874.338771918" observedRunningTime="2026-03-19 12:24:36.886615984 +0000 UTC m=+884.200785005" watchObservedRunningTime="2026-03-19 12:24:36.900588581 +0000 UTC m=+884.214757612" Mar 19 12:24:36.936863 master-0 kubenswrapper[29612]: I0319 12:24:36.936778 29612 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-csxnc" podStartSLOduration=13.644814826 podStartE2EDuration="32.936760408s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:07.724898184 +0000 UTC m=+855.039067205" lastFinishedPulling="2026-03-19 12:24:27.016843766 +0000 UTC m=+874.331012787" observedRunningTime="2026-03-19 12:24:36.90548689 +0000 UTC m=+884.219655921" watchObservedRunningTime="2026-03-19 12:24:36.936760408 +0000 UTC m=+884.250929429" Mar 19 12:24:37.351617 master-0 kubenswrapper[29612]: I0319 12:24:37.351500 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:37.383730 master-0 kubenswrapper[29612]: I0319 12:24:37.383660 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52c88b85-7e59-4c34-aaa6-47cbbb966eea-cert\") pod \"openstack-baremetal-operator-controller-manager-74c4796899mz9v2\" (UID: \"52c88b85-7e59-4c34-aaa6-47cbbb966eea\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:37.548683 master-0 kubenswrapper[29612]: I0319 12:24:37.548599 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:37.663137 master-0 kubenswrapper[29612]: I0319 12:24:37.663083 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:37.663336 master-0 kubenswrapper[29612]: I0319 12:24:37.663179 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:37.679341 master-0 kubenswrapper[29612]: I0319 12:24:37.679304 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-webhook-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:37.679979 master-0 kubenswrapper[29612]: I0319 12:24:37.679877 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7def2b1-f191-4b0c-b4dd-2f9a03efad96-metrics-certs\") pod \"openstack-operator-controller-manager-86bd8996f6-7gvcm\" (UID: \"b7def2b1-f191-4b0c-b4dd-2f9a03efad96\") " pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:37.682278 master-0 kubenswrapper[29612]: I0319 
12:24:37.682238 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:38.336654 master-0 kubenswrapper[29612]: I0319 12:24:38.336519 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2"] Mar 19 12:24:38.406356 master-0 kubenswrapper[29612]: I0319 12:24:38.406268 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" event={"ID":"7398e347-55cd-481c-9240-27e7e6a8e8ba","Type":"ContainerStarted","Data":"123bd1c3a6ec34580ac20474579b2356d988e0b9c2b3bbc7409841402b04abc9"} Mar 19 12:24:38.406595 master-0 kubenswrapper[29612]: I0319 12:24:38.406443 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" Mar 19 12:24:38.407862 master-0 kubenswrapper[29612]: I0319 12:24:38.407815 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" event={"ID":"c2716611-c098-41f2-8dd3-72e2bbac6dc9","Type":"ContainerStarted","Data":"aa999fd56b34e85a228f06f07e9ad61c53355364a3ba0e6fa9358063bddb037e"} Mar 19 12:24:38.408391 master-0 kubenswrapper[29612]: I0319 12:24:38.408352 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" Mar 19 12:24:38.410737 master-0 kubenswrapper[29612]: I0319 12:24:38.410621 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" event={"ID":"52c88b85-7e59-4c34-aaa6-47cbbb966eea","Type":"ContainerStarted","Data":"97b04cb87bf11e0ef1724a3cab200e32113729f4c17a13583a833a8a8be61371"} Mar 19 12:24:38.413797 master-0 kubenswrapper[29612]: 
I0319 12:24:38.413759 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" event={"ID":"aaef999b-7d65-4ae6-9009-0006983f05b9","Type":"ContainerStarted","Data":"d0c33e990437f827c6cc28ad1119edd908ff4c8fd8fc918a52745b1c0f74b59a"} Mar 19 12:24:38.413993 master-0 kubenswrapper[29612]: I0319 12:24:38.413977 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" Mar 19 12:24:38.428057 master-0 kubenswrapper[29612]: I0319 12:24:38.425973 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" podStartSLOduration=24.233396438 podStartE2EDuration="34.425951837s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" firstStartedPulling="2026-03-19 12:24:27.688821152 +0000 UTC m=+875.002990173" lastFinishedPulling="2026-03-19 12:24:37.881376551 +0000 UTC m=+885.195545572" observedRunningTime="2026-03-19 12:24:38.422335434 +0000 UTC m=+885.736504475" watchObservedRunningTime="2026-03-19 12:24:38.425951837 +0000 UTC m=+885.740120858" Mar 19 12:24:38.479935 master-0 kubenswrapper[29612]: I0319 12:24:38.479861 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" podStartSLOduration=3.404573938 podStartE2EDuration="33.479840988s" podCreationTimestamp="2026-03-19 12:24:05 +0000 UTC" firstStartedPulling="2026-03-19 12:24:07.74976283 +0000 UTC m=+855.063931851" lastFinishedPulling="2026-03-19 12:24:37.82502988 +0000 UTC m=+885.139198901" observedRunningTime="2026-03-19 12:24:38.454769815 +0000 UTC m=+885.768938846" watchObservedRunningTime="2026-03-19 12:24:38.479840988 +0000 UTC m=+885.794010009" Mar 19 12:24:38.504761 master-0 kubenswrapper[29612]: I0319 12:24:38.504708 29612 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm"] Mar 19 12:24:38.508808 master-0 kubenswrapper[29612]: W0319 12:24:38.508747 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7def2b1_f191_4b0c_b4dd_2f9a03efad96.slice/crio-48813a7bbf79ff2de65b0b6a0b21143183dabdbfb7c60aecfcda4f521b90faf1 WatchSource:0}: Error finding container 48813a7bbf79ff2de65b0b6a0b21143183dabdbfb7c60aecfcda4f521b90faf1: Status 404 returned error can't find the container with id 48813a7bbf79ff2de65b0b6a0b21143183dabdbfb7c60aecfcda4f521b90faf1 Mar 19 12:24:38.510329 master-0 kubenswrapper[29612]: I0319 12:24:38.509952 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" podStartSLOduration=4.578796306 podStartE2EDuration="33.509937293s" podCreationTimestamp="2026-03-19 12:24:05 +0000 UTC" firstStartedPulling="2026-03-19 12:24:07.747806745 +0000 UTC m=+855.061975766" lastFinishedPulling="2026-03-19 12:24:36.678947732 +0000 UTC m=+883.993116753" observedRunningTime="2026-03-19 12:24:38.489539433 +0000 UTC m=+885.803708464" watchObservedRunningTime="2026-03-19 12:24:38.509937293 +0000 UTC m=+885.824106314" Mar 19 12:24:39.425429 master-0 kubenswrapper[29612]: I0319 12:24:39.425303 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" event={"ID":"b7def2b1-f191-4b0c-b4dd-2f9a03efad96","Type":"ContainerStarted","Data":"6f9797c3beedd040faf3d59d59f0779b26c60e7eda8c5ac8c8ee02b39446278c"} Mar 19 12:24:39.426071 master-0 kubenswrapper[29612]: I0319 12:24:39.426052 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" 
event={"ID":"b7def2b1-f191-4b0c-b4dd-2f9a03efad96","Type":"ContainerStarted","Data":"48813a7bbf79ff2de65b0b6a0b21143183dabdbfb7c60aecfcda4f521b90faf1"} Mar 19 12:24:40.435084 master-0 kubenswrapper[29612]: I0319 12:24:40.435000 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:42.948315 master-0 kubenswrapper[29612]: I0319 12:24:42.948190 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" podStartSLOduration=37.948171962 podStartE2EDuration="37.948171962s" podCreationTimestamp="2026-03-19 12:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:24:39.460327929 +0000 UTC m=+886.774496960" watchObservedRunningTime="2026-03-19 12:24:42.948171962 +0000 UTC m=+890.262340993" Mar 19 12:24:43.468926 master-0 kubenswrapper[29612]: I0319 12:24:43.468851 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" event={"ID":"52c88b85-7e59-4c34-aaa6-47cbbb966eea","Type":"ContainerStarted","Data":"c268be303ca3762e7eccae17720a63bf6024dd21df27592e341ca50bcd2252bd"} Mar 19 12:24:43.469221 master-0 kubenswrapper[29612]: I0319 12:24:43.468999 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:43.517607 master-0 kubenswrapper[29612]: I0319 12:24:43.517496 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" podStartSLOduration=35.55994548 podStartE2EDuration="39.51747799s" podCreationTimestamp="2026-03-19 12:24:04 +0000 UTC" 
firstStartedPulling="2026-03-19 12:24:38.353050575 +0000 UTC m=+885.667219596" lastFinishedPulling="2026-03-19 12:24:42.310583065 +0000 UTC m=+889.624752106" observedRunningTime="2026-03-19 12:24:43.50867855 +0000 UTC m=+890.822847591" watchObservedRunningTime="2026-03-19 12:24:43.51747799 +0000 UTC m=+890.831647011" Mar 19 12:24:45.879588 master-0 kubenswrapper[29612]: I0319 12:24:45.879497 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-5gc44" Mar 19 12:24:46.166649 master-0 kubenswrapper[29612]: I0319 12:24:46.166574 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-qgb4j" Mar 19 12:24:47.557239 master-0 kubenswrapper[29612]: I0319 12:24:47.557153 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-74c4796899mz9v2" Mar 19 12:24:47.689890 master-0 kubenswrapper[29612]: I0319 12:24:47.689822 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-86bd8996f6-7gvcm" Mar 19 12:24:50.935784 master-0 kubenswrapper[29612]: I0319 12:24:50.935721 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7dd6bb94c9-hvsqq" Mar 19 12:25:32.657983 master-0 kubenswrapper[29612]: I0319 12:25:32.650690 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-j27hk"] Mar 19 12:25:32.657983 master-0 kubenswrapper[29612]: I0319 12:25:32.655016 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-j27hk" Mar 19 12:25:32.667658 master-0 kubenswrapper[29612]: I0319 12:25:32.661045 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 19 12:25:32.667658 master-0 kubenswrapper[29612]: I0319 12:25:32.661253 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 19 12:25:32.667658 master-0 kubenswrapper[29612]: I0319 12:25:32.662121 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 19 12:25:32.681660 master-0 kubenswrapper[29612]: I0319 12:25:32.677232 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-j27hk"] Mar 19 12:25:32.731455 master-0 kubenswrapper[29612]: I0319 12:25:32.729929 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzbgw\" (UniqueName: \"kubernetes.io/projected/ace422c6-bf8f-469e-872a-101565b54d40-kube-api-access-tzbgw\") pod \"dnsmasq-dns-685c76cf85-j27hk\" (UID: \"ace422c6-bf8f-469e-872a-101565b54d40\") " pod="openstack/dnsmasq-dns-685c76cf85-j27hk" Mar 19 12:25:32.731455 master-0 kubenswrapper[29612]: I0319 12:25:32.730098 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace422c6-bf8f-469e-872a-101565b54d40-config\") pod \"dnsmasq-dns-685c76cf85-j27hk\" (UID: \"ace422c6-bf8f-469e-872a-101565b54d40\") " pod="openstack/dnsmasq-dns-685c76cf85-j27hk" Mar 19 12:25:32.836519 master-0 kubenswrapper[29612]: I0319 12:25:32.835711 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace422c6-bf8f-469e-872a-101565b54d40-config\") pod \"dnsmasq-dns-685c76cf85-j27hk\" (UID: \"ace422c6-bf8f-469e-872a-101565b54d40\") " 
pod="openstack/dnsmasq-dns-685c76cf85-j27hk" Mar 19 12:25:32.836519 master-0 kubenswrapper[29612]: I0319 12:25:32.835831 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzbgw\" (UniqueName: \"kubernetes.io/projected/ace422c6-bf8f-469e-872a-101565b54d40-kube-api-access-tzbgw\") pod \"dnsmasq-dns-685c76cf85-j27hk\" (UID: \"ace422c6-bf8f-469e-872a-101565b54d40\") " pod="openstack/dnsmasq-dns-685c76cf85-j27hk" Mar 19 12:25:32.837272 master-0 kubenswrapper[29612]: I0319 12:25:32.837229 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace422c6-bf8f-469e-872a-101565b54d40-config\") pod \"dnsmasq-dns-685c76cf85-j27hk\" (UID: \"ace422c6-bf8f-469e-872a-101565b54d40\") " pod="openstack/dnsmasq-dns-685c76cf85-j27hk" Mar 19 12:25:32.920723 master-0 kubenswrapper[29612]: I0319 12:25:32.919876 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzbgw\" (UniqueName: \"kubernetes.io/projected/ace422c6-bf8f-469e-872a-101565b54d40-kube-api-access-tzbgw\") pod \"dnsmasq-dns-685c76cf85-j27hk\" (UID: \"ace422c6-bf8f-469e-872a-101565b54d40\") " pod="openstack/dnsmasq-dns-685c76cf85-j27hk" Mar 19 12:25:32.939524 master-0 kubenswrapper[29612]: I0319 12:25:32.939394 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-d7x4t"] Mar 19 12:25:32.942304 master-0 kubenswrapper[29612]: I0319 12:25:32.942261 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-d7x4t"] Mar 19 12:25:32.942390 master-0 kubenswrapper[29612]: I0319 12:25:32.942357 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:32.947037 master-0 kubenswrapper[29612]: I0319 12:25:32.946988 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 19 12:25:33.042931 master-0 kubenswrapper[29612]: I0319 12:25:33.042877 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-j27hk" Mar 19 12:25:33.053924 master-0 kubenswrapper[29612]: I0319 12:25:33.053856 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-d7x4t\" (UID: \"bd298612-852e-41eb-911c-69ca7bb462d3\") " pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:33.054152 master-0 kubenswrapper[29612]: I0319 12:25:33.053945 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4ncn\" (UniqueName: \"kubernetes.io/projected/bd298612-852e-41eb-911c-69ca7bb462d3-kube-api-access-t4ncn\") pod \"dnsmasq-dns-8476fd89bc-d7x4t\" (UID: \"bd298612-852e-41eb-911c-69ca7bb462d3\") " pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:33.054152 master-0 kubenswrapper[29612]: I0319 12:25:33.054017 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-config\") pod \"dnsmasq-dns-8476fd89bc-d7x4t\" (UID: \"bd298612-852e-41eb-911c-69ca7bb462d3\") " pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:33.168318 master-0 kubenswrapper[29612]: I0319 12:25:33.165755 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-d7x4t\" (UID: 
\"bd298612-852e-41eb-911c-69ca7bb462d3\") " pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:33.168318 master-0 kubenswrapper[29612]: I0319 12:25:33.166604 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-d7x4t\" (UID: \"bd298612-852e-41eb-911c-69ca7bb462d3\") " pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:33.168318 master-0 kubenswrapper[29612]: I0319 12:25:33.166699 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4ncn\" (UniqueName: \"kubernetes.io/projected/bd298612-852e-41eb-911c-69ca7bb462d3-kube-api-access-t4ncn\") pod \"dnsmasq-dns-8476fd89bc-d7x4t\" (UID: \"bd298612-852e-41eb-911c-69ca7bb462d3\") " pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:33.168318 master-0 kubenswrapper[29612]: I0319 12:25:33.166857 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-config\") pod \"dnsmasq-dns-8476fd89bc-d7x4t\" (UID: \"bd298612-852e-41eb-911c-69ca7bb462d3\") " pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:33.168318 master-0 kubenswrapper[29612]: I0319 12:25:33.167503 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-config\") pod \"dnsmasq-dns-8476fd89bc-d7x4t\" (UID: \"bd298612-852e-41eb-911c-69ca7bb462d3\") " pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:33.187069 master-0 kubenswrapper[29612]: I0319 12:25:33.186934 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4ncn\" (UniqueName: \"kubernetes.io/projected/bd298612-852e-41eb-911c-69ca7bb462d3-kube-api-access-t4ncn\") pod \"dnsmasq-dns-8476fd89bc-d7x4t\" (UID: 
\"bd298612-852e-41eb-911c-69ca7bb462d3\") " pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:33.307490 master-0 kubenswrapper[29612]: I0319 12:25:33.307409 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:25:33.532376 master-0 kubenswrapper[29612]: I0319 12:25:33.532318 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-j27hk"] Mar 19 12:25:33.909497 master-0 kubenswrapper[29612]: I0319 12:25:33.909439 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-d7x4t"] Mar 19 12:25:33.996470 master-0 kubenswrapper[29612]: I0319 12:25:33.996390 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" event={"ID":"bd298612-852e-41eb-911c-69ca7bb462d3","Type":"ContainerStarted","Data":"5bb5919c059b0c4f2ad06e9b4bcbc75043f7fa5a5c0bb73859cbad5644e6b502"} Mar 19 12:25:33.997414 master-0 kubenswrapper[29612]: I0319 12:25:33.997338 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-j27hk" event={"ID":"ace422c6-bf8f-469e-872a-101565b54d40","Type":"ContainerStarted","Data":"bfbe8a68b3f7356cd7f13bf4b12ab990b99f3b41d1b825463b575581b0718c77"} Mar 19 12:25:36.568662 master-0 kubenswrapper[29612]: I0319 12:25:36.568007 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-j27hk"] Mar 19 12:25:36.681665 master-0 kubenswrapper[29612]: I0319 12:25:36.681367 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-q5dpz"] Mar 19 12:25:36.683852 master-0 kubenswrapper[29612]: I0319 12:25:36.683130 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:36.699869 master-0 kubenswrapper[29612]: I0319 12:25:36.699043 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-q5dpz"] Mar 19 12:25:36.795476 master-0 kubenswrapper[29612]: I0319 12:25:36.788944 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-config\") pod \"dnsmasq-dns-586dbdbb8c-q5dpz\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") " pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:36.795476 master-0 kubenswrapper[29612]: I0319 12:25:36.789183 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b585v\" (UniqueName: \"kubernetes.io/projected/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-kube-api-access-b585v\") pod \"dnsmasq-dns-586dbdbb8c-q5dpz\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") " pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:36.795476 master-0 kubenswrapper[29612]: I0319 12:25:36.789233 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-dns-svc\") pod \"dnsmasq-dns-586dbdbb8c-q5dpz\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") " pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:36.903729 master-0 kubenswrapper[29612]: I0319 12:25:36.894122 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b585v\" (UniqueName: \"kubernetes.io/projected/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-kube-api-access-b585v\") pod \"dnsmasq-dns-586dbdbb8c-q5dpz\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") " pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:36.903729 master-0 kubenswrapper[29612]: I0319 12:25:36.894245 29612 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-dns-svc\") pod \"dnsmasq-dns-586dbdbb8c-q5dpz\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") " pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:36.903729 master-0 kubenswrapper[29612]: I0319 12:25:36.895056 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-config\") pod \"dnsmasq-dns-586dbdbb8c-q5dpz\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") " pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:36.903729 master-0 kubenswrapper[29612]: I0319 12:25:36.895187 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-dns-svc\") pod \"dnsmasq-dns-586dbdbb8c-q5dpz\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") " pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:36.907656 master-0 kubenswrapper[29612]: I0319 12:25:36.907455 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-config\") pod \"dnsmasq-dns-586dbdbb8c-q5dpz\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") " pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:36.938279 master-0 kubenswrapper[29612]: I0319 12:25:36.919109 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b585v\" (UniqueName: \"kubernetes.io/projected/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-kube-api-access-b585v\") pod \"dnsmasq-dns-586dbdbb8c-q5dpz\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") " pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:37.069319 master-0 kubenswrapper[29612]: I0319 12:25:37.069237 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-8476fd89bc-d7x4t"] Mar 19 12:25:37.105972 master-0 kubenswrapper[29612]: I0319 12:25:37.099997 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:25:37.135661 master-0 kubenswrapper[29612]: I0319 12:25:37.135572 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c"] Mar 19 12:25:37.140900 master-0 kubenswrapper[29612]: I0319 12:25:37.140136 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:25:37.187429 master-0 kubenswrapper[29612]: I0319 12:25:37.187174 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c"] Mar 19 12:25:37.241798 master-0 kubenswrapper[29612]: I0319 12:25:37.222800 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-config\") pod \"dnsmasq-dns-6ff8fd9d5c-6sv2c\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:25:37.241798 master-0 kubenswrapper[29612]: I0319 12:25:37.222865 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-6sv2c\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:25:37.241798 master-0 kubenswrapper[29612]: I0319 12:25:37.222967 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m86zb\" (UniqueName: \"kubernetes.io/projected/9c08b55d-d47b-4321-9835-d08d01dc3799-kube-api-access-m86zb\") pod \"dnsmasq-dns-6ff8fd9d5c-6sv2c\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") 
" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:25:37.324986 master-0 kubenswrapper[29612]: I0319 12:25:37.324929 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-config\") pod \"dnsmasq-dns-6ff8fd9d5c-6sv2c\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:25:37.325116 master-0 kubenswrapper[29612]: I0319 12:25:37.324997 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-6sv2c\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:25:37.325116 master-0 kubenswrapper[29612]: I0319 12:25:37.325100 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m86zb\" (UniqueName: \"kubernetes.io/projected/9c08b55d-d47b-4321-9835-d08d01dc3799-kube-api-access-m86zb\") pod \"dnsmasq-dns-6ff8fd9d5c-6sv2c\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:25:37.326093 master-0 kubenswrapper[29612]: I0319 12:25:37.326056 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-config\") pod \"dnsmasq-dns-6ff8fd9d5c-6sv2c\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:25:37.326186 master-0 kubenswrapper[29612]: I0319 12:25:37.326138 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-6sv2c\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" 
Mar 19 12:25:37.364532 master-0 kubenswrapper[29612]: I0319 12:25:37.364485 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m86zb\" (UniqueName: \"kubernetes.io/projected/9c08b55d-d47b-4321-9835-d08d01dc3799-kube-api-access-m86zb\") pod \"dnsmasq-dns-6ff8fd9d5c-6sv2c\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:25:37.494298 master-0 kubenswrapper[29612]: I0319 12:25:37.494176 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:25:37.781825 master-0 kubenswrapper[29612]: I0319 12:25:37.776974 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-q5dpz"] Mar 19 12:25:37.803825 master-0 kubenswrapper[29612]: W0319 12:25:37.803680 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87c38b5f_cd6c_4fd3_8697_b8a17f77fc61.slice/crio-cd585e403c6fcdbe877c1941fe7aa96e1e1ea40fe807a47ecc022d1737748acf WatchSource:0}: Error finding container cd585e403c6fcdbe877c1941fe7aa96e1e1ea40fe807a47ecc022d1737748acf: Status 404 returned error can't find the container with id cd585e403c6fcdbe877c1941fe7aa96e1e1ea40fe807a47ecc022d1737748acf Mar 19 12:25:38.086186 master-0 kubenswrapper[29612]: I0319 12:25:38.086033 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" event={"ID":"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61","Type":"ContainerStarted","Data":"cd585e403c6fcdbe877c1941fe7aa96e1e1ea40fe807a47ecc022d1737748acf"} Mar 19 12:25:38.248522 master-0 kubenswrapper[29612]: I0319 12:25:38.248465 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c"] Mar 19 12:25:39.107275 master-0 kubenswrapper[29612]: I0319 12:25:39.107214 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" event={"ID":"9c08b55d-d47b-4321-9835-d08d01dc3799","Type":"ContainerStarted","Data":"33332d32c56820ccb83e8bd56dc07cd862d00a15ffef8601e010f7ff3220d8d0"} Mar 19 12:25:39.675739 master-0 kubenswrapper[29612]: I0319 12:25:39.674848 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 19 12:25:39.676518 master-0 kubenswrapper[29612]: I0319 12:25:39.676401 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 12:25:39.694117 master-0 kubenswrapper[29612]: I0319 12:25:39.694046 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 19 12:25:39.695947 master-0 kubenswrapper[29612]: I0319 12:25:39.695886 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 19 12:25:39.698544 master-0 kubenswrapper[29612]: I0319 12:25:39.698455 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 19 12:25:39.763567 master-0 kubenswrapper[29612]: I0319 12:25:39.763494 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 12:25:39.802246 master-0 kubenswrapper[29612]: I0319 12:25:39.802054 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1013fb54-2e92-43cf-85bd-28e2bc2f7852-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.802246 master-0 kubenswrapper[29612]: I0319 12:25:39.802131 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1013fb54-2e92-43cf-85bd-28e2bc2f7852-kolla-config\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " 
pod="openstack/memcached-0" Mar 19 12:25:39.802246 master-0 kubenswrapper[29612]: I0319 12:25:39.802192 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1013fb54-2e92-43cf-85bd-28e2bc2f7852-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.802590 master-0 kubenswrapper[29612]: I0319 12:25:39.802280 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1013fb54-2e92-43cf-85bd-28e2bc2f7852-config-data\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.802590 master-0 kubenswrapper[29612]: I0319 12:25:39.802303 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlk9s\" (UniqueName: \"kubernetes.io/projected/1013fb54-2e92-43cf-85bd-28e2bc2f7852-kube-api-access-vlk9s\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.903870 master-0 kubenswrapper[29612]: I0319 12:25:39.903787 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1013fb54-2e92-43cf-85bd-28e2bc2f7852-config-data\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.903870 master-0 kubenswrapper[29612]: I0319 12:25:39.903842 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlk9s\" (UniqueName: \"kubernetes.io/projected/1013fb54-2e92-43cf-85bd-28e2bc2f7852-kube-api-access-vlk9s\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.905782 master-0 
kubenswrapper[29612]: I0319 12:25:39.905743 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1013fb54-2e92-43cf-85bd-28e2bc2f7852-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.905859 master-0 kubenswrapper[29612]: I0319 12:25:39.905846 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1013fb54-2e92-43cf-85bd-28e2bc2f7852-kolla-config\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.906006 master-0 kubenswrapper[29612]: I0319 12:25:39.905965 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1013fb54-2e92-43cf-85bd-28e2bc2f7852-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.906620 master-0 kubenswrapper[29612]: I0319 12:25:39.906594 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1013fb54-2e92-43cf-85bd-28e2bc2f7852-kolla-config\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.906620 master-0 kubenswrapper[29612]: I0319 12:25:39.906612 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1013fb54-2e92-43cf-85bd-28e2bc2f7852-config-data\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.924175 master-0 kubenswrapper[29612]: I0319 12:25:39.924104 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1013fb54-2e92-43cf-85bd-28e2bc2f7852-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.925540 master-0 kubenswrapper[29612]: I0319 12:25:39.925492 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlk9s\" (UniqueName: \"kubernetes.io/projected/1013fb54-2e92-43cf-85bd-28e2bc2f7852-kube-api-access-vlk9s\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:39.938247 master-0 kubenswrapper[29612]: I0319 12:25:39.938107 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1013fb54-2e92-43cf-85bd-28e2bc2f7852-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1013fb54-2e92-43cf-85bd-28e2bc2f7852\") " pod="openstack/memcached-0" Mar 19 12:25:40.014668 master-0 kubenswrapper[29612]: I0319 12:25:40.013933 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 12:25:41.224359 master-0 kubenswrapper[29612]: I0319 12:25:41.222349 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 12:25:41.227432 master-0 kubenswrapper[29612]: I0319 12:25:41.227392 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.242955 master-0 kubenswrapper[29612]: I0319 12:25:41.242744 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 19 12:25:41.245394 master-0 kubenswrapper[29612]: I0319 12:25:41.245338 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 19 12:25:41.245513 master-0 kubenswrapper[29612]: I0319 12:25:41.245440 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 19 12:25:41.248009 master-0 kubenswrapper[29612]: I0319 12:25:41.245561 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 19 12:25:41.248009 master-0 kubenswrapper[29612]: I0319 12:25:41.247784 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 19 12:25:41.257435 master-0 kubenswrapper[29612]: I0319 12:25:41.257346 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 19 12:25:41.269458 master-0 kubenswrapper[29612]: I0319 12:25:41.269153 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 12:25:41.345655 master-0 kubenswrapper[29612]: I0319 12:25:41.345400 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5zgd\" (UniqueName: \"kubernetes.io/projected/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-kube-api-access-m5zgd\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.345655 master-0 kubenswrapper[29612]: I0319 12:25:41.345483 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.345655 master-0 kubenswrapper[29612]: I0319 12:25:41.345523 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-62bc26d9-fe8d-439e-b57f-111e0732b8f3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^532fd2c3-3df3-49e4-be2c-fbef9abf2e17\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.345655 master-0 kubenswrapper[29612]: I0319 12:25:41.345579 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.346030 master-0 kubenswrapper[29612]: I0319 12:25:41.345720 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.346030 master-0 kubenswrapper[29612]: I0319 12:25:41.345894 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.346030 master-0 kubenswrapper[29612]: I0319 12:25:41.345936 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.346030 master-0 kubenswrapper[29612]: I0319 12:25:41.345978 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.346030 master-0 kubenswrapper[29612]: I0319 12:25:41.346005 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.346249 master-0 kubenswrapper[29612]: I0319 12:25:41.346158 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.346301 master-0 kubenswrapper[29612]: I0319 12:25:41.346255 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-config-data\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.452531 master-0 kubenswrapper[29612]: I0319 12:25:41.452454 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.452531 master-0 kubenswrapper[29612]: I0319 12:25:41.452524 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.452809 master-0 kubenswrapper[29612]: I0319 12:25:41.452569 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.452809 master-0 kubenswrapper[29612]: I0319 12:25:41.452605 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-config-data\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.452809 master-0 kubenswrapper[29612]: I0319 12:25:41.452707 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5zgd\" (UniqueName: \"kubernetes.io/projected/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-kube-api-access-m5zgd\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.452809 master-0 kubenswrapper[29612]: I0319 12:25:41.452738 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.452809 master-0 kubenswrapper[29612]: I0319 12:25:41.452763 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-62bc26d9-fe8d-439e-b57f-111e0732b8f3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^532fd2c3-3df3-49e4-be2c-fbef9abf2e17\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.452809 master-0 kubenswrapper[29612]: I0319 12:25:41.452807 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.453074 master-0 kubenswrapper[29612]: I0319 12:25:41.452839 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.453074 master-0 kubenswrapper[29612]: I0319 12:25:41.452886 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.453074 master-0 kubenswrapper[29612]: I0319 12:25:41.452911 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.458325 master-0 kubenswrapper[29612]: I0319 12:25:41.456833 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.458325 master-0 kubenswrapper[29612]: I0319 12:25:41.456834 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.458325 master-0 kubenswrapper[29612]: I0319 12:25:41.456982 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.460347 master-0 kubenswrapper[29612]: I0319 12:25:41.460291 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-config-data\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.460750 master-0 kubenswrapper[29612]: I0319 12:25:41.460603 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 12:25:41.460750 master-0 kubenswrapper[29612]: I0319 12:25:41.460652 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-62bc26d9-fe8d-439e-b57f-111e0732b8f3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^532fd2c3-3df3-49e4-be2c-fbef9abf2e17\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/f16fd7afd4abe063783223fe4c5f7b3e0cbdd9a9795b2f9852ef1bd2f8636fdf/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.460750 master-0 kubenswrapper[29612]: I0319 12:25:41.460664 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.462792 master-0 kubenswrapper[29612]: I0319 12:25:41.462738 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.463278 master-0 kubenswrapper[29612]: I0319 12:25:41.463246 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.470566 master-0 kubenswrapper[29612]: I0319 12:25:41.470518 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.478751 master-0 kubenswrapper[29612]: I0319 12:25:41.478588 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:41.491768 master-0 kubenswrapper[29612]: I0319 12:25:41.491502 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5zgd\" (UniqueName: \"kubernetes.io/projected/ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f-kube-api-access-m5zgd\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0" Mar 19 12:25:43.348609 master-0 kubenswrapper[29612]: I0319 12:25:43.348522 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 12:25:43.352365 master-0 kubenswrapper[29612]: I0319 12:25:43.352311 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.365851 master-0 kubenswrapper[29612]: I0319 12:25:43.365791 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 19 12:25:43.368487 master-0 kubenswrapper[29612]: I0319 12:25:43.368292 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 19 12:25:43.370346 master-0 kubenswrapper[29612]: I0319 12:25:43.368879 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 19 12:25:43.370346 master-0 kubenswrapper[29612]: I0319 12:25:43.369727 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 19 12:25:43.370551 master-0 kubenswrapper[29612]: I0319 12:25:43.370490 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 19 12:25:43.370551 master-0 kubenswrapper[29612]: I0319 12:25:43.370511 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 19 12:25:43.443620 master-0 kubenswrapper[29612]: I0319 12:25:43.442706 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 12:25:43.498113 master-0 kubenswrapper[29612]: I0319 12:25:43.498039 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngb47\" (UniqueName: \"kubernetes.io/projected/58f5d134-96c4-49cc-890c-06251f80d2dd-kube-api-access-ngb47\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.498339 master-0 kubenswrapper[29612]: I0319 12:25:43.498126 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58f5d134-96c4-49cc-890c-06251f80d2dd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.498339 master-0 kubenswrapper[29612]: I0319 12:25:43.498147 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58f5d134-96c4-49cc-890c-06251f80d2dd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.498339 master-0 kubenswrapper[29612]: I0319 12:25:43.498178 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.498339 master-0 kubenswrapper[29612]: I0319 12:25:43.498209 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58f5d134-96c4-49cc-890c-06251f80d2dd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.498339 master-0 kubenswrapper[29612]: I0319 12:25:43.498227 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.498339 master-0 kubenswrapper[29612]: I0319 12:25:43.498267 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58f5d134-96c4-49cc-890c-06251f80d2dd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.498339 master-0 kubenswrapper[29612]: I0319 12:25:43.498304 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58f5d134-96c4-49cc-890c-06251f80d2dd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.498339 master-0 kubenswrapper[29612]: I0319 12:25:43.498319 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.498811 master-0 kubenswrapper[29612]: I0319 12:25:43.498352 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.498811 master-0 kubenswrapper[29612]: I0319 12:25:43.498412 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ce75b151-0cb4-4262-9de1-62361a766f2d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c8bb091b-b17c-4bf2-9f09-7772f0f4c6f9\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.603372 master-0 kubenswrapper[29612]: I0319 12:25:43.603194 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.603772 master-0 kubenswrapper[29612]: I0319 12:25:43.603697 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ce75b151-0cb4-4262-9de1-62361a766f2d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c8bb091b-b17c-4bf2-9f09-7772f0f4c6f9\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.603930 master-0 kubenswrapper[29612]: I0319 12:25:43.603892 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngb47\" (UniqueName: \"kubernetes.io/projected/58f5d134-96c4-49cc-890c-06251f80d2dd-kube-api-access-ngb47\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.604281 master-0 kubenswrapper[29612]: I0319 12:25:43.604027 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58f5d134-96c4-49cc-890c-06251f80d2dd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.604281 master-0 kubenswrapper[29612]: I0319 12:25:43.604057 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58f5d134-96c4-49cc-890c-06251f80d2dd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.604281 master-0 kubenswrapper[29612]: I0319 12:25:43.604098 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.604281 master-0 kubenswrapper[29612]: I0319 12:25:43.604171 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58f5d134-96c4-49cc-890c-06251f80d2dd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.604281 master-0 kubenswrapper[29612]: I0319 12:25:43.604195 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.604281 master-0 kubenswrapper[29612]: I0319 12:25:43.604266 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58f5d134-96c4-49cc-890c-06251f80d2dd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.604500 master-0 kubenswrapper[29612]: I0319 12:25:43.604356 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58f5d134-96c4-49cc-890c-06251f80d2dd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.604500 master-0 kubenswrapper[29612]: I0319 12:25:43.604385 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.606600 master-0 kubenswrapper[29612]: I0319 12:25:43.606442 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/58f5d134-96c4-49cc-890c-06251f80d2dd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.607167 master-0 kubenswrapper[29612]: I0319 12:25:43.607124 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.608135 master-0 kubenswrapper[29612]: I0319 12:25:43.608095 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 12:25:43.608224 master-0 kubenswrapper[29612]: I0319 12:25:43.608149 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ce75b151-0cb4-4262-9de1-62361a766f2d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c8bb091b-b17c-4bf2-9f09-7772f0f4c6f9\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/3903286a1d5c8e2e0cc836b6f2e65958ca55179be50d3143b5423b99633468b7/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.608285 master-0 kubenswrapper[29612]: I0319 12:25:43.608095 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.608843 master-0 kubenswrapper[29612]: I0319 12:25:43.608811 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/58f5d134-96c4-49cc-890c-06251f80d2dd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.613168 master-0 kubenswrapper[29612]: I0319 12:25:43.613135 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.613519 master-0 kubenswrapper[29612]: I0319 12:25:43.613475 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/58f5d134-96c4-49cc-890c-06251f80d2dd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.614067 master-0 kubenswrapper[29612]: I0319 12:25:43.614026 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/58f5d134-96c4-49cc-890c-06251f80d2dd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.615458 master-0 kubenswrapper[29612]: I0319 12:25:43.615415 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/58f5d134-96c4-49cc-890c-06251f80d2dd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.632988 master-0 kubenswrapper[29612]: I0319 12:25:43.632933 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/58f5d134-96c4-49cc-890c-06251f80d2dd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:43.634945 master-0 kubenswrapper[29612]: I0319 12:25:43.634871 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngb47\" (UniqueName: \"kubernetes.io/projected/58f5d134-96c4-49cc-890c-06251f80d2dd-kube-api-access-ngb47\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:25:44.127658 master-0 kubenswrapper[29612]: I0319 12:25:44.113626 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-62bc26d9-fe8d-439e-b57f-111e0732b8f3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^532fd2c3-3df3-49e4-be2c-fbef9abf2e17\") pod \"rabbitmq-server-0\" (UID: \"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f\") " pod="openstack/rabbitmq-server-0"
Mar 19 12:25:44.311139 master-0 kubenswrapper[29612]: I0319 12:25:44.298988 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 19 12:25:44.659387 master-0 kubenswrapper[29612]: I0319 12:25:44.659300 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 19 12:25:44.662112 master-0 kubenswrapper[29612]: I0319 12:25:44.662062 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 19 12:25:44.668336 master-0 kubenswrapper[29612]: I0319 12:25:44.668283 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 19 12:25:44.668514 master-0 kubenswrapper[29612]: I0319 12:25:44.668483 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 19 12:25:44.669835 master-0 kubenswrapper[29612]: I0319 12:25:44.669770 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 19 12:25:44.685710 master-0 kubenswrapper[29612]: I0319 12:25:44.685604 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 19 12:25:44.785824 master-0 kubenswrapper[29612]: I0319 12:25:44.785768 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-config-data-default\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.786040 master-0 kubenswrapper[29612]: I0319 12:25:44.785871 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c42gz\" (UniqueName: \"kubernetes.io/projected/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-kube-api-access-c42gz\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.786040 master-0 kubenswrapper[29612]: I0319 12:25:44.785955 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-kolla-config\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.786040 master-0 kubenswrapper[29612]: I0319 12:25:44.786003 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.786040 master-0 kubenswrapper[29612]: I0319 12:25:44.786034 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6ba40fcf-3de7-46ea-9a72-8fe72c8d5512\" (UniqueName: \"kubernetes.io/csi/topolvm.io^33292899-7197-403b-9302-e37d8faccb76\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.786762 master-0 kubenswrapper[29612]: I0319 12:25:44.786388 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.787028 master-0 kubenswrapper[29612]: I0319 12:25:44.786932 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.787251 master-0 kubenswrapper[29612]: I0319 12:25:44.787201 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.889792 master-0 kubenswrapper[29612]: I0319 12:25:44.889724 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.889792 master-0 kubenswrapper[29612]: I0319 12:25:44.889798 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6ba40fcf-3de7-46ea-9a72-8fe72c8d5512\" (UniqueName: \"kubernetes.io/csi/topolvm.io^33292899-7197-403b-9302-e37d8faccb76\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.890050 master-0 kubenswrapper[29612]: I0319 12:25:44.889827 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.890050 master-0 kubenswrapper[29612]: I0319 12:25:44.889844 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.890050 master-0 kubenswrapper[29612]: I0319 12:25:44.889862 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.890050 master-0 kubenswrapper[29612]: I0319 12:25:44.889951 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-config-data-default\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.890050 master-0 kubenswrapper[29612]: I0319 12:25:44.890000 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c42gz\" (UniqueName: \"kubernetes.io/projected/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-kube-api-access-c42gz\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.890050 master-0 kubenswrapper[29612]: I0319 12:25:44.890042 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-kolla-config\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.890826 master-0 kubenswrapper[29612]: I0319 12:25:44.890752 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-kolla-config\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.891271 master-0 kubenswrapper[29612]: I0319 12:25:44.891214 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.891407 master-0 kubenswrapper[29612]: I0319 12:25:44.891359 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.892471 master-0 kubenswrapper[29612]: I0319 12:25:44.892428 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-config-data-default\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.893201 master-0 kubenswrapper[29612]: I0319 12:25:44.893138 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 12:25:44.893263 master-0 kubenswrapper[29612]: I0319 12:25:44.893192 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6ba40fcf-3de7-46ea-9a72-8fe72c8d5512\" (UniqueName: \"kubernetes.io/csi/topolvm.io^33292899-7197-403b-9302-e37d8faccb76\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/8ad096faf73d9e0cf536564945d5b84fec650082c68d79719fbf2c5089391966/globalmount\"" pod="openstack/openstack-galera-0"
Mar 19 12:25:44.910119 master-0 kubenswrapper[29612]: I0319 12:25:44.910000 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.916523 master-0 kubenswrapper[29612]: I0319 12:25:44.916445 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c42gz\" (UniqueName: \"kubernetes.io/projected/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-kube-api-access-c42gz\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:44.922174 master-0 kubenswrapper[29612]: I0319 12:25:44.922127 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8fe68fc-3f50-4352-b9b5-c2b597c7d88c-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0"
Mar 19 12:25:45.231043 master-0 kubenswrapper[29612]: I0319 12:25:45.230929 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zb6lv"]
Mar 19 12:25:45.241396 master-0 kubenswrapper[29612]: I0319 12:25:45.241343 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.244134 master-0 kubenswrapper[29612]: I0319 12:25:45.244108 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 19 12:25:45.244474 master-0 kubenswrapper[29612]: I0319 12:25:45.244460 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 19 12:25:45.267906 master-0 kubenswrapper[29612]: I0319 12:25:45.267833 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zb6lv"]
Mar 19 12:25:45.286271 master-0 kubenswrapper[29612]: I0319 12:25:45.286208 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8gh4l"]
Mar 19 12:25:45.298649 master-0 kubenswrapper[29612]: I0319 12:25:45.298577 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:25:45.315045 master-0 kubenswrapper[29612]: I0319 12:25:45.314978 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8gh4l"]
Mar 19 12:25:45.315947 master-0 kubenswrapper[29612]: I0319 12:25:45.315910 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c736a7-20fb-4ca6-9870-d5f978444dcc-var-log-ovn\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.316009 master-0 kubenswrapper[29612]: I0319 12:25:45.315989 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl544\" (UniqueName: \"kubernetes.io/projected/a0c736a7-20fb-4ca6-9870-d5f978444dcc-kube-api-access-cl544\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.316114 master-0 kubenswrapper[29612]: I0319 12:25:45.316014 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c736a7-20fb-4ca6-9870-d5f978444dcc-var-run-ovn\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.316114 master-0 kubenswrapper[29612]: I0319 12:25:45.316037 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0c736a7-20fb-4ca6-9870-d5f978444dcc-scripts\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.316114 master-0 kubenswrapper[29612]: I0319 12:25:45.316054 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0c736a7-20fb-4ca6-9870-d5f978444dcc-ovn-controller-tls-certs\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.316199 master-0 kubenswrapper[29612]: I0319 12:25:45.316115 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a0c736a7-20fb-4ca6-9870-d5f978444dcc-var-run\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.316199 master-0 kubenswrapper[29612]: I0319 12:25:45.316147 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c736a7-20fb-4ca6-9870-d5f978444dcc-combined-ca-bundle\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.418154 master-0 kubenswrapper[29612]: I0319 12:25:45.418088 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-var-run\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:25:45.418154 master-0 kubenswrapper[29612]: I0319 12:25:45.418152 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl544\" (UniqueName: \"kubernetes.io/projected/a0c736a7-20fb-4ca6-9870-d5f978444dcc-kube-api-access-cl544\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.418154 master-0 kubenswrapper[29612]: I0319 12:25:45.418181 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-var-lib\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:25:45.418447 master-0 kubenswrapper[29612]: I0319 12:25:45.418279 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c736a7-20fb-4ca6-9870-d5f978444dcc-var-run-ovn\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.418447 master-0 kubenswrapper[29612]: I0319 12:25:45.418313 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-etc-ovs\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:25:45.418447 master-0 kubenswrapper[29612]: I0319 12:25:45.418402 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0c736a7-20fb-4ca6-9870-d5f978444dcc-scripts\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.418542 master-0 kubenswrapper[29612]: I0319 12:25:45.418490 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0c736a7-20fb-4ca6-9870-d5f978444dcc-ovn-controller-tls-certs\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.418817 master-0 kubenswrapper[29612]: I0319 12:25:45.418785 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-var-log\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:25:45.418903 master-0 kubenswrapper[29612]: I0319 12:25:45.418853 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a0c736a7-20fb-4ca6-9870-d5f978444dcc-var-run\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.418903 master-0 kubenswrapper[29612]: I0319 12:25:45.418866 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c736a7-20fb-4ca6-9870-d5f978444dcc-var-run-ovn\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.418975 master-0 kubenswrapper[29612]: I0319 12:25:45.418923 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c736a7-20fb-4ca6-9870-d5f978444dcc-combined-ca-bundle\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.419628 master-0 kubenswrapper[29612]: I0319 12:25:45.419139 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a0c736a7-20fb-4ca6-9870-d5f978444dcc-var-run\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.419628 master-0 kubenswrapper[29612]: I0319 12:25:45.419204 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-scripts\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:25:45.419628 master-0 kubenswrapper[29612]: I0319 12:25:45.419416 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c736a7-20fb-4ca6-9870-d5f978444dcc-var-log-ovn\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.419628 master-0 kubenswrapper[29612]: I0319 12:25:45.419448 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhdp\" (UniqueName: \"kubernetes.io/projected/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-kube-api-access-kbhdp\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:25:45.419628 master-0 kubenswrapper[29612]: I0319 12:25:45.419582 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c736a7-20fb-4ca6-9870-d5f978444dcc-var-log-ovn\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.421279 master-0 kubenswrapper[29612]: I0319 12:25:45.421235 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0c736a7-20fb-4ca6-9870-d5f978444dcc-scripts\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.423697 master-0 kubenswrapper[29612]: I0319 12:25:45.422989 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c736a7-20fb-4ca6-9870-d5f978444dcc-combined-ca-bundle\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.436022 master-0 kubenswrapper[29612]: I0319 12:25:45.435919 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0c736a7-20fb-4ca6-9870-d5f978444dcc-ovn-controller-tls-certs\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.436022 master-0 kubenswrapper[29612]: I0319 12:25:45.436015 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl544\" (UniqueName: \"kubernetes.io/projected/a0c736a7-20fb-4ca6-9870-d5f978444dcc-kube-api-access-cl544\") pod \"ovn-controller-zb6lv\" (UID: \"a0c736a7-20fb-4ca6-9870-d5f978444dcc\") " pod="openstack/ovn-controller-zb6lv"
Mar 19 12:25:45.521616 master-0 kubenswrapper[29612]: I0319 12:25:45.521455 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName:
\"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-var-log\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.521616 master-0 kubenswrapper[29612]: I0319 12:25:45.521532 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-scripts\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.521959 master-0 kubenswrapper[29612]: I0319 12:25:45.521742 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbhdp\" (UniqueName: \"kubernetes.io/projected/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-kube-api-access-kbhdp\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.521959 master-0 kubenswrapper[29612]: I0319 12:25:45.521743 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-var-log\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.521959 master-0 kubenswrapper[29612]: I0319 12:25:45.521840 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-var-run\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.521959 master-0 kubenswrapper[29612]: I0319 12:25:45.521862 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-var-lib\") 
pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.521959 master-0 kubenswrapper[29612]: I0319 12:25:45.521896 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-etc-ovs\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.522231 master-0 kubenswrapper[29612]: I0319 12:25:45.522003 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-var-run\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.522231 master-0 kubenswrapper[29612]: I0319 12:25:45.522145 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-etc-ovs\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.522358 master-0 kubenswrapper[29612]: I0319 12:25:45.522243 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-var-lib\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.523687 master-0 kubenswrapper[29612]: I0319 12:25:45.523656 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-scripts\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" 
Mar 19 12:25:45.541716 master-0 kubenswrapper[29612]: I0319 12:25:45.539999 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhdp\" (UniqueName: \"kubernetes.io/projected/9bdbf1ef-8fcf-420d-bf0f-62a694b14df8-kube-api-access-kbhdp\") pod \"ovn-controller-ovs-8gh4l\" (UID: \"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8\") " pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:45.579127 master-0 kubenswrapper[29612]: I0319 12:25:45.579061 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zb6lv" Mar 19 12:25:45.618064 master-0 kubenswrapper[29612]: I0319 12:25:45.617982 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8gh4l" Mar 19 12:25:46.747207 master-0 kubenswrapper[29612]: I0319 12:25:46.743423 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ce75b151-0cb4-4262-9de1-62361a766f2d\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c8bb091b-b17c-4bf2-9f09-7772f0f4c6f9\") pod \"rabbitmq-cell1-server-0\" (UID: \"58f5d134-96c4-49cc-890c-06251f80d2dd\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 12:25:47.842897 master-0 kubenswrapper[29612]: I0319 12:25:47.842672 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6ba40fcf-3de7-46ea-9a72-8fe72c8d5512\" (UniqueName: \"kubernetes.io/csi/topolvm.io^33292899-7197-403b-9302-e37d8faccb76\") pod \"openstack-galera-0\" (UID: \"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c\") " pod="openstack/openstack-galera-0" Mar 19 12:25:48.513450 master-0 kubenswrapper[29612]: I0319 12:25:48.513399 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 12:25:49.209934 master-0 kubenswrapper[29612]: I0319 12:25:49.209868 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 12:25:49.365972 master-0 kubenswrapper[29612]: I0319 12:25:49.365578 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 12:25:49.368153 master-0 kubenswrapper[29612]: I0319 12:25:49.367754 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.377231 master-0 kubenswrapper[29612]: I0319 12:25:49.377185 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 19 12:25:49.377455 master-0 kubenswrapper[29612]: I0319 12:25:49.377420 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 19 12:25:49.377990 master-0 kubenswrapper[29612]: I0319 12:25:49.377955 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 19 12:25:49.430667 master-0 kubenswrapper[29612]: I0319 12:25:49.430419 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 12:25:49.520100 master-0 kubenswrapper[29612]: I0319 12:25:49.519929 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c62b6b3b-400d-4268-bc4f-cfaf95665cea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.520100 master-0 kubenswrapper[29612]: I0319 12:25:49.520013 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c62b6b3b-400d-4268-bc4f-cfaf95665cea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 
12:25:49.521021 master-0 kubenswrapper[29612]: I0319 12:25:49.520197 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c62b6b3b-400d-4268-bc4f-cfaf95665cea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.521021 master-0 kubenswrapper[29612]: I0319 12:25:49.520312 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c62b6b3b-400d-4268-bc4f-cfaf95665cea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.521021 master-0 kubenswrapper[29612]: I0319 12:25:49.520399 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-498b3e0a-32b0-4bce-bf83-175f747f678b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ffc9290c-e802-4766-b449-89dd1838249f\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.521199 master-0 kubenswrapper[29612]: I0319 12:25:49.521135 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c62b6b3b-400d-4268-bc4f-cfaf95665cea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.521262 master-0 kubenswrapper[29612]: I0319 12:25:49.521205 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c62b6b3b-400d-4268-bc4f-cfaf95665cea-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.521262 master-0 kubenswrapper[29612]: I0319 12:25:49.521228 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8w94\" (UniqueName: \"kubernetes.io/projected/c62b6b3b-400d-4268-bc4f-cfaf95665cea-kube-api-access-g8w94\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.623485 master-0 kubenswrapper[29612]: I0319 12:25:49.622558 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-498b3e0a-32b0-4bce-bf83-175f747f678b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ffc9290c-e802-4766-b449-89dd1838249f\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.623485 master-0 kubenswrapper[29612]: I0319 12:25:49.622720 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c62b6b3b-400d-4268-bc4f-cfaf95665cea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.623485 master-0 kubenswrapper[29612]: I0319 12:25:49.622748 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c62b6b3b-400d-4268-bc4f-cfaf95665cea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.623485 master-0 kubenswrapper[29612]: I0319 12:25:49.622765 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8w94\" (UniqueName: 
\"kubernetes.io/projected/c62b6b3b-400d-4268-bc4f-cfaf95665cea-kube-api-access-g8w94\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.623485 master-0 kubenswrapper[29612]: I0319 12:25:49.622833 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c62b6b3b-400d-4268-bc4f-cfaf95665cea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.623485 master-0 kubenswrapper[29612]: I0319 12:25:49.622875 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c62b6b3b-400d-4268-bc4f-cfaf95665cea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.623485 master-0 kubenswrapper[29612]: I0319 12:25:49.622897 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c62b6b3b-400d-4268-bc4f-cfaf95665cea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.623485 master-0 kubenswrapper[29612]: I0319 12:25:49.622922 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c62b6b3b-400d-4268-bc4f-cfaf95665cea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.624359 master-0 kubenswrapper[29612]: I0319 12:25:49.624321 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/c62b6b3b-400d-4268-bc4f-cfaf95665cea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.625557 master-0 kubenswrapper[29612]: I0319 12:25:49.625497 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c62b6b3b-400d-4268-bc4f-cfaf95665cea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.626398 master-0 kubenswrapper[29612]: I0319 12:25:49.626359 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c62b6b3b-400d-4268-bc4f-cfaf95665cea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.626501 master-0 kubenswrapper[29612]: I0319 12:25:49.626399 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 12:25:49.626501 master-0 kubenswrapper[29612]: I0319 12:25:49.626427 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-498b3e0a-32b0-4bce-bf83-175f747f678b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ffc9290c-e802-4766-b449-89dd1838249f\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/461c93fb1e12865d055d92b543e68e1f41d0788daa2dfeaf06fed00beb82d7f9/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.627619 master-0 kubenswrapper[29612]: I0319 12:25:49.627474 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c62b6b3b-400d-4268-bc4f-cfaf95665cea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.631873 master-0 kubenswrapper[29612]: I0319 12:25:49.628110 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c62b6b3b-400d-4268-bc4f-cfaf95665cea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.631873 master-0 kubenswrapper[29612]: I0319 12:25:49.630091 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c62b6b3b-400d-4268-bc4f-cfaf95665cea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:49.645603 master-0 kubenswrapper[29612]: I0319 12:25:49.645554 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8w94\" (UniqueName: 
\"kubernetes.io/projected/c62b6b3b-400d-4268-bc4f-cfaf95665cea-kube-api-access-g8w94\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:50.836825 master-0 kubenswrapper[29612]: I0319 12:25:50.836753 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-498b3e0a-32b0-4bce-bf83-175f747f678b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ffc9290c-e802-4766-b449-89dd1838249f\") pod \"openstack-cell1-galera-0\" (UID: \"c62b6b3b-400d-4268-bc4f-cfaf95665cea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:50.916151 master-0 kubenswrapper[29612]: I0319 12:25:50.916089 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 12:25:50.976548 master-0 kubenswrapper[29612]: I0319 12:25:50.976486 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 12:25:50.978851 master-0 kubenswrapper[29612]: I0319 12:25:50.978393 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:50.980805 master-0 kubenswrapper[29612]: I0319 12:25:50.980680 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 19 12:25:50.980971 master-0 kubenswrapper[29612]: I0319 12:25:50.980962 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 19 12:25:50.981113 master-0 kubenswrapper[29612]: I0319 12:25:50.981090 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 19 12:25:50.981347 master-0 kubenswrapper[29612]: I0319 12:25:50.981315 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 19 12:25:51.001668 master-0 kubenswrapper[29612]: I0319 12:25:51.001583 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 12:25:51.164267 master-0 kubenswrapper[29612]: I0319 12:25:51.164206 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.164476 master-0 kubenswrapper[29612]: I0319 12:25:51.164295 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.164476 master-0 kubenswrapper[29612]: I0319 12:25:51.164326 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.164476 master-0 kubenswrapper[29612]: I0319 12:25:51.164351 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtrfn\" (UniqueName: \"kubernetes.io/projected/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-kube-api-access-jtrfn\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.164476 master-0 kubenswrapper[29612]: I0319 12:25:51.164381 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.164476 master-0 kubenswrapper[29612]: I0319 12:25:51.164396 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.164476 master-0 kubenswrapper[29612]: I0319 12:25:51.164427 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.164476 master-0 kubenswrapper[29612]: I0319 12:25:51.164460 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-5ef2eed4-398c-4cc0-878a-704f381459ab\" (UniqueName: \"kubernetes.io/csi/topolvm.io^fc21302d-fc69-4770-b16e-6a10491b0acc\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.267032 master-0 kubenswrapper[29612]: I0319 12:25:51.266905 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.267032 master-0 kubenswrapper[29612]: I0319 12:25:51.266997 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.267254 master-0 kubenswrapper[29612]: I0319 12:25:51.267035 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.267254 master-0 kubenswrapper[29612]: I0319 12:25:51.267072 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtrfn\" (UniqueName: \"kubernetes.io/projected/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-kube-api-access-jtrfn\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.267254 master-0 kubenswrapper[29612]: I0319 12:25:51.267105 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.267254 master-0 kubenswrapper[29612]: I0319 12:25:51.267125 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.267254 master-0 kubenswrapper[29612]: I0319 12:25:51.267156 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.267254 master-0 kubenswrapper[29612]: I0319 12:25:51.267186 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5ef2eed4-398c-4cc0-878a-704f381459ab\" (UniqueName: \"kubernetes.io/csi/topolvm.io^fc21302d-fc69-4770-b16e-6a10491b0acc\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.269472 master-0 kubenswrapper[29612]: I0319 12:25:51.269436 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.269572 master-0 kubenswrapper[29612]: I0319 12:25:51.269502 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.269992 master-0 kubenswrapper[29612]: I0319 12:25:51.269952 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.271606 master-0 kubenswrapper[29612]: I0319 12:25:51.271186 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 12:25:51.271606 master-0 kubenswrapper[29612]: I0319 12:25:51.271410 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.271778 master-0 kubenswrapper[29612]: I0319 12:25:51.271413 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5ef2eed4-398c-4cc0-878a-704f381459ab\" (UniqueName: \"kubernetes.io/csi/topolvm.io^fc21302d-fc69-4770-b16e-6a10491b0acc\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/86f51dd58abe3d66da32d9de3ce1cfcf9ab0400053b32e5f18c623f3f9fc945b/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.281956 master-0 kubenswrapper[29612]: I0319 12:25:51.280355 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.287141 master-0 
kubenswrapper[29612]: I0319 12:25:51.287088 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:51.294960 master-0 kubenswrapper[29612]: I0319 12:25:51.294920 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtrfn\" (UniqueName: \"kubernetes.io/projected/a8bc9a3b-3a15-41cc-b646-316fbdf146a2-kube-api-access-jtrfn\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:52.438658 master-0 kubenswrapper[29612]: I0319 12:25:52.438552 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 12:25:52.440262 master-0 kubenswrapper[29612]: I0319 12:25:52.440216 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.444752 master-0 kubenswrapper[29612]: I0319 12:25:52.444480 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 19 12:25:52.444752 master-0 kubenswrapper[29612]: I0319 12:25:52.444487 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 19 12:25:52.446014 master-0 kubenswrapper[29612]: I0319 12:25:52.445844 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 19 12:25:52.473009 master-0 kubenswrapper[29612]: I0319 12:25:52.472561 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 12:25:52.596172 master-0 kubenswrapper[29612]: I0319 12:25:52.596111 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wttt\" (UniqueName: \"kubernetes.io/projected/79985401-37f5-4b30-b80a-4440cf5e6701-kube-api-access-8wttt\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.596172 master-0 kubenswrapper[29612]: I0319 12:25:52.596188 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79985401-37f5-4b30-b80a-4440cf5e6701-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.596428 master-0 kubenswrapper[29612]: I0319 12:25:52.596303 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79985401-37f5-4b30-b80a-4440cf5e6701-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" 
Mar 19 12:25:52.596428 master-0 kubenswrapper[29612]: I0319 12:25:52.596340 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79985401-37f5-4b30-b80a-4440cf5e6701-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.596500 master-0 kubenswrapper[29612]: I0319 12:25:52.596446 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79985401-37f5-4b30-b80a-4440cf5e6701-config\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.596500 master-0 kubenswrapper[29612]: I0319 12:25:52.596481 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79985401-37f5-4b30-b80a-4440cf5e6701-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.596560 master-0 kubenswrapper[29612]: I0319 12:25:52.596525 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79985401-37f5-4b30-b80a-4440cf5e6701-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.596593 master-0 kubenswrapper[29612]: I0319 12:25:52.596574 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-043c2b25-e1bb-478a-86f3-329df7beea52\" (UniqueName: \"kubernetes.io/csi/topolvm.io^13893cc4-225e-4e68-b3ba-9c1dc1ee84e1\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 
19 12:25:52.621992 master-0 kubenswrapper[29612]: I0319 12:25:52.621919 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5ef2eed4-398c-4cc0-878a-704f381459ab\" (UniqueName: \"kubernetes.io/csi/topolvm.io^fc21302d-fc69-4770-b16e-6a10491b0acc\") pod \"ovsdbserver-nb-0\" (UID: \"a8bc9a3b-3a15-41cc-b646-316fbdf146a2\") " pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:52.698844 master-0 kubenswrapper[29612]: I0319 12:25:52.698671 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79985401-37f5-4b30-b80a-4440cf5e6701-config\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.698844 master-0 kubenswrapper[29612]: I0319 12:25:52.698766 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79985401-37f5-4b30-b80a-4440cf5e6701-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.698844 master-0 kubenswrapper[29612]: I0319 12:25:52.698829 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79985401-37f5-4b30-b80a-4440cf5e6701-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.699179 master-0 kubenswrapper[29612]: I0319 12:25:52.698887 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-043c2b25-e1bb-478a-86f3-329df7beea52\" (UniqueName: \"kubernetes.io/csi/topolvm.io^13893cc4-225e-4e68-b3ba-9c1dc1ee84e1\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.699179 master-0 kubenswrapper[29612]: I0319 12:25:52.698956 29612 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wttt\" (UniqueName: \"kubernetes.io/projected/79985401-37f5-4b30-b80a-4440cf5e6701-kube-api-access-8wttt\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.699179 master-0 kubenswrapper[29612]: I0319 12:25:52.698983 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79985401-37f5-4b30-b80a-4440cf5e6701-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.699179 master-0 kubenswrapper[29612]: I0319 12:25:52.699072 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79985401-37f5-4b30-b80a-4440cf5e6701-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.699179 master-0 kubenswrapper[29612]: I0319 12:25:52.699104 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79985401-37f5-4b30-b80a-4440cf5e6701-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.700651 master-0 kubenswrapper[29612]: I0319 12:25:52.700528 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79985401-37f5-4b30-b80a-4440cf5e6701-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.700651 master-0 kubenswrapper[29612]: I0319 12:25:52.700583 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/79985401-37f5-4b30-b80a-4440cf5e6701-config\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.701090 master-0 kubenswrapper[29612]: I0319 12:25:52.701049 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79985401-37f5-4b30-b80a-4440cf5e6701-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.702567 master-0 kubenswrapper[29612]: I0319 12:25:52.702341 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 12:25:52.702567 master-0 kubenswrapper[29612]: I0319 12:25:52.702381 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-043c2b25-e1bb-478a-86f3-329df7beea52\" (UniqueName: \"kubernetes.io/csi/topolvm.io^13893cc4-225e-4e68-b3ba-9c1dc1ee84e1\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/b79b35d8330df377cc5145ea6bf69e74966df1f463cf0138ca0cf489e72f4f1f/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.704721 master-0 kubenswrapper[29612]: I0319 12:25:52.704678 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/79985401-37f5-4b30-b80a-4440cf5e6701-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.704983 master-0 kubenswrapper[29612]: I0319 12:25:52.704857 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79985401-37f5-4b30-b80a-4440cf5e6701-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.705373 master-0 kubenswrapper[29612]: I0319 12:25:52.705341 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/79985401-37f5-4b30-b80a-4440cf5e6701-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.725558 master-0 kubenswrapper[29612]: I0319 12:25:52.720323 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wttt\" (UniqueName: \"kubernetes.io/projected/79985401-37f5-4b30-b80a-4440cf5e6701-kube-api-access-8wttt\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:52.823831 master-0 kubenswrapper[29612]: I0319 12:25:52.823755 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 12:25:54.050119 master-0 kubenswrapper[29612]: I0319 12:25:54.050017 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-043c2b25-e1bb-478a-86f3-329df7beea52\" (UniqueName: \"kubernetes.io/csi/topolvm.io^13893cc4-225e-4e68-b3ba-9c1dc1ee84e1\") pod \"ovsdbserver-sb-0\" (UID: \"79985401-37f5-4b30-b80a-4440cf5e6701\") " pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:54.198915 master-0 kubenswrapper[29612]: I0319 12:25:54.198799 29612 trace.go:236] Trace[1966876898]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (19-Mar-2026 12:25:52.876) (total time: 1322ms): Mar 19 12:25:54.198915 master-0 kubenswrapper[29612]: Trace[1966876898]: [1.322351407s] [1.322351407s] END Mar 19 12:25:54.275844 master-0 kubenswrapper[29612]: I0319 12:25:54.275784 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 12:25:54.407658 master-0 kubenswrapper[29612]: I0319 12:25:54.407587 29612 trace.go:236] Trace[1259695072]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (19-Mar-2026 12:25:52.877) (total time: 1530ms): Mar 19 12:25:54.407658 master-0 kubenswrapper[29612]: Trace[1259695072]: [1.530228644s] [1.530228644s] END Mar 19 12:25:54.564452 master-0 kubenswrapper[29612]: I0319 12:25:54.564342 29612 trace.go:236] Trace[961005464]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (19-Mar-2026 12:25:52.878) (total time: 1685ms): Mar 19 12:25:54.564452 master-0 kubenswrapper[29612]: Trace[961005464]: [1.685931187s] [1.685931187s] END Mar 19 12:25:54.728038 master-0 kubenswrapper[29612]: I0319 12:25:54.727939 29612 trace.go:236] Trace[1527780467]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (19-Mar-2026 12:25:52.880) (total time: 1847ms): Mar 19 12:25:54.728038 master-0 kubenswrapper[29612]: Trace[1527780467]: [1.847029495s] [1.847029495s] END Mar 19 12:25:54.759365 master-0 kubenswrapper[29612]: I0319 12:25:54.759288 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 12:25:54.901816 master-0 kubenswrapper[29612]: I0319 12:25:54.901776 29612 trace.go:236] Trace[2046229965]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (19-Mar-2026 12:25:52.877) (total time: 2024ms): Mar 19 12:25:54.901816 master-0 kubenswrapper[29612]: Trace[2046229965]: [2.02419052s] [2.02419052s] END Mar 19 12:25:55.429715 master-0 kubenswrapper[29612]: I0319 12:25:55.426881 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f","Type":"ContainerStarted","Data":"b84a8371a294a6d3aa1eb4cb500dadd48c14f597792ca873e26737b99ef807d9"} Mar 19 12:25:56.945954 master-0 
kubenswrapper[29612]: I0319 12:25:56.945885 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 12:25:56.956163 master-0 kubenswrapper[29612]: I0319 12:25:56.955070 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 12:25:56.964977 master-0 kubenswrapper[29612]: I0319 12:25:56.964920 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 12:25:56.973742 master-0 kubenswrapper[29612]: I0319 12:25:56.973605 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zb6lv"] Mar 19 12:25:56.984168 master-0 kubenswrapper[29612]: I0319 12:25:56.981494 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 12:25:57.141397 master-0 kubenswrapper[29612]: W0319 12:25:57.141342 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8fe68fc_3f50_4352_b9b5_c2b597c7d88c.slice/crio-6b270ffe6ed27d59d2d546c3b41bd161f207ebe8f5e68214aa8b62a6b43ba96a WatchSource:0}: Error finding container 6b270ffe6ed27d59d2d546c3b41bd161f207ebe8f5e68214aa8b62a6b43ba96a: Status 404 returned error can't find the container with id 6b270ffe6ed27d59d2d546c3b41bd161f207ebe8f5e68214aa8b62a6b43ba96a Mar 19 12:25:57.378552 master-0 kubenswrapper[29612]: I0319 12:25:57.378468 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 12:25:57.447137 master-0 kubenswrapper[29612]: I0319 12:25:57.447068 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"79985401-37f5-4b30-b80a-4440cf5e6701","Type":"ContainerStarted","Data":"7c182c006c37e25aed0058513ad42a32c5bffeb3b44cd936c16ed8aaede420a0"} Mar 19 12:25:57.448211 master-0 kubenswrapper[29612]: I0319 12:25:57.448168 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c","Type":"ContainerStarted","Data":"6b270ffe6ed27d59d2d546c3b41bd161f207ebe8f5e68214aa8b62a6b43ba96a"} Mar 19 12:25:57.449392 master-0 kubenswrapper[29612]: I0319 12:25:57.449321 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zb6lv" event={"ID":"a0c736a7-20fb-4ca6-9870-d5f978444dcc","Type":"ContainerStarted","Data":"8b37ca9c5f50019fedcf6d73c33dd12861218b06594102362e087f1e31e28691"} Mar 19 12:25:57.450574 master-0 kubenswrapper[29612]: I0319 12:25:57.450522 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"58f5d134-96c4-49cc-890c-06251f80d2dd","Type":"ContainerStarted","Data":"0ce9ec5ce7a9ae1dc922e99e2fad16b3f8d5ba1a9904c61be7c52ee3934765b2"} Mar 19 12:25:57.459251 master-0 kubenswrapper[29612]: I0319 12:25:57.459004 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" event={"ID":"9c08b55d-d47b-4321-9835-d08d01dc3799","Type":"ContainerStarted","Data":"5f4fd95033bf35acda1314c4bb6b2b1a62bc32d5218b6b0f471397b28a36b49a"} Mar 19 12:25:57.465972 master-0 kubenswrapper[29612]: I0319 12:25:57.465887 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c62b6b3b-400d-4268-bc4f-cfaf95665cea","Type":"ContainerStarted","Data":"e059c1cf21ded7e553c2c7be87d2750dd9d933487dc82f92f15e7c5da67b6460"} Mar 19 12:25:57.469452 master-0 kubenswrapper[29612]: I0319 12:25:57.469324 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1013fb54-2e92-43cf-85bd-28e2bc2f7852","Type":"ContainerStarted","Data":"340887741cd70082a9bb55a86926b9f92d9b3c508a8f9f2baede999366b6aa4d"} Mar 19 12:25:58.079657 master-0 kubenswrapper[29612]: I0319 12:25:58.076654 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8gh4l"] Mar 19 12:25:58.483462 
master-0 kubenswrapper[29612]: I0319 12:25:58.483401 29612 generic.go:334] "Generic (PLEG): container finished" podID="9c08b55d-d47b-4321-9835-d08d01dc3799" containerID="5f4fd95033bf35acda1314c4bb6b2b1a62bc32d5218b6b0f471397b28a36b49a" exitCode=0 Mar 19 12:25:58.483721 master-0 kubenswrapper[29612]: I0319 12:25:58.483478 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" event={"ID":"9c08b55d-d47b-4321-9835-d08d01dc3799","Type":"ContainerDied","Data":"5f4fd95033bf35acda1314c4bb6b2b1a62bc32d5218b6b0f471397b28a36b49a"} Mar 19 12:25:58.487204 master-0 kubenswrapper[29612]: I0319 12:25:58.487174 29612 generic.go:334] "Generic (PLEG): container finished" podID="bd298612-852e-41eb-911c-69ca7bb462d3" containerID="9d06084ce739e1287818a3b246b994fe8729d6a8ab2a2bdb0e4843ae7f362009" exitCode=0 Mar 19 12:25:58.487307 master-0 kubenswrapper[29612]: I0319 12:25:58.487226 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" event={"ID":"bd298612-852e-41eb-911c-69ca7bb462d3","Type":"ContainerDied","Data":"9d06084ce739e1287818a3b246b994fe8729d6a8ab2a2bdb0e4843ae7f362009"} Mar 19 12:25:58.491800 master-0 kubenswrapper[29612]: I0319 12:25:58.491748 29612 generic.go:334] "Generic (PLEG): container finished" podID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" containerID="b94db5c02ecdad5a11e4a437095b82a4126508ce5344495e083709732424906f" exitCode=0 Mar 19 12:25:58.491893 master-0 kubenswrapper[29612]: I0319 12:25:58.491820 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" event={"ID":"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61","Type":"ContainerDied","Data":"b94db5c02ecdad5a11e4a437095b82a4126508ce5344495e083709732424906f"} Mar 19 12:25:58.494223 master-0 kubenswrapper[29612]: I0319 12:25:58.493667 29612 generic.go:334] "Generic (PLEG): container finished" podID="ace422c6-bf8f-469e-872a-101565b54d40" 
containerID="ea20ed9576f8f3f348d4b4e1c7c260b8ccf567192b2f855e1566b2f4039172af" exitCode=0 Mar 19 12:25:58.494223 master-0 kubenswrapper[29612]: I0319 12:25:58.493718 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-j27hk" event={"ID":"ace422c6-bf8f-469e-872a-101565b54d40","Type":"ContainerDied","Data":"ea20ed9576f8f3f348d4b4e1c7c260b8ccf567192b2f855e1566b2f4039172af"} Mar 19 12:25:58.519414 master-0 kubenswrapper[29612]: I0319 12:25:58.518957 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8gh4l" event={"ID":"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8","Type":"ContainerStarted","Data":"d541a090daa47715390686ded819707d0d9220f266164a8a6ca4f825f542002c"} Mar 19 12:25:58.736766 master-0 kubenswrapper[29612]: I0319 12:25:58.736627 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 12:25:58.803403 master-0 kubenswrapper[29612]: E0319 12:25:58.803288 29612 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 19 12:25:58.803403 master-0 kubenswrapper[29612]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 12:25:58.803403 master-0 kubenswrapper[29612]: > podSandboxID="cd585e403c6fcdbe877c1941fe7aa96e1e1ea40fe807a47ecc022d1737748acf" Mar 19 12:25:58.803577 master-0 kubenswrapper[29612]: E0319 12:25:58.803471 29612 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 12:25:58.803577 master-0 kubenswrapper[29612]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbchf8h696h5ffh5cdh585hc5hbfh597h58dhfh554h67bh9bh5c9hfch7dh5fbhbbh567h78h669hf8h65dh55dh588h5ddh88h694h669h95h8q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b585v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000800000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-586dbdbb8c-q5dpz_openstack(87c38b5f-cd6c-4fd3-8697-b8a17f77fc61): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 12:25:58.803577 master-0 kubenswrapper[29612]: > logger="UnhandledError" Mar 19 12:25:58.804722 master-0 kubenswrapper[29612]: E0319 12:25:58.804666 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" podUID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" Mar 19 12:25:59.534630 master-0 kubenswrapper[29612]: I0319 12:25:59.534560 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"a8bc9a3b-3a15-41cc-b646-316fbdf146a2","Type":"ContainerStarted","Data":"72d2c46d221c6d41a42ae614e71918c4741b85e6765c3edb28ce1cd7498ee747"} Mar 19 12:25:59.537295 master-0 kubenswrapper[29612]: I0319 12:25:59.537230 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" event={"ID":"9c08b55d-d47b-4321-9835-d08d01dc3799","Type":"ContainerStarted","Data":"e543aabb787accca962b1ce13a9ae7fa055651fa7949e3b54af3165d5e325571"} Mar 19 12:25:59.569092 master-0 kubenswrapper[29612]: I0319 12:25:59.568994 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" podStartSLOduration=3.621086693 podStartE2EDuration="22.568974437s" podCreationTimestamp="2026-03-19 12:25:37 +0000 UTC" firstStartedPulling="2026-03-19 12:25:38.264279851 +0000 UTC m=+945.578448862" lastFinishedPulling="2026-03-19 12:25:57.212167585 +0000 UTC m=+964.526336606" observedRunningTime="2026-03-19 12:25:59.561292759 +0000 UTC m=+966.875461800" watchObservedRunningTime="2026-03-19 12:25:59.568974437 +0000 UTC m=+966.883143458" Mar 19 12:26:00.565401 master-0 kubenswrapper[29612]: I0319 12:26:00.565328 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:26:00.836735 master-0 kubenswrapper[29612]: I0319 12:26:00.836602 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" Mar 19 12:26:00.881834 master-0 kubenswrapper[29612]: I0319 12:26:00.881779 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-j27hk" Mar 19 12:26:00.929189 master-0 kubenswrapper[29612]: I0319 12:26:00.919162 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4ncn\" (UniqueName: \"kubernetes.io/projected/bd298612-852e-41eb-911c-69ca7bb462d3-kube-api-access-t4ncn\") pod \"bd298612-852e-41eb-911c-69ca7bb462d3\" (UID: \"bd298612-852e-41eb-911c-69ca7bb462d3\") " Mar 19 12:26:00.929189 master-0 kubenswrapper[29612]: I0319 12:26:00.919415 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-dns-svc\") pod \"bd298612-852e-41eb-911c-69ca7bb462d3\" (UID: \"bd298612-852e-41eb-911c-69ca7bb462d3\") " Mar 19 12:26:00.929189 master-0 kubenswrapper[29612]: I0319 12:26:00.919525 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-config\") pod \"bd298612-852e-41eb-911c-69ca7bb462d3\" (UID: \"bd298612-852e-41eb-911c-69ca7bb462d3\") " Mar 19 12:26:00.929189 master-0 kubenswrapper[29612]: I0319 12:26:00.925189 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd298612-852e-41eb-911c-69ca7bb462d3-kube-api-access-t4ncn" (OuterVolumeSpecName: "kube-api-access-t4ncn") pod "bd298612-852e-41eb-911c-69ca7bb462d3" (UID: "bd298612-852e-41eb-911c-69ca7bb462d3"). InnerVolumeSpecName "kube-api-access-t4ncn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:26:00.966683 master-0 kubenswrapper[29612]: I0319 12:26:00.959185 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-476pd"] Mar 19 12:26:00.966683 master-0 kubenswrapper[29612]: E0319 12:26:00.959775 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace422c6-bf8f-469e-872a-101565b54d40" containerName="init" Mar 19 12:26:00.966683 master-0 kubenswrapper[29612]: I0319 12:26:00.959809 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace422c6-bf8f-469e-872a-101565b54d40" containerName="init" Mar 19 12:26:00.966683 master-0 kubenswrapper[29612]: E0319 12:26:00.959843 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd298612-852e-41eb-911c-69ca7bb462d3" containerName="init" Mar 19 12:26:00.966683 master-0 kubenswrapper[29612]: I0319 12:26:00.959869 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd298612-852e-41eb-911c-69ca7bb462d3" containerName="init" Mar 19 12:26:00.966683 master-0 kubenswrapper[29612]: I0319 12:26:00.960156 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd298612-852e-41eb-911c-69ca7bb462d3" containerName="init" Mar 19 12:26:00.966683 master-0 kubenswrapper[29612]: I0319 12:26:00.960184 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace422c6-bf8f-469e-872a-101565b54d40" containerName="init" Mar 19 12:26:00.966683 master-0 kubenswrapper[29612]: I0319 12:26:00.960950 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-476pd"] Mar 19 12:26:00.966683 master-0 kubenswrapper[29612]: I0319 12:26:00.961029 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:00.966683 master-0 kubenswrapper[29612]: I0319 12:26:00.963081 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 19 12:26:00.974102 master-0 kubenswrapper[29612]: I0319 12:26:00.969597 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd298612-852e-41eb-911c-69ca7bb462d3" (UID: "bd298612-852e-41eb-911c-69ca7bb462d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:00.974102 master-0 kubenswrapper[29612]: I0319 12:26:00.971015 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-config" (OuterVolumeSpecName: "config") pod "bd298612-852e-41eb-911c-69ca7bb462d3" (UID: "bd298612-852e-41eb-911c-69ca7bb462d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:01.021502 master-0 kubenswrapper[29612]: I0319 12:26:01.021406 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace422c6-bf8f-469e-872a-101565b54d40-config\") pod \"ace422c6-bf8f-469e-872a-101565b54d40\" (UID: \"ace422c6-bf8f-469e-872a-101565b54d40\") "
Mar 19 12:26:01.021774 master-0 kubenswrapper[29612]: I0319 12:26:01.021584 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzbgw\" (UniqueName: \"kubernetes.io/projected/ace422c6-bf8f-469e-872a-101565b54d40-kube-api-access-tzbgw\") pod \"ace422c6-bf8f-469e-872a-101565b54d40\" (UID: \"ace422c6-bf8f-469e-872a-101565b54d40\") "
Mar 19 12:26:01.022141 master-0 kubenswrapper[29612]: I0319 12:26:01.022106 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:01.022141 master-0 kubenswrapper[29612]: I0319 12:26:01.022128 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd298612-852e-41eb-911c-69ca7bb462d3-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:01.022141 master-0 kubenswrapper[29612]: I0319 12:26:01.022138 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4ncn\" (UniqueName: \"kubernetes.io/projected/bd298612-852e-41eb-911c-69ca7bb462d3-kube-api-access-t4ncn\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:01.045104 master-0 kubenswrapper[29612]: I0319 12:26:01.043853 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace422c6-bf8f-469e-872a-101565b54d40-kube-api-access-tzbgw" (OuterVolumeSpecName: "kube-api-access-tzbgw") pod "ace422c6-bf8f-469e-872a-101565b54d40" (UID: "ace422c6-bf8f-469e-872a-101565b54d40"). InnerVolumeSpecName "kube-api-access-tzbgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:26:01.099962 master-0 kubenswrapper[29612]: I0319 12:26:01.099835 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace422c6-bf8f-469e-872a-101565b54d40-config" (OuterVolumeSpecName: "config") pod "ace422c6-bf8f-469e-872a-101565b54d40" (UID: "ace422c6-bf8f-469e-872a-101565b54d40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:01.128517 master-0 kubenswrapper[29612]: I0319 12:26:01.128364 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d20a75-a002-4f3a-845c-4178e1e305e1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.128517 master-0 kubenswrapper[29612]: I0319 12:26:01.128595 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e7d20a75-a002-4f3a-845c-4178e1e305e1-ovs-rundir\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.128955 master-0 kubenswrapper[29612]: I0319 12:26:01.128933 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e7d20a75-a002-4f3a-845c-4178e1e305e1-ovn-rundir\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.129284 master-0 kubenswrapper[29612]: I0319 12:26:01.128987 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdgkk\" (UniqueName: \"kubernetes.io/projected/e7d20a75-a002-4f3a-845c-4178e1e305e1-kube-api-access-vdgkk\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.129284 master-0 kubenswrapper[29612]: I0319 12:26:01.129027 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d20a75-a002-4f3a-845c-4178e1e305e1-combined-ca-bundle\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.134113 master-0 kubenswrapper[29612]: I0319 12:26:01.129413 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d20a75-a002-4f3a-845c-4178e1e305e1-config\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.134113 master-0 kubenswrapper[29612]: I0319 12:26:01.132549 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace422c6-bf8f-469e-872a-101565b54d40-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:01.134113 master-0 kubenswrapper[29612]: I0319 12:26:01.132566 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzbgw\" (UniqueName: \"kubernetes.io/projected/ace422c6-bf8f-469e-872a-101565b54d40-kube-api-access-tzbgw\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:01.234791 master-0 kubenswrapper[29612]: I0319 12:26:01.234718 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e7d20a75-a002-4f3a-845c-4178e1e305e1-ovn-rundir\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.234791 master-0 kubenswrapper[29612]: I0319 12:26:01.234784 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdgkk\" (UniqueName: \"kubernetes.io/projected/e7d20a75-a002-4f3a-845c-4178e1e305e1-kube-api-access-vdgkk\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.234791 master-0 kubenswrapper[29612]: I0319 12:26:01.234804 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d20a75-a002-4f3a-845c-4178e1e305e1-combined-ca-bundle\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.235118 master-0 kubenswrapper[29612]: I0319 12:26:01.234868 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d20a75-a002-4f3a-845c-4178e1e305e1-config\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.235118 master-0 kubenswrapper[29612]: I0319 12:26:01.234963 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d20a75-a002-4f3a-845c-4178e1e305e1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.235118 master-0 kubenswrapper[29612]: I0319 12:26:01.235000 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e7d20a75-a002-4f3a-845c-4178e1e305e1-ovs-rundir\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.235249 master-0 kubenswrapper[29612]: I0319 12:26:01.235129 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e7d20a75-a002-4f3a-845c-4178e1e305e1-ovs-rundir\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.235249 master-0 kubenswrapper[29612]: I0319 12:26:01.235178 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e7d20a75-a002-4f3a-845c-4178e1e305e1-ovn-rundir\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.238978 master-0 kubenswrapper[29612]: I0319 12:26:01.237053 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7d20a75-a002-4f3a-845c-4178e1e305e1-config\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.243166 master-0 kubenswrapper[29612]: I0319 12:26:01.240277 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7d20a75-a002-4f3a-845c-4178e1e305e1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.245311 master-0 kubenswrapper[29612]: I0319 12:26:01.245175 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d20a75-a002-4f3a-845c-4178e1e305e1-combined-ca-bundle\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.261451 master-0 kubenswrapper[29612]: I0319 12:26:01.261230 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdgkk\" (UniqueName: \"kubernetes.io/projected/e7d20a75-a002-4f3a-845c-4178e1e305e1-kube-api-access-vdgkk\") pod \"ovn-controller-metrics-476pd\" (UID: \"e7d20a75-a002-4f3a-845c-4178e1e305e1\") " pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.285441 master-0 kubenswrapper[29612]: I0319 12:26:01.285308 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-476pd"
Mar 19 12:26:01.366680 master-0 kubenswrapper[29612]: I0319 12:26:01.366539 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-q5dpz"]
Mar 19 12:26:01.409960 master-0 kubenswrapper[29612]: I0319 12:26:01.409882 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5db7b98cb5-vf6p6"]
Mar 19 12:26:01.411432 master-0 kubenswrapper[29612]: I0319 12:26:01.411399 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.424426 master-0 kubenswrapper[29612]: I0319 12:26:01.423777 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 19 12:26:01.480235 master-0 kubenswrapper[29612]: I0319 12:26:01.480163 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db7b98cb5-vf6p6"]
Mar 19 12:26:01.550357 master-0 kubenswrapper[29612]: I0319 12:26:01.550199 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-dns-svc\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.550580 master-0 kubenswrapper[29612]: I0319 12:26:01.550553 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95qgf\" (UniqueName: \"kubernetes.io/projected/c99754a5-0c68-414f-acd8-42b3d1257db6-kube-api-access-95qgf\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.550620 master-0 kubenswrapper[29612]: I0319 12:26:01.550592 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-config\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.550719 master-0 kubenswrapper[29612]: I0319 12:26:01.550699 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-ovsdbserver-nb\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.581457 master-0 kubenswrapper[29612]: I0319 12:26:01.581094 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1013fb54-2e92-43cf-85bd-28e2bc2f7852","Type":"ContainerStarted","Data":"0162b80a2aba58eeb1a6dc4b99a3aa72bfa676a0817f2a567f7d8f0b1693cb95"}
Mar 19 12:26:01.581457 master-0 kubenswrapper[29612]: I0319 12:26:01.581344 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 19 12:26:01.584010 master-0 kubenswrapper[29612]: I0319 12:26:01.583982 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t"
Mar 19 12:26:01.584646 master-0 kubenswrapper[29612]: I0319 12:26:01.584310 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-d7x4t" event={"ID":"bd298612-852e-41eb-911c-69ca7bb462d3","Type":"ContainerDied","Data":"5bb5919c059b0c4f2ad06e9b4bcbc75043f7fa5a5c0bb73859cbad5644e6b502"}
Mar 19 12:26:01.584646 master-0 kubenswrapper[29612]: I0319 12:26:01.584361 29612 scope.go:117] "RemoveContainer" containerID="9d06084ce739e1287818a3b246b994fe8729d6a8ab2a2bdb0e4843ae7f362009"
Mar 19 12:26:01.588508 master-0 kubenswrapper[29612]: I0319 12:26:01.588470 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" event={"ID":"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61","Type":"ContainerStarted","Data":"c60c47965412e56fcde52c231971ff4ca93fd18cb7f0db3af23aa8228c53142c"}
Mar 19 12:26:01.588743 master-0 kubenswrapper[29612]: I0319 12:26:01.588707 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" podUID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" containerName="dnsmasq-dns" containerID="cri-o://c60c47965412e56fcde52c231971ff4ca93fd18cb7f0db3af23aa8228c53142c" gracePeriod=10
Mar 19 12:26:01.588826 master-0 kubenswrapper[29612]: I0319 12:26:01.588806 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz"
Mar 19 12:26:01.601773 master-0 kubenswrapper[29612]: I0319 12:26:01.600420 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-j27hk"
Mar 19 12:26:01.601773 master-0 kubenswrapper[29612]: I0319 12:26:01.601022 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-j27hk" event={"ID":"ace422c6-bf8f-469e-872a-101565b54d40","Type":"ContainerDied","Data":"bfbe8a68b3f7356cd7f13bf4b12ab990b99f3b41d1b825463b575581b0718c77"}
Mar 19 12:26:01.648733 master-0 kubenswrapper[29612]: I0319 12:26:01.637691 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.093170929 podStartE2EDuration="22.637668042s" podCreationTimestamp="2026-03-19 12:25:39 +0000 UTC" firstStartedPulling="2026-03-19 12:25:57.158341625 +0000 UTC m=+964.472510646" lastFinishedPulling="2026-03-19 12:26:00.702838748 +0000 UTC m=+968.017007759" observedRunningTime="2026-03-19 12:26:01.626001101 +0000 UTC m=+968.940170112" watchObservedRunningTime="2026-03-19 12:26:01.637668042 +0000 UTC m=+968.951837063"
Mar 19 12:26:01.660728 master-0 kubenswrapper[29612]: I0319 12:26:01.652925 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95qgf\" (UniqueName: \"kubernetes.io/projected/c99754a5-0c68-414f-acd8-42b3d1257db6-kube-api-access-95qgf\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.660728 master-0 kubenswrapper[29612]: I0319 12:26:01.652986 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-config\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.660728 master-0 kubenswrapper[29612]: I0319 12:26:01.653089 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-ovsdbserver-nb\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.660728 master-0 kubenswrapper[29612]: I0319 12:26:01.653148 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-dns-svc\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.660728 master-0 kubenswrapper[29612]: I0319 12:26:01.655521 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-config\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.660728 master-0 kubenswrapper[29612]: I0319 12:26:01.657331 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-ovsdbserver-nb\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.660728 master-0 kubenswrapper[29612]: I0319 12:26:01.658429 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-dns-svc\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.685324 master-0 kubenswrapper[29612]: I0319 12:26:01.685264 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95qgf\" (UniqueName: \"kubernetes.io/projected/c99754a5-0c68-414f-acd8-42b3d1257db6-kube-api-access-95qgf\") pod \"dnsmasq-dns-5db7b98cb5-vf6p6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.760388 master-0 kubenswrapper[29612]: I0319 12:26:01.759715 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:01.861230 master-0 kubenswrapper[29612]: I0319 12:26:01.861159 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c"]
Mar 19 12:26:01.881776 master-0 kubenswrapper[29612]: I0319 12:26:01.874712 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" podStartSLOduration=6.118749541 podStartE2EDuration="25.874692908s" podCreationTimestamp="2026-03-19 12:25:36 +0000 UTC" firstStartedPulling="2026-03-19 12:25:37.80645139 +0000 UTC m=+945.120620411" lastFinishedPulling="2026-03-19 12:25:57.562394757 +0000 UTC m=+964.876563778" observedRunningTime="2026-03-19 12:26:01.827252179 +0000 UTC m=+969.141421200" watchObservedRunningTime="2026-03-19 12:26:01.874692908 +0000 UTC m=+969.188861929"
Mar 19 12:26:02.234805 master-0 kubenswrapper[29612]: I0319 12:26:02.233435 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57bc987d9f-ch8tp"]
Mar 19 12:26:02.235126 master-0 kubenswrapper[29612]: I0319 12:26:02.235069 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.237908 master-0 kubenswrapper[29612]: I0319 12:26:02.237868 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 19 12:26:02.267527 master-0 kubenswrapper[29612]: I0319 12:26:02.267453 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bc987d9f-ch8tp"]
Mar 19 12:26:02.322199 master-0 kubenswrapper[29612]: I0319 12:26:02.322123 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-d7x4t"]
Mar 19 12:26:02.322199 master-0 kubenswrapper[29612]: I0319 12:26:02.322194 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-d7x4t"]
Mar 19 12:26:02.401089 master-0 kubenswrapper[29612]: I0319 12:26:02.399209 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-nb\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.401089 master-0 kubenswrapper[29612]: I0319 12:26:02.399345 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-sb\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.401089 master-0 kubenswrapper[29612]: I0319 12:26:02.399481 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-dns-svc\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.401089 master-0 kubenswrapper[29612]: I0319 12:26:02.399561 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqgjr\" (UniqueName: \"kubernetes.io/projected/1f94d898-d378-4f9c-b25b-24709df3f010-kube-api-access-bqgjr\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.401089 master-0 kubenswrapper[29612]: I0319 12:26:02.399797 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-config\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.430385 master-0 kubenswrapper[29612]: I0319 12:26:02.429959 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-j27hk"]
Mar 19 12:26:02.483661 master-0 kubenswrapper[29612]: I0319 12:26:02.475793 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-j27hk"]
Mar 19 12:26:02.502199 master-0 kubenswrapper[29612]: I0319 12:26:02.502083 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-config\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.502199 master-0 kubenswrapper[29612]: I0319 12:26:02.502152 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-nb\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.502593 master-0 kubenswrapper[29612]: I0319 12:26:02.502234 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-sb\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.502593 master-0 kubenswrapper[29612]: I0319 12:26:02.502343 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-dns-svc\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.502593 master-0 kubenswrapper[29612]: I0319 12:26:02.502406 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqgjr\" (UniqueName: \"kubernetes.io/projected/1f94d898-d378-4f9c-b25b-24709df3f010-kube-api-access-bqgjr\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.503043 master-0 kubenswrapper[29612]: I0319 12:26:02.503015 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-config\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.503816 master-0 kubenswrapper[29612]: I0319 12:26:02.503713 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-sb\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.504792 master-0 kubenswrapper[29612]: I0319 12:26:02.504467 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-nb\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.504792 master-0 kubenswrapper[29612]: I0319 12:26:02.504743 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-dns-svc\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.528210 master-0 kubenswrapper[29612]: I0319 12:26:02.528118 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqgjr\" (UniqueName: \"kubernetes.io/projected/1f94d898-d378-4f9c-b25b-24709df3f010-kube-api-access-bqgjr\") pod \"dnsmasq-dns-57bc987d9f-ch8tp\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.567953 master-0 kubenswrapper[29612]: I0319 12:26:02.567893 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:02.615539 master-0 kubenswrapper[29612]: I0319 12:26:02.615394 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" podUID="9c08b55d-d47b-4321-9835-d08d01dc3799" containerName="dnsmasq-dns" containerID="cri-o://e543aabb787accca962b1ce13a9ae7fa055651fa7949e3b54af3165d5e325571" gracePeriod=10
Mar 19 12:26:02.920765 master-0 kubenswrapper[29612]: I0319 12:26:02.920696 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace422c6-bf8f-469e-872a-101565b54d40" path="/var/lib/kubelet/pods/ace422c6-bf8f-469e-872a-101565b54d40/volumes"
Mar 19 12:26:02.921518 master-0 kubenswrapper[29612]: I0319 12:26:02.921483 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd298612-852e-41eb-911c-69ca7bb462d3" path="/var/lib/kubelet/pods/bd298612-852e-41eb-911c-69ca7bb462d3/volumes"
Mar 19 12:26:03.906137 master-0 kubenswrapper[29612]: I0319 12:26:03.905183 29612 scope.go:117] "RemoveContainer" containerID="ea20ed9576f8f3f348d4b4e1c7c260b8ccf567192b2f855e1566b2f4039172af"
Mar 19 12:26:04.645738 master-0 kubenswrapper[29612]: I0319 12:26:04.642930 29612 generic.go:334] "Generic (PLEG): container finished" podID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" containerID="c60c47965412e56fcde52c231971ff4ca93fd18cb7f0db3af23aa8228c53142c" exitCode=0
Mar 19 12:26:04.645738 master-0 kubenswrapper[29612]: I0319 12:26:04.643018 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" event={"ID":"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61","Type":"ContainerDied","Data":"c60c47965412e56fcde52c231971ff4ca93fd18cb7f0db3af23aa8228c53142c"}
Mar 19 12:26:04.647009 master-0 kubenswrapper[29612]: I0319 12:26:04.646975 29612 generic.go:334] "Generic (PLEG): container finished" podID="9c08b55d-d47b-4321-9835-d08d01dc3799" containerID="e543aabb787accca962b1ce13a9ae7fa055651fa7949e3b54af3165d5e325571" exitCode=0
Mar 19 12:26:04.647076 master-0 kubenswrapper[29612]: I0319 12:26:04.647018 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" event={"ID":"9c08b55d-d47b-4321-9835-d08d01dc3799","Type":"ContainerDied","Data":"e543aabb787accca962b1ce13a9ae7fa055651fa7949e3b54af3165d5e325571"}
Mar 19 12:26:10.014876 master-0 kubenswrapper[29612]: I0319 12:26:10.014818 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 19 12:26:12.101335 master-0 kubenswrapper[29612]: I0319 12:26:12.101240 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" podUID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.169:5353: i/o timeout"
Mar 19 12:26:12.496566 master-0 kubenswrapper[29612]: I0319 12:26:12.496436 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" podUID="9c08b55d-d47b-4321-9835-d08d01dc3799" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.170:5353: i/o timeout"
Mar 19 12:26:16.952282 master-0 kubenswrapper[29612]: I0319 12:26:16.952242 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz"
Mar 19 12:26:16.962501 master-0 kubenswrapper[29612]: I0319 12:26:16.962457 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c"
Mar 19 12:26:16.976179 master-0 kubenswrapper[29612]: I0319 12:26:16.976136 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b585v\" (UniqueName: \"kubernetes.io/projected/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-kube-api-access-b585v\") pod \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") "
Mar 19 12:26:16.976302 master-0 kubenswrapper[29612]: I0319 12:26:16.976251 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-dns-svc\") pod \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") "
Mar 19 12:26:16.976441 master-0 kubenswrapper[29612]: I0319 12:26:16.976393 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-config\") pod \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\" (UID: \"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61\") "
Mar 19 12:26:16.979858 master-0 kubenswrapper[29612]: I0319 12:26:16.979790 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-kube-api-access-b585v" (OuterVolumeSpecName: "kube-api-access-b585v") pod "87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" (UID: "87c38b5f-cd6c-4fd3-8697-b8a17f77fc61"). InnerVolumeSpecName "kube-api-access-b585v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:26:17.030804 master-0 kubenswrapper[29612]: I0319 12:26:17.030613 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" (UID: "87c38b5f-cd6c-4fd3-8697-b8a17f77fc61"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:17.032207 master-0 kubenswrapper[29612]: I0319 12:26:17.032141 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-config" (OuterVolumeSpecName: "config") pod "87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" (UID: "87c38b5f-cd6c-4fd3-8697-b8a17f77fc61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:17.094732 master-0 kubenswrapper[29612]: I0319 12:26:17.078427 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-dns-svc\") pod \"9c08b55d-d47b-4321-9835-d08d01dc3799\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") "
Mar 19 12:26:17.094732 master-0 kubenswrapper[29612]: I0319 12:26:17.078747 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-config\") pod \"9c08b55d-d47b-4321-9835-d08d01dc3799\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") "
Mar 19 12:26:17.094732 master-0 kubenswrapper[29612]: I0319 12:26:17.078842 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m86zb\" (UniqueName: \"kubernetes.io/projected/9c08b55d-d47b-4321-9835-d08d01dc3799-kube-api-access-m86zb\") pod \"9c08b55d-d47b-4321-9835-d08d01dc3799\" (UID: \"9c08b55d-d47b-4321-9835-d08d01dc3799\") "
Mar 19 12:26:17.094732 master-0 kubenswrapper[29612]: I0319 12:26:17.080000 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:17.094732 master-0 kubenswrapper[29612]: I0319 12:26:17.080048 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b585v\" (UniqueName: \"kubernetes.io/projected/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-kube-api-access-b585v\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:17.094732 master-0 kubenswrapper[29612]: I0319 12:26:17.080084 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:17.094732 master-0 kubenswrapper[29612]: I0319 12:26:17.083575 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c08b55d-d47b-4321-9835-d08d01dc3799-kube-api-access-m86zb" (OuterVolumeSpecName: "kube-api-access-m86zb") pod "9c08b55d-d47b-4321-9835-d08d01dc3799" (UID: "9c08b55d-d47b-4321-9835-d08d01dc3799"). InnerVolumeSpecName "kube-api-access-m86zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:26:17.102065 master-0 kubenswrapper[29612]: I0319 12:26:17.101876 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" podUID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.169:5353: i/o timeout"
Mar 19 12:26:17.137377 master-0 kubenswrapper[29612]: I0319 12:26:17.137293 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-config" (OuterVolumeSpecName: "config") pod "9c08b55d-d47b-4321-9835-d08d01dc3799" (UID: "9c08b55d-d47b-4321-9835-d08d01dc3799"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:17.140210 master-0 kubenswrapper[29612]: I0319 12:26:17.140144 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c08b55d-d47b-4321-9835-d08d01dc3799" (UID: "9c08b55d-d47b-4321-9835-d08d01dc3799"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:17.182240 master-0 kubenswrapper[29612]: I0319 12:26:17.182179 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:17.182240 master-0 kubenswrapper[29612]: I0319 12:26:17.182231 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m86zb\" (UniqueName: \"kubernetes.io/projected/9c08b55d-d47b-4321-9835-d08d01dc3799-kube-api-access-m86zb\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:17.182240 master-0 kubenswrapper[29612]: I0319 12:26:17.182249 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c08b55d-d47b-4321-9835-d08d01dc3799-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:17.497004 master-0 kubenswrapper[29612]: I0319 12:26:17.496862 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" podUID="9c08b55d-d47b-4321-9835-d08d01dc3799" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.170:5353: i/o timeout"
Mar 19 12:26:17.792321 master-0 kubenswrapper[29612]: I0319 12:26:17.792166 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" event={"ID":"9c08b55d-d47b-4321-9835-d08d01dc3799","Type":"ContainerDied","Data":"33332d32c56820ccb83e8bd56dc07cd862d00a15ffef8601e010f7ff3220d8d0"}
Mar 19
12:26:17.792321 master-0 kubenswrapper[29612]: I0319 12:26:17.792249 29612 scope.go:117] "RemoveContainer" containerID="e543aabb787accca962b1ce13a9ae7fa055651fa7949e3b54af3165d5e325571" Mar 19 12:26:17.792592 master-0 kubenswrapper[29612]: I0319 12:26:17.792369 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c" Mar 19 12:26:17.799303 master-0 kubenswrapper[29612]: I0319 12:26:17.799270 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" event={"ID":"87c38b5f-cd6c-4fd3-8697-b8a17f77fc61","Type":"ContainerDied","Data":"cd585e403c6fcdbe877c1941fe7aa96e1e1ea40fe807a47ecc022d1737748acf"} Mar 19 12:26:17.799413 master-0 kubenswrapper[29612]: I0319 12:26:17.799350 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586dbdbb8c-q5dpz" Mar 19 12:26:18.535228 master-0 kubenswrapper[29612]: I0319 12:26:18.535167 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db7b98cb5-vf6p6"] Mar 19 12:26:23.309137 master-0 kubenswrapper[29612]: I0319 12:26:23.309039 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c"] Mar 19 12:26:23.974005 master-0 kubenswrapper[29612]: I0319 12:26:23.973927 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-6sv2c"] Mar 19 12:26:24.922438 master-0 kubenswrapper[29612]: I0319 12:26:24.922370 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c08b55d-d47b-4321-9835-d08d01dc3799" path="/var/lib/kubelet/pods/9c08b55d-d47b-4321-9835-d08d01dc3799/volumes" Mar 19 12:26:25.555978 master-0 kubenswrapper[29612]: I0319 12:26:25.555935 29612 scope.go:117] "RemoveContainer" containerID="5f4fd95033bf35acda1314c4bb6b2b1a62bc32d5218b6b0f471397b28a36b49a" Mar 19 12:26:25.907041 master-0 kubenswrapper[29612]: I0319 12:26:25.906963 29612 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" event={"ID":"c99754a5-0c68-414f-acd8-42b3d1257db6","Type":"ContainerStarted","Data":"ef4d93828d5c267e0f1c59d83efcbf2c41d6caf1074c13149d23edfcf4daaae0"} Mar 19 12:26:26.693115 master-0 kubenswrapper[29612]: I0319 12:26:26.693000 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-q5dpz"] Mar 19 12:26:27.375776 master-0 kubenswrapper[29612]: I0319 12:26:27.375692 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586dbdbb8c-q5dpz"] Mar 19 12:26:28.350017 master-0 kubenswrapper[29612]: I0319 12:26:28.349906 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-476pd"] Mar 19 12:26:28.923038 master-0 kubenswrapper[29612]: I0319 12:26:28.922926 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" path="/var/lib/kubelet/pods/87c38b5f-cd6c-4fd3-8697-b8a17f77fc61/volumes" Mar 19 12:26:32.948319 master-0 kubenswrapper[29612]: W0319 12:26:32.948265 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7d20a75_a002_4f3a_845c_4178e1e305e1.slice/crio-f466315bb1864ec516107e6d74835744215aa998763b2a18383cbdb0ddd5db97 WatchSource:0}: Error finding container f466315bb1864ec516107e6d74835744215aa998763b2a18383cbdb0ddd5db97: Status 404 returned error can't find the container with id f466315bb1864ec516107e6d74835744215aa998763b2a18383cbdb0ddd5db97 Mar 19 12:26:32.975516 master-0 kubenswrapper[29612]: I0319 12:26:32.975451 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-476pd" event={"ID":"e7d20a75-a002-4f3a-845c-4178e1e305e1","Type":"ContainerStarted","Data":"f466315bb1864ec516107e6d74835744215aa998763b2a18383cbdb0ddd5db97"} Mar 19 12:26:38.958255 master-0 kubenswrapper[29612]: I0319 
12:26:38.957973 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bc987d9f-ch8tp"] Mar 19 12:26:40.004595 master-0 kubenswrapper[29612]: I0319 12:26:40.004539 29612 scope.go:117] "RemoveContainer" containerID="c60c47965412e56fcde52c231971ff4ca93fd18cb7f0db3af23aa8228c53142c" Mar 19 12:26:40.266655 master-0 kubenswrapper[29612]: I0319 12:26:40.266422 29612 scope.go:117] "RemoveContainer" containerID="b94db5c02ecdad5a11e4a437095b82a4126508ce5344495e083709732424906f" Mar 19 12:26:40.276722 master-0 kubenswrapper[29612]: I0319 12:26:40.276626 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bc987d9f-ch8tp"] Mar 19 12:26:40.354173 master-0 kubenswrapper[29612]: I0319 12:26:40.353902 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b8649b7f9-5jc6x"] Mar 19 12:26:40.354540 master-0 kubenswrapper[29612]: E0319 12:26:40.354513 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c08b55d-d47b-4321-9835-d08d01dc3799" containerName="dnsmasq-dns" Mar 19 12:26:40.354611 master-0 kubenswrapper[29612]: I0319 12:26:40.354550 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c08b55d-d47b-4321-9835-d08d01dc3799" containerName="dnsmasq-dns" Mar 19 12:26:40.354611 master-0 kubenswrapper[29612]: E0319 12:26:40.354567 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" containerName="init" Mar 19 12:26:40.354611 master-0 kubenswrapper[29612]: I0319 12:26:40.354573 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" containerName="init" Mar 19 12:26:40.354611 master-0 kubenswrapper[29612]: E0319 12:26:40.354588 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" containerName="dnsmasq-dns" Mar 19 12:26:40.354611 master-0 kubenswrapper[29612]: I0319 12:26:40.354594 29612 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" containerName="dnsmasq-dns" Mar 19 12:26:40.354858 master-0 kubenswrapper[29612]: E0319 12:26:40.354649 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c08b55d-d47b-4321-9835-d08d01dc3799" containerName="init" Mar 19 12:26:40.354858 master-0 kubenswrapper[29612]: I0319 12:26:40.354656 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c08b55d-d47b-4321-9835-d08d01dc3799" containerName="init" Mar 19 12:26:40.355031 master-0 kubenswrapper[29612]: I0319 12:26:40.355007 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c38b5f-cd6c-4fd3-8697-b8a17f77fc61" containerName="dnsmasq-dns" Mar 19 12:26:40.355104 master-0 kubenswrapper[29612]: I0319 12:26:40.355075 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c08b55d-d47b-4321-9835-d08d01dc3799" containerName="dnsmasq-dns" Mar 19 12:26:40.356612 master-0 kubenswrapper[29612]: I0319 12:26:40.356586 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.432058 master-0 kubenswrapper[29612]: I0319 12:26:40.424542 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b8649b7f9-5jc6x"] Mar 19 12:26:40.473366 master-0 kubenswrapper[29612]: I0319 12:26:40.470998 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.473366 master-0 kubenswrapper[29612]: I0319 12:26:40.471108 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-config\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.473366 master-0 kubenswrapper[29612]: I0319 12:26:40.471150 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwnph\" (UniqueName: \"kubernetes.io/projected/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-kube-api-access-hwnph\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.473366 master-0 kubenswrapper[29612]: I0319 12:26:40.471328 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-dns-svc\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.473366 master-0 kubenswrapper[29612]: I0319 12:26:40.471580 
29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.575452 master-0 kubenswrapper[29612]: I0319 12:26:40.575395 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.579499 master-0 kubenswrapper[29612]: I0319 12:26:40.576221 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.579499 master-0 kubenswrapper[29612]: I0319 12:26:40.576446 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-config\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.579499 master-0 kubenswrapper[29612]: I0319 12:26:40.576542 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwnph\" (UniqueName: \"kubernetes.io/projected/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-kube-api-access-hwnph\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.579499 master-0 
kubenswrapper[29612]: I0319 12:26:40.576583 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.579499 master-0 kubenswrapper[29612]: I0319 12:26:40.576663 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-dns-svc\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.579499 master-0 kubenswrapper[29612]: I0319 12:26:40.577334 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-config\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.579499 master-0 kubenswrapper[29612]: I0319 12:26:40.577531 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-dns-svc\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.579499 master-0 kubenswrapper[29612]: I0319 12:26:40.578094 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.623898 master-0 kubenswrapper[29612]: I0319 12:26:40.623702 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwnph\" (UniqueName: \"kubernetes.io/projected/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-kube-api-access-hwnph\") pod \"dnsmasq-dns-5b8649b7f9-5jc6x\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:40.872073 master-0 kubenswrapper[29612]: I0319 12:26:40.871424 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:41.166770 master-0 kubenswrapper[29612]: I0319 12:26:41.165416 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c62b6b3b-400d-4268-bc4f-cfaf95665cea","Type":"ContainerStarted","Data":"fc583ad987a3b4e90f11080039646c44f074ce9fca7feb31ecef6ea09f8154ca"} Mar 19 12:26:41.212227 master-0 kubenswrapper[29612]: I0319 12:26:41.210601 29612 generic.go:334] "Generic (PLEG): container finished" podID="1f94d898-d378-4f9c-b25b-24709df3f010" containerID="3dc4f50d2434fee1f750611d893939f822e12e31a1f9e28c8a2a135a0b0bcd40" exitCode=0 Mar 19 12:26:41.212227 master-0 kubenswrapper[29612]: I0319 12:26:41.210762 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp" event={"ID":"1f94d898-d378-4f9c-b25b-24709df3f010","Type":"ContainerDied","Data":"3dc4f50d2434fee1f750611d893939f822e12e31a1f9e28c8a2a135a0b0bcd40"} Mar 19 12:26:41.212227 master-0 kubenswrapper[29612]: I0319 12:26:41.210808 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp" event={"ID":"1f94d898-d378-4f9c-b25b-24709df3f010","Type":"ContainerStarted","Data":"100285a3c1582ab3157544386a8eb72fd29881136ee18d6e29e06ae51eeee774"} Mar 19 12:26:41.246895 master-0 kubenswrapper[29612]: I0319 12:26:41.244787 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"79985401-37f5-4b30-b80a-4440cf5e6701","Type":"ContainerStarted","Data":"18a325f47cca67876ffac73e53c82bcb66cf0b4e17c9c6debc562f6af9fb8142"} Mar 19 12:26:41.254334 master-0 kubenswrapper[29612]: I0319 12:26:41.248982 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c","Type":"ContainerStarted","Data":"d2d6b14e534f0aa6623d9c9d703a89b9a5909298f63b20e270eeea516b59fd17"} Mar 19 12:26:41.267757 master-0 kubenswrapper[29612]: I0319 12:26:41.267266 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8gh4l" event={"ID":"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8","Type":"ContainerStarted","Data":"15a1b0faf446029d2b9c85f56933495088d2368356c610315af9dbd3f1161f7d"} Mar 19 12:26:41.285866 master-0 kubenswrapper[29612]: I0319 12:26:41.279978 29612 generic.go:334] "Generic (PLEG): container finished" podID="c99754a5-0c68-414f-acd8-42b3d1257db6" containerID="96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd" exitCode=0 Mar 19 12:26:41.285866 master-0 kubenswrapper[29612]: I0319 12:26:41.280036 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" event={"ID":"c99754a5-0c68-414f-acd8-42b3d1257db6","Type":"ContainerDied","Data":"96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd"} Mar 19 12:26:41.614616 master-0 kubenswrapper[29612]: I0319 12:26:41.614548 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b8649b7f9-5jc6x"] Mar 19 12:26:41.657446 master-0 kubenswrapper[29612]: W0319 12:26:41.656979 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2669f8bc_1bcb_4b9f_9c39_056e07e1a5a8.slice/crio-20adc537e0f04b9ff7df8d82b10e04c677fcc34b034cd6a94429b139f76ce202 WatchSource:0}: Error finding container 
20adc537e0f04b9ff7df8d82b10e04c677fcc34b034cd6a94429b139f76ce202: Status 404 returned error can't find the container with id 20adc537e0f04b9ff7df8d82b10e04c677fcc34b034cd6a94429b139f76ce202 Mar 19 12:26:42.194097 master-0 kubenswrapper[29612]: I0319 12:26:42.193865 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp" Mar 19 12:26:42.254773 master-0 kubenswrapper[29612]: I0319 12:26:42.254698 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-sb\") pod \"1f94d898-d378-4f9c-b25b-24709df3f010\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " Mar 19 12:26:42.255007 master-0 kubenswrapper[29612]: I0319 12:26:42.254840 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-nb\") pod \"1f94d898-d378-4f9c-b25b-24709df3f010\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " Mar 19 12:26:42.255007 master-0 kubenswrapper[29612]: I0319 12:26:42.254896 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-config\") pod \"1f94d898-d378-4f9c-b25b-24709df3f010\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " Mar 19 12:26:42.255007 master-0 kubenswrapper[29612]: I0319 12:26:42.254936 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-dns-svc\") pod \"1f94d898-d378-4f9c-b25b-24709df3f010\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " Mar 19 12:26:42.255111 master-0 kubenswrapper[29612]: I0319 12:26:42.255026 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bqgjr\" (UniqueName: \"kubernetes.io/projected/1f94d898-d378-4f9c-b25b-24709df3f010-kube-api-access-bqgjr\") pod \"1f94d898-d378-4f9c-b25b-24709df3f010\" (UID: \"1f94d898-d378-4f9c-b25b-24709df3f010\") " Mar 19 12:26:42.288037 master-0 kubenswrapper[29612]: I0319 12:26:42.286955 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f94d898-d378-4f9c-b25b-24709df3f010-kube-api-access-bqgjr" (OuterVolumeSpecName: "kube-api-access-bqgjr") pod "1f94d898-d378-4f9c-b25b-24709df3f010" (UID: "1f94d898-d378-4f9c-b25b-24709df3f010"). InnerVolumeSpecName "kube-api-access-bqgjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:26:42.311435 master-0 kubenswrapper[29612]: I0319 12:26:42.311360 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a8bc9a3b-3a15-41cc-b646-316fbdf146a2","Type":"ContainerStarted","Data":"163dd1b48a184ea88d902e21462d016f602c9cdebadb1d91345ddc3cefb33dbc"} Mar 19 12:26:42.316748 master-0 kubenswrapper[29612]: I0319 12:26:42.316605 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"58f5d134-96c4-49cc-890c-06251f80d2dd","Type":"ContainerStarted","Data":"7388821594d5d75b072d791cd49c86d3780d75e9772ce090b235857ae264f97f"} Mar 19 12:26:42.328618 master-0 kubenswrapper[29612]: I0319 12:26:42.328228 29612 generic.go:334] "Generic (PLEG): container finished" podID="9bdbf1ef-8fcf-420d-bf0f-62a694b14df8" containerID="15a1b0faf446029d2b9c85f56933495088d2368356c610315af9dbd3f1161f7d" exitCode=0 Mar 19 12:26:42.328618 master-0 kubenswrapper[29612]: I0319 12:26:42.328323 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8gh4l" event={"ID":"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8","Type":"ContainerDied","Data":"15a1b0faf446029d2b9c85f56933495088d2368356c610315af9dbd3f1161f7d"} Mar 19 12:26:42.334210 master-0 kubenswrapper[29612]: I0319 
12:26:42.334143 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" event={"ID":"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8","Type":"ContainerStarted","Data":"20adc537e0f04b9ff7df8d82b10e04c677fcc34b034cd6a94429b139f76ce202"} Mar 19 12:26:42.343947 master-0 kubenswrapper[29612]: I0319 12:26:42.339860 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f","Type":"ContainerStarted","Data":"45e8823488586fa64a3e0ac2fe5a46d442bd682661bf3d10b0616d36e53f5f4c"} Mar 19 12:26:42.360186 master-0 kubenswrapper[29612]: E0319 12:26:42.354841 29612 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 19 12:26:42.360186 master-0 kubenswrapper[29612]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/c99754a5-0c68-414f-acd8-42b3d1257db6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 12:26:42.360186 master-0 kubenswrapper[29612]: > podSandboxID="ef4d93828d5c267e0f1c59d83efcbf2c41d6caf1074c13149d23edfcf4daaae0" Mar 19 12:26:42.360186 master-0 kubenswrapper[29612]: E0319 12:26:42.355004 29612 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 12:26:42.360186 master-0 kubenswrapper[29612]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n689hd9h86h5f6h78h5dh544h5c7h58h8ch94h545hd6h58fh656h55hd8h5bch659h579hcdh55bh5f8hd8h9bh547h565hfdh56h59ch8ch698q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-95qgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000800000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5db7b98cb5-vf6p6_openstack(c99754a5-0c68-414f-acd8-42b3d1257db6): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/c99754a5-0c68-414f-acd8-42b3d1257db6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 12:26:42.360186 master-0 kubenswrapper[29612]: > logger="UnhandledError" Mar 19 12:26:42.360186 master-0 kubenswrapper[29612]: E0319 12:26:42.359844 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/c99754a5-0c68-414f-acd8-42b3d1257db6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" podUID="c99754a5-0c68-414f-acd8-42b3d1257db6" Mar 19 12:26:42.363013 master-0 kubenswrapper[29612]: I0319 12:26:42.360840 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqgjr\" (UniqueName: \"kubernetes.io/projected/1f94d898-d378-4f9c-b25b-24709df3f010-kube-api-access-bqgjr\") on node \"master-0\" DevicePath \"\"" Mar 19 
12:26:42.371750 master-0 kubenswrapper[29612]: I0319 12:26:42.371597 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 19 12:26:42.372128 master-0 kubenswrapper[29612]: E0319 12:26:42.372103 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f94d898-d378-4f9c-b25b-24709df3f010" containerName="init"
Mar 19 12:26:42.372128 master-0 kubenswrapper[29612]: I0319 12:26:42.372124 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f94d898-d378-4f9c-b25b-24709df3f010" containerName="init"
Mar 19 12:26:42.372452 master-0 kubenswrapper[29612]: I0319 12:26:42.372411 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f94d898-d378-4f9c-b25b-24709df3f010" containerName="init"
Mar 19 12:26:42.379001 master-0 kubenswrapper[29612]: I0319 12:26:42.378151 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp"
Mar 19 12:26:42.416485 master-0 kubenswrapper[29612]: I0319 12:26:42.416432 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bc987d9f-ch8tp" event={"ID":"1f94d898-d378-4f9c-b25b-24709df3f010","Type":"ContainerDied","Data":"100285a3c1582ab3157544386a8eb72fd29881136ee18d6e29e06ae51eeee774"}
Mar 19 12:26:42.416748 master-0 kubenswrapper[29612]: I0319 12:26:42.416550 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 19 12:26:42.419361 master-0 kubenswrapper[29612]: I0319 12:26:42.419248 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 19 12:26:42.419361 master-0 kubenswrapper[29612]: I0319 12:26:42.419264 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 19 12:26:42.419645 master-0 kubenswrapper[29612]: I0319 12:26:42.419565 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 19 12:26:42.427881 master-0 kubenswrapper[29612]: I0319 12:26:42.427805 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zb6lv" event={"ID":"a0c736a7-20fb-4ca6-9870-d5f978444dcc","Type":"ContainerStarted","Data":"891ed22134867a8b2e184d45bbe0b379036f039cd34a56c2a98997bbcda0b549"}
Mar 19 12:26:42.427881 master-0 kubenswrapper[29612]: I0319 12:26:42.427888 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zb6lv"
Mar 19 12:26:42.428395 master-0 kubenswrapper[29612]: I0319 12:26:42.427912 29612 scope.go:117] "RemoveContainer" containerID="3dc4f50d2434fee1f750611d893939f822e12e31a1f9e28c8a2a135a0b0bcd40"
Mar 19 12:26:42.471852 master-0 kubenswrapper[29612]: I0319 12:26:42.471781 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 19 12:26:42.516222 master-0 kubenswrapper[29612]: I0319 12:26:42.515860 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zb6lv" podStartSLOduration=14.381062938 podStartE2EDuration="57.515843164s" podCreationTimestamp="2026-03-19 12:25:45 +0000 UTC" firstStartedPulling="2026-03-19 12:25:57.136046842 +0000 UTC m=+964.450215863" lastFinishedPulling="2026-03-19 12:26:40.270827068 +0000 UTC m=+1007.584996089" observedRunningTime="2026-03-19 12:26:42.474239561 +0000 UTC m=+1009.788408602" watchObservedRunningTime="2026-03-19 12:26:42.515843164 +0000 UTC m=+1009.830012185"
Mar 19 12:26:42.541514 master-0 kubenswrapper[29612]: I0319 12:26:42.541429 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-config" (OuterVolumeSpecName: "config") pod "1f94d898-d378-4f9c-b25b-24709df3f010" (UID: "1f94d898-d378-4f9c-b25b-24709df3f010"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:42.564464 master-0 kubenswrapper[29612]: I0319 12:26:42.564396 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-lock\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.564897 master-0 kubenswrapper[29612]: I0319 12:26:42.564553 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79hwz\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-kube-api-access-79hwz\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.564897 master-0 kubenswrapper[29612]: I0319 12:26:42.564600 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.564897 master-0 kubenswrapper[29612]: I0319 12:26:42.564618 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-cache\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.564897 master-0 kubenswrapper[29612]: I0319 12:26:42.564733 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.564897 master-0 kubenswrapper[29612]: I0319 12:26:42.564803 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b19d4224-46e4-4da1-bea1-2c405b22582b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c7351c16-b415-481a-b50f-5adb92c7d30e\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.564897 master-0 kubenswrapper[29612]: I0319 12:26:42.564872 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:42.573047 master-0 kubenswrapper[29612]: I0319 12:26:42.572973 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f94d898-d378-4f9c-b25b-24709df3f010" (UID: "1f94d898-d378-4f9c-b25b-24709df3f010"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:42.587957 master-0 kubenswrapper[29612]: I0319 12:26:42.587861 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f94d898-d378-4f9c-b25b-24709df3f010" (UID: "1f94d898-d378-4f9c-b25b-24709df3f010"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:42.592238 master-0 kubenswrapper[29612]: I0319 12:26:42.592202 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f94d898-d378-4f9c-b25b-24709df3f010" (UID: "1f94d898-d378-4f9c-b25b-24709df3f010"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:26:42.669205 master-0 kubenswrapper[29612]: I0319 12:26:42.669140 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.669484 master-0 kubenswrapper[29612]: I0319 12:26:42.669281 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b19d4224-46e4-4da1-bea1-2c405b22582b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c7351c16-b415-481a-b50f-5adb92c7d30e\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.669484 master-0 kubenswrapper[29612]: I0319 12:26:42.669351 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-lock\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.669484 master-0 kubenswrapper[29612]: I0319 12:26:42.669414 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79hwz\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-kube-api-access-79hwz\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.669484 master-0 kubenswrapper[29612]: I0319 12:26:42.669476 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.669649 master-0 kubenswrapper[29612]: I0319 12:26:42.669500 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-cache\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.669698 master-0 kubenswrapper[29612]: I0319 12:26:42.669648 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:42.669698 master-0 kubenswrapper[29612]: I0319 12:26:42.669669 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:42.669698 master-0 kubenswrapper[29612]: I0319 12:26:42.669686 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f94d898-d378-4f9c-b25b-24709df3f010-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 19 12:26:42.670333 master-0 kubenswrapper[29612]: I0319 12:26:42.670303 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-cache\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.670609 master-0 kubenswrapper[29612]: E0319 12:26:42.670586 29612 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 19 12:26:42.670609 master-0 kubenswrapper[29612]: E0319 12:26:42.670609 29612 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 19 12:26:42.670707 master-0 kubenswrapper[29612]: E0319 12:26:42.670695 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift podName:6e8fcd00-2db9-4463-85a5-e9a9ae7ea931 nodeName:}" failed. No retries permitted until 2026-03-19 12:26:43.170669263 +0000 UTC m=+1010.484838284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift") pod "swift-storage-0" (UID: "6e8fcd00-2db9-4463-85a5-e9a9ae7ea931") : configmap "swift-ring-files" not found
Mar 19 12:26:42.672056 master-0 kubenswrapper[29612]: I0319 12:26:42.672026 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-lock\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.679746 master-0 kubenswrapper[29612]: I0319 12:26:42.679691 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.683485 master-0 kubenswrapper[29612]: I0319 12:26:42.683428 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 12:26:42.683785 master-0 kubenswrapper[29612]: I0319 12:26:42.683494 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b19d4224-46e4-4da1-bea1-2c405b22582b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c7351c16-b415-481a-b50f-5adb92c7d30e\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/78a4b67ab2e87b21239bfb33163099988eea564e2771a672c263b23178aef402/globalmount\"" pod="openstack/swift-storage-0"
Mar 19 12:26:42.699196 master-0 kubenswrapper[29612]: I0319 12:26:42.699150 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79hwz\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-kube-api-access-79hwz\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:42.943563 master-0 kubenswrapper[29612]: I0319 12:26:42.941339 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bc987d9f-ch8tp"]
Mar 19 12:26:42.943563 master-0 kubenswrapper[29612]: I0319 12:26:42.941388 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57bc987d9f-ch8tp"]
Mar 19 12:26:43.186062 master-0 kubenswrapper[29612]: I0319 12:26:43.185839 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:43.186336 master-0 kubenswrapper[29612]: E0319 12:26:43.186299 29612 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 19 12:26:43.186336 master-0 kubenswrapper[29612]: E0319 12:26:43.186324 29612 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 19 12:26:43.186406 master-0 kubenswrapper[29612]: E0319 12:26:43.186377 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift podName:6e8fcd00-2db9-4463-85a5-e9a9ae7ea931 nodeName:}" failed. No retries permitted until 2026-03-19 12:26:44.186360538 +0000 UTC m=+1011.500529569 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift") pod "swift-storage-0" (UID: "6e8fcd00-2db9-4463-85a5-e9a9ae7ea931") : configmap "swift-ring-files" not found
Mar 19 12:26:43.400565 master-0 kubenswrapper[29612]: I0319 12:26:43.400462 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8gh4l" event={"ID":"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8","Type":"ContainerStarted","Data":"0dcf204364472186973c2b4945d1a692af334b3d87361862d29b05330d1abaa9"}
Mar 19 12:26:43.402332 master-0 kubenswrapper[29612]: I0319 12:26:43.402249 29612 generic.go:334] "Generic (PLEG): container finished" podID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" containerID="4caa14a221cfe68e308c97fc512647a358b5baa77698d372fe983c3e8aaf9cfb" exitCode=0
Mar 19 12:26:43.402332 master-0 kubenswrapper[29612]: I0319 12:26:43.402302 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" event={"ID":"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8","Type":"ContainerDied","Data":"4caa14a221cfe68e308c97fc512647a358b5baa77698d372fe983c3e8aaf9cfb"}
Mar 19 12:26:43.602784 master-0 kubenswrapper[29612]: I0319 12:26:43.601080 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-tl6x5"]
Mar 19 12:26:43.603027 master-0 kubenswrapper[29612]: I0319 12:26:43.602831 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.611769 master-0 kubenswrapper[29612]: I0319 12:26:43.608084 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 19 12:26:43.611769 master-0 kubenswrapper[29612]: I0319 12:26:43.608313 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 19 12:26:43.611769 master-0 kubenswrapper[29612]: I0319 12:26:43.608464 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 19 12:26:43.615745 master-0 kubenswrapper[29612]: I0319 12:26:43.615048 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tl6x5"]
Mar 19 12:26:43.715804 master-0 kubenswrapper[29612]: I0319 12:26:43.715629 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ddd50353-b1b0-4de2-a180-3461654949ab-etc-swift\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.715976 master-0 kubenswrapper[29612]: I0319 12:26:43.715944 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-ring-data-devices\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.716020 master-0 kubenswrapper[29612]: I0319 12:26:43.716004 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmgdg\" (UniqueName: \"kubernetes.io/projected/ddd50353-b1b0-4de2-a180-3461654949ab-kube-api-access-qmgdg\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.716105 master-0 kubenswrapper[29612]: I0319 12:26:43.716083 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-swiftconf\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.716215 master-0 kubenswrapper[29612]: I0319 12:26:43.716184 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-dispersionconf\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.716257 master-0 kubenswrapper[29612]: I0319 12:26:43.716245 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-scripts\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.716419 master-0 kubenswrapper[29612]: I0319 12:26:43.716400 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-combined-ca-bundle\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.819108 master-0 kubenswrapper[29612]: I0319 12:26:43.818666 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-combined-ca-bundle\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.819108 master-0 kubenswrapper[29612]: I0319 12:26:43.818748 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ddd50353-b1b0-4de2-a180-3461654949ab-etc-swift\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.819108 master-0 kubenswrapper[29612]: I0319 12:26:43.818813 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-ring-data-devices\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.819108 master-0 kubenswrapper[29612]: I0319 12:26:43.818929 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmgdg\" (UniqueName: \"kubernetes.io/projected/ddd50353-b1b0-4de2-a180-3461654949ab-kube-api-access-qmgdg\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.819108 master-0 kubenswrapper[29612]: I0319 12:26:43.819004 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-swiftconf\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.819108 master-0 kubenswrapper[29612]: I0319 12:26:43.819048 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-dispersionconf\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.819108 master-0 kubenswrapper[29612]: I0319 12:26:43.819078 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-scripts\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.820225 master-0 kubenswrapper[29612]: I0319 12:26:43.820181 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-scripts\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.820603 master-0 kubenswrapper[29612]: I0319 12:26:43.820546 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-ring-data-devices\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.820906 master-0 kubenswrapper[29612]: I0319 12:26:43.820876 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ddd50353-b1b0-4de2-a180-3461654949ab-etc-swift\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.823976 master-0 kubenswrapper[29612]: I0319 12:26:43.823927 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-dispersionconf\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.824137 master-0 kubenswrapper[29612]: I0319 12:26:43.824093 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-swiftconf\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.825279 master-0 kubenswrapper[29612]: I0319 12:26:43.825237 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-combined-ca-bundle\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:43.865156 master-0 kubenswrapper[29612]: I0319 12:26:43.861284 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmgdg\" (UniqueName: \"kubernetes.io/projected/ddd50353-b1b0-4de2-a180-3461654949ab-kube-api-access-qmgdg\") pod \"swift-ring-rebalance-tl6x5\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") " pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:44.017742 master-0 kubenswrapper[29612]: I0319 12:26:44.017585 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:26:44.250988 master-0 kubenswrapper[29612]: I0319 12:26:44.250917 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:44.251405 master-0 kubenswrapper[29612]: E0319 12:26:44.251376 29612 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 19 12:26:44.251405 master-0 kubenswrapper[29612]: E0319 12:26:44.251402 29612 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 19 12:26:44.251791 master-0 kubenswrapper[29612]: E0319 12:26:44.251769 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift podName:6e8fcd00-2db9-4463-85a5-e9a9ae7ea931 nodeName:}" failed. No retries permitted until 2026-03-19 12:26:46.251747602 +0000 UTC m=+1013.565916623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift") pod "swift-storage-0" (UID: "6e8fcd00-2db9-4463-85a5-e9a9ae7ea931") : configmap "swift-ring-files" not found
Mar 19 12:26:44.538576 master-0 kubenswrapper[29612]: I0319 12:26:44.527470 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" event={"ID":"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8","Type":"ContainerStarted","Data":"81709da4c380d1a7f298cd9c18ef65a4c5bfa7172b54ea6bd8526429aadb1cd5"}
Mar 19 12:26:44.538576 master-0 kubenswrapper[29612]: I0319 12:26:44.528354 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x"
Mar 19 12:26:44.556017 master-0 kubenswrapper[29612]: I0319 12:26:44.554974 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8gh4l" event={"ID":"9bdbf1ef-8fcf-420d-bf0f-62a694b14df8","Type":"ContainerStarted","Data":"8ae10136fd2ab7d088464aa5fe84bdff5fb5d2353c16ca011d413d9489029949"}
Mar 19 12:26:44.559331 master-0 kubenswrapper[29612]: I0319 12:26:44.556746 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:26:44.559331 master-0 kubenswrapper[29612]: I0319 12:26:44.556782 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:26:44.559331 master-0 kubenswrapper[29612]: I0319 12:26:44.558245 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tl6x5"]
Mar 19 12:26:44.590314 master-0 kubenswrapper[29612]: I0319 12:26:44.590219 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" event={"ID":"c99754a5-0c68-414f-acd8-42b3d1257db6","Type":"ContainerStarted","Data":"ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f"}
Mar 19 12:26:44.592654 master-0 kubenswrapper[29612]: I0319 12:26:44.591168 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6"
Mar 19 12:26:44.599433 master-0 kubenswrapper[29612]: I0319 12:26:44.596797 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" podStartSLOduration=4.596772846 podStartE2EDuration="4.596772846s" podCreationTimestamp="2026-03-19 12:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:26:44.580768212 +0000 UTC m=+1011.894937243" watchObservedRunningTime="2026-03-19 12:26:44.596772846 +0000 UTC m=+1011.910941867"
Mar 19 12:26:44.683137 master-0 kubenswrapper[29612]: I0319 12:26:44.683024 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8gh4l" podStartSLOduration=17.525200581 podStartE2EDuration="59.683004056s" podCreationTimestamp="2026-03-19 12:25:45 +0000 UTC" firstStartedPulling="2026-03-19 12:25:58.108896746 +0000 UTC m=+965.423065767" lastFinishedPulling="2026-03-19 12:26:40.266700221 +0000 UTC m=+1007.580869242" observedRunningTime="2026-03-19 12:26:44.61662639 +0000 UTC m=+1011.930795411" watchObservedRunningTime="2026-03-19 12:26:44.683004056 +0000 UTC m=+1011.997173077"
Mar 19 12:26:44.689052 master-0 kubenswrapper[29612]: I0319 12:26:44.688503 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" podStartSLOduration=43.688486702 podStartE2EDuration="43.688486702s" podCreationTimestamp="2026-03-19 12:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:26:44.642946578 +0000 UTC m=+1011.957115599" watchObservedRunningTime="2026-03-19 12:26:44.688486702 +0000 UTC m=+1012.002655723"
Mar 19 12:26:44.922028 master-0 kubenswrapper[29612]: I0319 12:26:44.921947 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f94d898-d378-4f9c-b25b-24709df3f010" path="/var/lib/kubelet/pods/1f94d898-d378-4f9c-b25b-24709df3f010/volumes"
Mar 19 12:26:45.173334 master-0 kubenswrapper[29612]: I0319 12:26:45.173210 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b19d4224-46e4-4da1-bea1-2c405b22582b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c7351c16-b415-481a-b50f-5adb92c7d30e\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:45.606950 master-0 kubenswrapper[29612]: I0319 12:26:45.606822 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tl6x5" event={"ID":"ddd50353-b1b0-4de2-a180-3461654949ab","Type":"ContainerStarted","Data":"2b1683386e86ffbdc9bdeb81e7b1c87e1166d6a7a11d447c72363b8efb091c08"}
Mar 19 12:26:46.274724 master-0 kubenswrapper[29612]: I0319 12:26:46.274588 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:46.275071 master-0 kubenswrapper[29612]: E0319 12:26:46.274984 29612 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 19 12:26:46.275141 master-0 kubenswrapper[29612]: E0319 12:26:46.275070 29612 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 19 12:26:46.275256 master-0 kubenswrapper[29612]: E0319 12:26:46.275203 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift podName:6e8fcd00-2db9-4463-85a5-e9a9ae7ea931 nodeName:}" failed. No retries permitted until 2026-03-19 12:26:50.2751652 +0000 UTC m=+1017.589334271 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift") pod "swift-storage-0" (UID: "6e8fcd00-2db9-4463-85a5-e9a9ae7ea931") : configmap "swift-ring-files" not found
Mar 19 12:26:47.649825 master-0 kubenswrapper[29612]: I0319 12:26:47.649150 29612 generic.go:334] "Generic (PLEG): container finished" podID="b8fe68fc-3f50-4352-b9b5-c2b597c7d88c" containerID="d2d6b14e534f0aa6623d9c9d703a89b9a5909298f63b20e270eeea516b59fd17" exitCode=0
Mar 19 12:26:47.651016 master-0 kubenswrapper[29612]: I0319 12:26:47.649234 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c","Type":"ContainerDied","Data":"d2d6b14e534f0aa6623d9c9d703a89b9a5909298f63b20e270eeea516b59fd17"}
Mar 19 12:26:48.663331 master-0 kubenswrapper[29612]: I0319 12:26:48.663226 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"79985401-37f5-4b30-b80a-4440cf5e6701","Type":"ContainerStarted","Data":"5138768e18fd260017663e967a060067744fecedf5e935273c06147256bdb073"}
Mar 19 12:26:48.666297 master-0 kubenswrapper[29612]: I0319 12:26:48.666232 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b8fe68fc-3f50-4352-b9b5-c2b597c7d88c","Type":"ContainerStarted","Data":"619f263bd59413c3cb96e9f26d04821f8c2156477a4b6b64510e83d1a883439e"}
Mar 19 12:26:48.668228 master-0 kubenswrapper[29612]: I0319 12:26:48.668159 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-476pd" event={"ID":"e7d20a75-a002-4f3a-845c-4178e1e305e1","Type":"ContainerStarted","Data":"0c4d583079abedf0b01145973ac81e07074d29059b4c5dc040dfeee24d3e113f"}
Mar 19 12:26:48.670334 master-0 kubenswrapper[29612]: I0319 12:26:48.670286 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a8bc9a3b-3a15-41cc-b646-316fbdf146a2","Type":"ContainerStarted","Data":"d4b2abe6eb859c456f3483b46ed1559ece1b68fe2e9ee63923ca9394a399a659"}
Mar 19 12:26:48.672684 master-0 kubenswrapper[29612]: I0319 12:26:48.672646 29612 generic.go:334] "Generic (PLEG): container finished" podID="c62b6b3b-400d-4268-bc4f-cfaf95665cea" containerID="fc583ad987a3b4e90f11080039646c44f074ce9fca7feb31ecef6ea09f8154ca" exitCode=0
Mar 19 12:26:48.672751 master-0 kubenswrapper[29612]: I0319 12:26:48.672693 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c62b6b3b-400d-4268-bc4f-cfaf95665cea","Type":"ContainerDied","Data":"fc583ad987a3b4e90f11080039646c44f074ce9fca7feb31ecef6ea09f8154ca"}
Mar 19 12:26:48.716510 master-0 kubenswrapper[29612]: I0319 12:26:48.715450 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.936318592 podStartE2EDuration="58.715415174s" podCreationTimestamp="2026-03-19 12:25:50 +0000 UTC" firstStartedPulling="2026-03-19 12:25:57.379721216 +0000 UTC m=+964.693890237" lastFinishedPulling="2026-03-19 12:26:48.158817798 +0000 UTC m=+1015.472986819" observedRunningTime="2026-03-19 12:26:48.690463864 +0000 UTC m=+1016.004632885" watchObservedRunningTime="2026-03-19 12:26:48.715415174 +0000 UTC m=+1016.029584195"
Mar 19 12:26:48.757664 master-0 kubenswrapper[29612]: I0319 12:26:48.746979 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=28.563545282 podStartE2EDuration="1m11.74696166s" podCreationTimestamp="2026-03-19 12:25:37 +0000 UTC" firstStartedPulling="2026-03-19 12:25:57.15849281 +0000 UTC m=+964.472661831" lastFinishedPulling="2026-03-19 12:26:40.341909188 +0000 UTC m=+1007.656078209" observedRunningTime="2026-03-19 12:26:48.735840984 +0000
UTC m=+1016.050010025" watchObservedRunningTime="2026-03-19 12:26:48.74696166 +0000 UTC m=+1016.061130681" Mar 19 12:26:48.809720 master-0 kubenswrapper[29612]: I0319 12:26:48.806572 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.389497997 podStartE2EDuration="1m4.806537173s" podCreationTimestamp="2026-03-19 12:25:44 +0000 UTC" firstStartedPulling="2026-03-19 12:25:58.76182195 +0000 UTC m=+966.075990971" lastFinishedPulling="2026-03-19 12:26:48.178861136 +0000 UTC m=+1015.493030147" observedRunningTime="2026-03-19 12:26:48.803162507 +0000 UTC m=+1016.117331528" watchObservedRunningTime="2026-03-19 12:26:48.806537173 +0000 UTC m=+1016.120706194" Mar 19 12:26:48.855751 master-0 kubenswrapper[29612]: I0319 12:26:48.852728 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-476pd" podStartSLOduration=39.613788287 podStartE2EDuration="48.852696575s" podCreationTimestamp="2026-03-19 12:26:00 +0000 UTC" firstStartedPulling="2026-03-19 12:26:38.924594113 +0000 UTC m=+1006.238763134" lastFinishedPulling="2026-03-19 12:26:48.163502401 +0000 UTC m=+1015.477671422" observedRunningTime="2026-03-19 12:26:48.833630053 +0000 UTC m=+1016.147799074" watchObservedRunningTime="2026-03-19 12:26:48.852696575 +0000 UTC m=+1016.166865596" Mar 19 12:26:49.212169 master-0 kubenswrapper[29612]: I0319 12:26:49.211513 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 12:26:49.212169 master-0 kubenswrapper[29612]: I0319 12:26:49.211588 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 12:26:49.276563 master-0 kubenswrapper[29612]: I0319 12:26:49.276487 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 19 12:26:49.838617 master-0 kubenswrapper[29612]: I0319 
12:26:49.837502 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 19 12:26:49.898607 master-0 kubenswrapper[29612]: I0319 12:26:49.898529 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 19 12:26:50.334729 master-0 kubenswrapper[29612]: I0319 12:26:50.334666 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0" Mar 19 12:26:50.334958 master-0 kubenswrapper[29612]: E0319 12:26:50.334860 29612 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 12:26:50.334958 master-0 kubenswrapper[29612]: E0319 12:26:50.334875 29612 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 12:26:50.334958 master-0 kubenswrapper[29612]: E0319 12:26:50.334922 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift podName:6e8fcd00-2db9-4463-85a5-e9a9ae7ea931 nodeName:}" failed. No retries permitted until 2026-03-19 12:26:58.334909004 +0000 UTC m=+1025.649078025 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift") pod "swift-storage-0" (UID: "6e8fcd00-2db9-4463-85a5-e9a9ae7ea931") : configmap "swift-ring-files" not found Mar 19 12:26:50.691054 master-0 kubenswrapper[29612]: I0319 12:26:50.690991 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 19 12:26:50.735771 master-0 kubenswrapper[29612]: I0319 12:26:50.735702 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 19 12:26:50.874220 master-0 kubenswrapper[29612]: I0319 12:26:50.873786 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:26:50.988802 master-0 kubenswrapper[29612]: I0319 12:26:50.988754 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5db7b98cb5-vf6p6"] Mar 19 12:26:50.989323 master-0 kubenswrapper[29612]: I0319 12:26:50.989290 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" podUID="c99754a5-0c68-414f-acd8-42b3d1257db6" containerName="dnsmasq-dns" containerID="cri-o://ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f" gracePeriod=10 Mar 19 12:26:50.992703 master-0 kubenswrapper[29612]: I0319 12:26:50.992662 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" Mar 19 12:26:51.278097 master-0 kubenswrapper[29612]: I0319 12:26:51.277079 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 19 12:26:51.329851 master-0 kubenswrapper[29612]: I0319 12:26:51.329650 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 19 12:26:51.511441 master-0 kubenswrapper[29612]: I0319 
12:26:51.511394 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" Mar 19 12:26:51.572189 master-0 kubenswrapper[29612]: I0319 12:26:51.566470 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-ovsdbserver-nb\") pod \"c99754a5-0c68-414f-acd8-42b3d1257db6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " Mar 19 12:26:51.572189 master-0 kubenswrapper[29612]: I0319 12:26:51.566553 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95qgf\" (UniqueName: \"kubernetes.io/projected/c99754a5-0c68-414f-acd8-42b3d1257db6-kube-api-access-95qgf\") pod \"c99754a5-0c68-414f-acd8-42b3d1257db6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " Mar 19 12:26:51.572189 master-0 kubenswrapper[29612]: I0319 12:26:51.566803 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-config\") pod \"c99754a5-0c68-414f-acd8-42b3d1257db6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " Mar 19 12:26:51.572189 master-0 kubenswrapper[29612]: I0319 12:26:51.566849 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-dns-svc\") pod \"c99754a5-0c68-414f-acd8-42b3d1257db6\" (UID: \"c99754a5-0c68-414f-acd8-42b3d1257db6\") " Mar 19 12:26:51.584912 master-0 kubenswrapper[29612]: I0319 12:26:51.583648 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99754a5-0c68-414f-acd8-42b3d1257db6-kube-api-access-95qgf" (OuterVolumeSpecName: "kube-api-access-95qgf") pod "c99754a5-0c68-414f-acd8-42b3d1257db6" (UID: "c99754a5-0c68-414f-acd8-42b3d1257db6"). 
InnerVolumeSpecName "kube-api-access-95qgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:26:51.628678 master-0 kubenswrapper[29612]: I0319 12:26:51.627315 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c99754a5-0c68-414f-acd8-42b3d1257db6" (UID: "c99754a5-0c68-414f-acd8-42b3d1257db6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:26:51.637137 master-0 kubenswrapper[29612]: I0319 12:26:51.636219 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-config" (OuterVolumeSpecName: "config") pod "c99754a5-0c68-414f-acd8-42b3d1257db6" (UID: "c99754a5-0c68-414f-acd8-42b3d1257db6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:26:51.650696 master-0 kubenswrapper[29612]: I0319 12:26:51.647111 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c99754a5-0c68-414f-acd8-42b3d1257db6" (UID: "c99754a5-0c68-414f-acd8-42b3d1257db6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:26:51.670664 master-0 kubenswrapper[29612]: I0319 12:26:51.669045 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:26:51.670664 master-0 kubenswrapper[29612]: I0319 12:26:51.669100 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 12:26:51.670664 master-0 kubenswrapper[29612]: I0319 12:26:51.669115 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c99754a5-0c68-414f-acd8-42b3d1257db6-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:26:51.670664 master-0 kubenswrapper[29612]: I0319 12:26:51.669132 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95qgf\" (UniqueName: \"kubernetes.io/projected/c99754a5-0c68-414f-acd8-42b3d1257db6-kube-api-access-95qgf\") on node \"master-0\" DevicePath \"\"" Mar 19 12:26:51.705782 master-0 kubenswrapper[29612]: I0319 12:26:51.704202 29612 generic.go:334] "Generic (PLEG): container finished" podID="c99754a5-0c68-414f-acd8-42b3d1257db6" containerID="ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f" exitCode=0 Mar 19 12:26:51.705782 master-0 kubenswrapper[29612]: I0319 12:26:51.704282 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" event={"ID":"c99754a5-0c68-414f-acd8-42b3d1257db6","Type":"ContainerDied","Data":"ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f"} Mar 19 12:26:51.705782 master-0 kubenswrapper[29612]: I0319 12:26:51.704313 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" 
event={"ID":"c99754a5-0c68-414f-acd8-42b3d1257db6","Type":"ContainerDied","Data":"ef4d93828d5c267e0f1c59d83efcbf2c41d6caf1074c13149d23edfcf4daaae0"} Mar 19 12:26:51.705782 master-0 kubenswrapper[29612]: I0319 12:26:51.704339 29612 scope.go:117] "RemoveContainer" containerID="ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f" Mar 19 12:26:51.705782 master-0 kubenswrapper[29612]: I0319 12:26:51.704285 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db7b98cb5-vf6p6" Mar 19 12:26:51.708246 master-0 kubenswrapper[29612]: I0319 12:26:51.707608 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c62b6b3b-400d-4268-bc4f-cfaf95665cea","Type":"ContainerStarted","Data":"520edd68e9fa552f58e4a7c8ff9213857cb9a1e4eb1b4b050906b55d07c1b163"} Mar 19 12:26:51.710837 master-0 kubenswrapper[29612]: I0319 12:26:51.710765 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tl6x5" event={"ID":"ddd50353-b1b0-4de2-a180-3461654949ab","Type":"ContainerStarted","Data":"a96775d9af9670e0ef66c775afcd14be76e2a607b880c679d69a21b2ac2236c9"} Mar 19 12:26:51.738978 master-0 kubenswrapper[29612]: I0319 12:26:51.738093 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.637917065 podStartE2EDuration="1m13.738074738s" podCreationTimestamp="2026-03-19 12:25:38 +0000 UTC" firstStartedPulling="2026-03-19 12:25:57.168497614 +0000 UTC m=+964.482666675" lastFinishedPulling="2026-03-19 12:26:40.268655327 +0000 UTC m=+1007.582824348" observedRunningTime="2026-03-19 12:26:51.73780656 +0000 UTC m=+1019.051975581" watchObservedRunningTime="2026-03-19 12:26:51.738074738 +0000 UTC m=+1019.052243779" Mar 19 12:26:51.748700 master-0 kubenswrapper[29612]: I0319 12:26:51.747671 29612 scope.go:117] "RemoveContainer" 
containerID="96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd" Mar 19 12:26:51.761769 master-0 kubenswrapper[29612]: I0319 12:26:51.761716 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5db7b98cb5-vf6p6"] Mar 19 12:26:51.770160 master-0 kubenswrapper[29612]: I0319 12:26:51.770080 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5db7b98cb5-vf6p6"] Mar 19 12:26:51.788859 master-0 kubenswrapper[29612]: I0319 12:26:51.788286 29612 scope.go:117] "RemoveContainer" containerID="ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f" Mar 19 12:26:51.788859 master-0 kubenswrapper[29612]: E0319 12:26:51.788660 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f\": container with ID starting with ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f not found: ID does not exist" containerID="ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f" Mar 19 12:26:51.788859 master-0 kubenswrapper[29612]: I0319 12:26:51.788696 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f"} err="failed to get container status \"ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f\": rpc error: code = NotFound desc = could not find container \"ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f\": container with ID starting with ed87aea6ef7f30735918227c9264b1aaa902d53448af48c2bfddc9655213b38f not found: ID does not exist" Mar 19 12:26:51.788859 master-0 kubenswrapper[29612]: I0319 12:26:51.788723 29612 scope.go:117] "RemoveContainer" containerID="96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd" Mar 19 12:26:51.789098 master-0 kubenswrapper[29612]: I0319 12:26:51.789013 29612 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-tl6x5" podStartSLOduration=2.256395301 podStartE2EDuration="8.788984844s" podCreationTimestamp="2026-03-19 12:26:43 +0000 UTC" firstStartedPulling="2026-03-19 12:26:44.544309526 +0000 UTC m=+1011.858478537" lastFinishedPulling="2026-03-19 12:26:51.076899059 +0000 UTC m=+1018.391068080" observedRunningTime="2026-03-19 12:26:51.782091758 +0000 UTC m=+1019.096260779" watchObservedRunningTime="2026-03-19 12:26:51.788984844 +0000 UTC m=+1019.103153865" Mar 19 12:26:51.789192 master-0 kubenswrapper[29612]: E0319 12:26:51.789040 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd\": container with ID starting with 96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd not found: ID does not exist" containerID="96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd" Mar 19 12:26:51.789192 master-0 kubenswrapper[29612]: I0319 12:26:51.789187 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd"} err="failed to get container status \"96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd\": rpc error: code = NotFound desc = could not find container \"96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd\": container with ID starting with 96bbb96ebc120addbe6a6855c4ab1c45c05665aeb6fdcb66811da204e48c1bdd not found: ID does not exist" Mar 19 12:26:51.796394 master-0 kubenswrapper[29612]: I0319 12:26:51.796351 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 19 12:26:52.061048 master-0 kubenswrapper[29612]: I0319 12:26:52.060958 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 19 
12:26:52.061672 master-0 kubenswrapper[29612]: E0319 12:26:52.061531 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99754a5-0c68-414f-acd8-42b3d1257db6" containerName="dnsmasq-dns" Mar 19 12:26:52.061672 master-0 kubenswrapper[29612]: I0319 12:26:52.061552 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99754a5-0c68-414f-acd8-42b3d1257db6" containerName="dnsmasq-dns" Mar 19 12:26:52.061672 master-0 kubenswrapper[29612]: E0319 12:26:52.061582 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99754a5-0c68-414f-acd8-42b3d1257db6" containerName="init" Mar 19 12:26:52.061672 master-0 kubenswrapper[29612]: I0319 12:26:52.061591 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99754a5-0c68-414f-acd8-42b3d1257db6" containerName="init" Mar 19 12:26:52.061993 master-0 kubenswrapper[29612]: I0319 12:26:52.061950 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99754a5-0c68-414f-acd8-42b3d1257db6" containerName="dnsmasq-dns" Mar 19 12:26:52.063386 master-0 kubenswrapper[29612]: I0319 12:26:52.063341 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 12:26:52.076667 master-0 kubenswrapper[29612]: I0319 12:26:52.072543 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 19 12:26:52.076667 master-0 kubenswrapper[29612]: I0319 12:26:52.072982 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 19 12:26:52.076667 master-0 kubenswrapper[29612]: I0319 12:26:52.073216 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 19 12:26:52.136773 master-0 kubenswrapper[29612]: I0319 12:26:52.116597 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 12:26:52.208093 master-0 kubenswrapper[29612]: I0319 12:26:52.207812 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb1965b-d806-4a7a-8650-4df62f5a71d9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.208093 master-0 kubenswrapper[29612]: I0319 12:26:52.207888 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bb1965b-d806-4a7a-8650-4df62f5a71d9-config\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.208093 master-0 kubenswrapper[29612]: I0319 12:26:52.207933 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4bb1965b-d806-4a7a-8650-4df62f5a71d9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.208093 master-0 kubenswrapper[29612]: I0319 12:26:52.207950 29612 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bb1965b-d806-4a7a-8650-4df62f5a71d9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.208093 master-0 kubenswrapper[29612]: I0319 12:26:52.207978 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bb1965b-d806-4a7a-8650-4df62f5a71d9-scripts\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.208093 master-0 kubenswrapper[29612]: I0319 12:26:52.208043 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bb1965b-d806-4a7a-8650-4df62f5a71d9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.208093 master-0 kubenswrapper[29612]: I0319 12:26:52.208088 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pcfp\" (UniqueName: \"kubernetes.io/projected/4bb1965b-d806-4a7a-8650-4df62f5a71d9-kube-api-access-8pcfp\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.309686 master-0 kubenswrapper[29612]: I0319 12:26:52.309494 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb1965b-d806-4a7a-8650-4df62f5a71d9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.309872 master-0 kubenswrapper[29612]: I0319 12:26:52.309796 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bb1965b-d806-4a7a-8650-4df62f5a71d9-config\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.310513 master-0 kubenswrapper[29612]: I0319 12:26:52.310486 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4bb1965b-d806-4a7a-8650-4df62f5a71d9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.310629 master-0 kubenswrapper[29612]: I0319 12:26:52.310543 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bb1965b-d806-4a7a-8650-4df62f5a71d9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.310629 master-0 kubenswrapper[29612]: I0319 12:26:52.310603 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bb1965b-d806-4a7a-8650-4df62f5a71d9-scripts\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.310719 master-0 kubenswrapper[29612]: I0319 12:26:52.310654 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bb1965b-d806-4a7a-8650-4df62f5a71d9-config\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.310935 master-0 kubenswrapper[29612]: I0319 12:26:52.310896 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bb1965b-d806-4a7a-8650-4df62f5a71d9-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.311061 master-0 kubenswrapper[29612]: I0319 12:26:52.310999 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pcfp\" (UniqueName: \"kubernetes.io/projected/4bb1965b-d806-4a7a-8650-4df62f5a71d9-kube-api-access-8pcfp\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.312710 master-0 kubenswrapper[29612]: I0319 12:26:52.311181 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4bb1965b-d806-4a7a-8650-4df62f5a71d9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.312710 master-0 kubenswrapper[29612]: I0319 12:26:52.311496 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bb1965b-d806-4a7a-8650-4df62f5a71d9-scripts\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.318651 master-0 kubenswrapper[29612]: I0319 12:26:52.316214 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bb1965b-d806-4a7a-8650-4df62f5a71d9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.318651 master-0 kubenswrapper[29612]: I0319 12:26:52.316428 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb1965b-d806-4a7a-8650-4df62f5a71d9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.318651 master-0 kubenswrapper[29612]: I0319 
12:26:52.317371 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bb1965b-d806-4a7a-8650-4df62f5a71d9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.329407 master-0 kubenswrapper[29612]: I0319 12:26:52.329365 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pcfp\" (UniqueName: \"kubernetes.io/projected/4bb1965b-d806-4a7a-8650-4df62f5a71d9-kube-api-access-8pcfp\") pod \"ovn-northd-0\" (UID: \"4bb1965b-d806-4a7a-8650-4df62f5a71d9\") " pod="openstack/ovn-northd-0" Mar 19 12:26:52.432282 master-0 kubenswrapper[29612]: I0319 12:26:52.431932 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 12:26:52.903451 master-0 kubenswrapper[29612]: I0319 12:26:52.903347 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 12:26:52.909198 master-0 kubenswrapper[29612]: W0319 12:26:52.908800 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bb1965b_d806_4a7a_8650_4df62f5a71d9.slice/crio-1cc1c04bfbe1eb64dee891db11e81256dbc9a9df84c4b37d87467b3c23fd4dd5 WatchSource:0}: Error finding container 1cc1c04bfbe1eb64dee891db11e81256dbc9a9df84c4b37d87467b3c23fd4dd5: Status 404 returned error can't find the container with id 1cc1c04bfbe1eb64dee891db11e81256dbc9a9df84c4b37d87467b3c23fd4dd5 Mar 19 12:26:52.925162 master-0 kubenswrapper[29612]: I0319 12:26:52.925088 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99754a5-0c68-414f-acd8-42b3d1257db6" path="/var/lib/kubelet/pods/c99754a5-0c68-414f-acd8-42b3d1257db6/volumes" Mar 19 12:26:53.744166 master-0 kubenswrapper[29612]: I0319 12:26:53.742842 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"4bb1965b-d806-4a7a-8650-4df62f5a71d9","Type":"ContainerStarted","Data":"1cc1c04bfbe1eb64dee891db11e81256dbc9a9df84c4b37d87467b3c23fd4dd5"}
Mar 19 12:26:54.753426 master-0 kubenswrapper[29612]: I0319 12:26:54.753375 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4bb1965b-d806-4a7a-8650-4df62f5a71d9","Type":"ContainerStarted","Data":"5aa17a7593f1098a231f41a117c7e9a237938df65bff732646b1ae5261dd575e"}
Mar 19 12:26:55.340180 master-0 kubenswrapper[29612]: I0319 12:26:55.340105 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 19 12:26:55.435580 master-0 kubenswrapper[29612]: I0319 12:26:55.435522 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 19 12:26:55.771651 master-0 kubenswrapper[29612]: I0319 12:26:55.771581 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4bb1965b-d806-4a7a-8650-4df62f5a71d9","Type":"ContainerStarted","Data":"67fff60c1672f70127009c2a15c5a680acb2d7d24d637dfdeb6835189b1b2a0e"}
Mar 19 12:26:55.796223 master-0 kubenswrapper[29612]: I0319 12:26:55.796117 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.304164387 podStartE2EDuration="4.796091192s" podCreationTimestamp="2026-03-19 12:26:51 +0000 UTC" firstStartedPulling="2026-03-19 12:26:52.913026536 +0000 UTC m=+1020.227195567" lastFinishedPulling="2026-03-19 12:26:54.404953351 +0000 UTC m=+1021.719122372" observedRunningTime="2026-03-19 12:26:55.794871577 +0000 UTC m=+1023.109040608" watchObservedRunningTime="2026-03-19 12:26:55.796091192 +0000 UTC m=+1023.110260243"
Mar 19 12:26:56.783508 master-0 kubenswrapper[29612]: I0319 12:26:56.783423 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 19 12:26:57.544572 master-0 kubenswrapper[29612]: I0319 12:26:57.544491 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-ca05-account-create-update-cz8p9"]
Mar 19 12:26:57.546404 master-0 kubenswrapper[29612]: I0319 12:26:57.546358 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ca05-account-create-update-cz8p9"
Mar 19 12:26:57.553188 master-0 kubenswrapper[29612]: I0319 12:26:57.553116 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 19 12:26:57.571935 master-0 kubenswrapper[29612]: I0319 12:26:57.564699 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ca05-account-create-update-cz8p9"]
Mar 19 12:26:57.630061 master-0 kubenswrapper[29612]: I0319 12:26:57.629966 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-nmkvn"]
Mar 19 12:26:57.634042 master-0 kubenswrapper[29612]: I0319 12:26:57.633966 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nmkvn"
Mar 19 12:26:57.636942 master-0 kubenswrapper[29612]: I0319 12:26:57.636908 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 19 12:26:57.638554 master-0 kubenswrapper[29612]: I0319 12:26:57.638509 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hf2k\" (UniqueName: \"kubernetes.io/projected/39d30f3c-9ff6-486e-b733-db361f4896c1-kube-api-access-6hf2k\") pod \"glance-ca05-account-create-update-cz8p9\" (UID: \"39d30f3c-9ff6-486e-b733-db361f4896c1\") " pod="openstack/glance-ca05-account-create-update-cz8p9"
Mar 19 12:26:57.638661 master-0 kubenswrapper[29612]: I0319 12:26:57.638584 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39d30f3c-9ff6-486e-b733-db361f4896c1-operator-scripts\") pod \"glance-ca05-account-create-update-cz8p9\" (UID: \"39d30f3c-9ff6-486e-b733-db361f4896c1\") " pod="openstack/glance-ca05-account-create-update-cz8p9"
Mar 19 12:26:57.655757 master-0 kubenswrapper[29612]: I0319 12:26:57.655476 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nmkvn"]
Mar 19 12:26:57.741339 master-0 kubenswrapper[29612]: I0319 12:26:57.741252 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hf2k\" (UniqueName: \"kubernetes.io/projected/39d30f3c-9ff6-486e-b733-db361f4896c1-kube-api-access-6hf2k\") pod \"glance-ca05-account-create-update-cz8p9\" (UID: \"39d30f3c-9ff6-486e-b733-db361f4896c1\") " pod="openstack/glance-ca05-account-create-update-cz8p9"
Mar 19 12:26:57.741339 master-0 kubenswrapper[29612]: I0319 12:26:57.741335 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/900c7116-e99b-4ca4-b0f3-15956fee5f34-operator-scripts\") pod \"root-account-create-update-nmkvn\" (UID: \"900c7116-e99b-4ca4-b0f3-15956fee5f34\") " pod="openstack/root-account-create-update-nmkvn"
Mar 19 12:26:57.741669 master-0 kubenswrapper[29612]: I0319 12:26:57.741405 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39d30f3c-9ff6-486e-b733-db361f4896c1-operator-scripts\") pod \"glance-ca05-account-create-update-cz8p9\" (UID: \"39d30f3c-9ff6-486e-b733-db361f4896c1\") " pod="openstack/glance-ca05-account-create-update-cz8p9"
Mar 19 12:26:57.741669 master-0 kubenswrapper[29612]: I0319 12:26:57.741451 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45bhs\" (UniqueName: \"kubernetes.io/projected/900c7116-e99b-4ca4-b0f3-15956fee5f34-kube-api-access-45bhs\") pod \"root-account-create-update-nmkvn\" (UID: \"900c7116-e99b-4ca4-b0f3-15956fee5f34\") " pod="openstack/root-account-create-update-nmkvn"
Mar 19 12:26:57.742467 master-0 kubenswrapper[29612]: I0319 12:26:57.742422 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39d30f3c-9ff6-486e-b733-db361f4896c1-operator-scripts\") pod \"glance-ca05-account-create-update-cz8p9\" (UID: \"39d30f3c-9ff6-486e-b733-db361f4896c1\") " pod="openstack/glance-ca05-account-create-update-cz8p9"
Mar 19 12:26:57.764290 master-0 kubenswrapper[29612]: I0319 12:26:57.764241 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hf2k\" (UniqueName: \"kubernetes.io/projected/39d30f3c-9ff6-486e-b733-db361f4896c1-kube-api-access-6hf2k\") pod \"glance-ca05-account-create-update-cz8p9\" (UID: \"39d30f3c-9ff6-486e-b733-db361f4896c1\") " pod="openstack/glance-ca05-account-create-update-cz8p9"
Mar 19 12:26:57.844580 master-0 kubenswrapper[29612]: I0319 12:26:57.844440 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/900c7116-e99b-4ca4-b0f3-15956fee5f34-operator-scripts\") pod \"root-account-create-update-nmkvn\" (UID: \"900c7116-e99b-4ca4-b0f3-15956fee5f34\") " pod="openstack/root-account-create-update-nmkvn"
Mar 19 12:26:57.845410 master-0 kubenswrapper[29612]: I0319 12:26:57.844541 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45bhs\" (UniqueName: \"kubernetes.io/projected/900c7116-e99b-4ca4-b0f3-15956fee5f34-kube-api-access-45bhs\") pod \"root-account-create-update-nmkvn\" (UID: \"900c7116-e99b-4ca4-b0f3-15956fee5f34\") " pod="openstack/root-account-create-update-nmkvn"
Mar 19 12:26:57.847175 master-0 kubenswrapper[29612]: I0319 12:26:57.845709 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/900c7116-e99b-4ca4-b0f3-15956fee5f34-operator-scripts\") pod \"root-account-create-update-nmkvn\" (UID: \"900c7116-e99b-4ca4-b0f3-15956fee5f34\") " pod="openstack/root-account-create-update-nmkvn"
Mar 19 12:26:57.873212 master-0 kubenswrapper[29612]: I0319 12:26:57.865046 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45bhs\" (UniqueName: \"kubernetes.io/projected/900c7116-e99b-4ca4-b0f3-15956fee5f34-kube-api-access-45bhs\") pod \"root-account-create-update-nmkvn\" (UID: \"900c7116-e99b-4ca4-b0f3-15956fee5f34\") " pod="openstack/root-account-create-update-nmkvn"
Mar 19 12:26:57.890866 master-0 kubenswrapper[29612]: I0319 12:26:57.888316 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ca05-account-create-update-cz8p9"
Mar 19 12:26:57.969935 master-0 kubenswrapper[29612]: I0319 12:26:57.969839 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nmkvn"
Mar 19 12:26:58.363346 master-0 kubenswrapper[29612]: I0319 12:26:58.363128 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:26:58.364007 master-0 kubenswrapper[29612]: E0319 12:26:58.363328 29612 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 19 12:26:58.364007 master-0 kubenswrapper[29612]: E0319 12:26:58.364004 29612 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 19 12:26:58.364150 master-0 kubenswrapper[29612]: E0319 12:26:58.364069 29612 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift podName:6e8fcd00-2db9-4463-85a5-e9a9ae7ea931 nodeName:}" failed. No retries permitted until 2026-03-19 12:27:14.364051085 +0000 UTC m=+1041.678220106 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift") pod "swift-storage-0" (UID: "6e8fcd00-2db9-4463-85a5-e9a9ae7ea931") : configmap "swift-ring-files" not found
Mar 19 12:26:58.423488 master-0 kubenswrapper[29612]: I0319 12:26:58.423421 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ca05-account-create-update-cz8p9"]
Mar 19 12:26:58.586609 master-0 kubenswrapper[29612]: I0319 12:26:58.586546 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nmkvn"]
Mar 19 12:26:58.820655 master-0 kubenswrapper[29612]: I0319 12:26:58.820570 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nmkvn" event={"ID":"900c7116-e99b-4ca4-b0f3-15956fee5f34","Type":"ContainerStarted","Data":"9a1fa36ffe5971d1c957e993c2930f289e8cd39157dc149a5b21a4f324d8c664"}
Mar 19 12:26:58.820655 master-0 kubenswrapper[29612]: I0319 12:26:58.820626 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nmkvn" event={"ID":"900c7116-e99b-4ca4-b0f3-15956fee5f34","Type":"ContainerStarted","Data":"356e82a03e7bd82630edd8a4dd96f8f8757f29e90995e7671d72bb92888757ea"}
Mar 19 12:26:58.822187 master-0 kubenswrapper[29612]: I0319 12:26:58.822146 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ca05-account-create-update-cz8p9" event={"ID":"39d30f3c-9ff6-486e-b733-db361f4896c1","Type":"ContainerStarted","Data":"f644e8c13a8d8a0b70d73a571330ff156be106f1b600bcbe8b33cfa1ae5aba3b"}
Mar 19 12:26:58.822187 master-0 kubenswrapper[29612]: I0319 12:26:58.822179 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ca05-account-create-update-cz8p9" event={"ID":"39d30f3c-9ff6-486e-b733-db361f4896c1","Type":"ContainerStarted","Data":"4645dec3c339794a200e7d7b3903a5cb498ec1d56726c4c48591d9fcf204fecd"}
Mar 19 12:26:58.849414 master-0 kubenswrapper[29612]: I0319 12:26:58.846555 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-ca05-account-create-update-cz8p9" podStartSLOduration=1.8465318659999999 podStartE2EDuration="1.846531866s" podCreationTimestamp="2026-03-19 12:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:26:58.845760874 +0000 UTC m=+1026.159929915" watchObservedRunningTime="2026-03-19 12:26:58.846531866 +0000 UTC m=+1026.160700897"
Mar 19 12:26:59.835029 master-0 kubenswrapper[29612]: I0319 12:26:59.834958 29612 generic.go:334] "Generic (PLEG): container finished" podID="900c7116-e99b-4ca4-b0f3-15956fee5f34" containerID="9a1fa36ffe5971d1c957e993c2930f289e8cd39157dc149a5b21a4f324d8c664" exitCode=0
Mar 19 12:26:59.835345 master-0 kubenswrapper[29612]: I0319 12:26:59.835088 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nmkvn" event={"ID":"900c7116-e99b-4ca4-b0f3-15956fee5f34","Type":"ContainerDied","Data":"9a1fa36ffe5971d1c957e993c2930f289e8cd39157dc149a5b21a4f324d8c664"}
Mar 19 12:26:59.837178 master-0 kubenswrapper[29612]: I0319 12:26:59.837133 29612 generic.go:334] "Generic (PLEG): container finished" podID="ddd50353-b1b0-4de2-a180-3461654949ab" containerID="a96775d9af9670e0ef66c775afcd14be76e2a607b880c679d69a21b2ac2236c9" exitCode=0
Mar 19 12:26:59.837255 master-0 kubenswrapper[29612]: I0319 12:26:59.837209 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tl6x5" event={"ID":"ddd50353-b1b0-4de2-a180-3461654949ab","Type":"ContainerDied","Data":"a96775d9af9670e0ef66c775afcd14be76e2a607b880c679d69a21b2ac2236c9"}
Mar 19 12:26:59.839968 master-0 kubenswrapper[29612]: I0319 12:26:59.839941 29612 generic.go:334] "Generic (PLEG): container finished" podID="39d30f3c-9ff6-486e-b733-db361f4896c1" containerID="f644e8c13a8d8a0b70d73a571330ff156be106f1b600bcbe8b33cfa1ae5aba3b" exitCode=0
Mar 19 12:26:59.840126 master-0 kubenswrapper[29612]: I0319 12:26:59.839975 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ca05-account-create-update-cz8p9" event={"ID":"39d30f3c-9ff6-486e-b733-db361f4896c1","Type":"ContainerDied","Data":"f644e8c13a8d8a0b70d73a571330ff156be106f1b600bcbe8b33cfa1ae5aba3b"}
Mar 19 12:27:00.920237 master-0 kubenswrapper[29612]: I0319 12:27:00.920165 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 19 12:27:00.920237 master-0 kubenswrapper[29612]: I0319 12:27:00.920214 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 19 12:27:01.006966 master-0 kubenswrapper[29612]: I0319 12:27:01.006896 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 19 12:27:01.470301 master-0 kubenswrapper[29612]: I0319 12:27:01.470240 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.601842 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-dispersionconf\") pod \"ddd50353-b1b0-4de2-a180-3461654949ab\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") "
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.602072 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-ring-data-devices\") pod \"ddd50353-b1b0-4de2-a180-3461654949ab\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") "
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.602154 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-combined-ca-bundle\") pod \"ddd50353-b1b0-4de2-a180-3461654949ab\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") "
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.602198 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ddd50353-b1b0-4de2-a180-3461654949ab-etc-swift\") pod \"ddd50353-b1b0-4de2-a180-3461654949ab\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") "
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.602263 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-scripts\") pod \"ddd50353-b1b0-4de2-a180-3461654949ab\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") "
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.602304 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmgdg\" (UniqueName: \"kubernetes.io/projected/ddd50353-b1b0-4de2-a180-3461654949ab-kube-api-access-qmgdg\") pod \"ddd50353-b1b0-4de2-a180-3461654949ab\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") "
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.602359 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-swiftconf\") pod \"ddd50353-b1b0-4de2-a180-3461654949ab\" (UID: \"ddd50353-b1b0-4de2-a180-3461654949ab\") "
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.604265 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ddd50353-b1b0-4de2-a180-3461654949ab" (UID: "ddd50353-b1b0-4de2-a180-3461654949ab"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.604466 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd50353-b1b0-4de2-a180-3461654949ab-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ddd50353-b1b0-4de2-a180-3461654949ab" (UID: "ddd50353-b1b0-4de2-a180-3461654949ab"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.607938 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd50353-b1b0-4de2-a180-3461654949ab-kube-api-access-qmgdg" (OuterVolumeSpecName: "kube-api-access-qmgdg") pod "ddd50353-b1b0-4de2-a180-3461654949ab" (UID: "ddd50353-b1b0-4de2-a180-3461654949ab"). InnerVolumeSpecName "kube-api-access-qmgdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:27:01.611800 master-0 kubenswrapper[29612]: I0319 12:27:01.609742 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ddd50353-b1b0-4de2-a180-3461654949ab" (UID: "ddd50353-b1b0-4de2-a180-3461654949ab"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:27:01.626937 master-0 kubenswrapper[29612]: I0319 12:27:01.626881 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-scripts" (OuterVolumeSpecName: "scripts") pod "ddd50353-b1b0-4de2-a180-3461654949ab" (UID: "ddd50353-b1b0-4de2-a180-3461654949ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:27:01.628953 master-0 kubenswrapper[29612]: I0319 12:27:01.628893 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd50353-b1b0-4de2-a180-3461654949ab" (UID: "ddd50353-b1b0-4de2-a180-3461654949ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:27:01.631796 master-0 kubenswrapper[29612]: I0319 12:27:01.631702 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ddd50353-b1b0-4de2-a180-3461654949ab" (UID: "ddd50353-b1b0-4de2-a180-3461654949ab"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:27:01.654369 master-0 kubenswrapper[29612]: I0319 12:27:01.653997 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-69mx9"]
Mar 19 12:27:01.654552 master-0 kubenswrapper[29612]: E0319 12:27:01.654451 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd50353-b1b0-4de2-a180-3461654949ab" containerName="swift-ring-rebalance"
Mar 19 12:27:01.654552 master-0 kubenswrapper[29612]: I0319 12:27:01.654466 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd50353-b1b0-4de2-a180-3461654949ab" containerName="swift-ring-rebalance"
Mar 19 12:27:01.654729 master-0 kubenswrapper[29612]: I0319 12:27:01.654708 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd50353-b1b0-4de2-a180-3461654949ab" containerName="swift-ring-rebalance"
Mar 19 12:27:01.655347 master-0 kubenswrapper[29612]: I0319 12:27:01.655317 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-69mx9"
Mar 19 12:27:01.713857 master-0 kubenswrapper[29612]: I0319 12:27:01.709977 29612 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-dispersionconf\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:01.713857 master-0 kubenswrapper[29612]: I0319 12:27:01.710033 29612 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-ring-data-devices\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:01.713857 master-0 kubenswrapper[29612]: I0319 12:27:01.710046 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:01.713857 master-0 kubenswrapper[29612]: I0319 12:27:01.710056 29612 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ddd50353-b1b0-4de2-a180-3461654949ab-etc-swift\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:01.713857 master-0 kubenswrapper[29612]: I0319 12:27:01.710067 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ddd50353-b1b0-4de2-a180-3461654949ab-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:01.713857 master-0 kubenswrapper[29612]: I0319 12:27:01.710078 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmgdg\" (UniqueName: \"kubernetes.io/projected/ddd50353-b1b0-4de2-a180-3461654949ab-kube-api-access-qmgdg\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:01.713857 master-0 kubenswrapper[29612]: I0319 12:27:01.710087 29612 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ddd50353-b1b0-4de2-a180-3461654949ab-swiftconf\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:01.725082 master-0 kubenswrapper[29612]: I0319 12:27:01.724979 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-69mx9"]
Mar 19 12:27:01.768133 master-0 kubenswrapper[29612]: I0319 12:27:01.768081 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ca05-account-create-update-cz8p9"
Mar 19 12:27:01.778025 master-0 kubenswrapper[29612]: I0319 12:27:01.777984 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nmkvn"
Mar 19 12:27:01.820671 master-0 kubenswrapper[29612]: I0319 12:27:01.817010 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngkgs\" (UniqueName: \"kubernetes.io/projected/68d511e8-9536-451b-bf43-e5660fd5123a-kube-api-access-ngkgs\") pod \"keystone-db-create-69mx9\" (UID: \"68d511e8-9536-451b-bf43-e5660fd5123a\") " pod="openstack/keystone-db-create-69mx9"
Mar 19 12:27:01.820671 master-0 kubenswrapper[29612]: I0319 12:27:01.817257 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d511e8-9536-451b-bf43-e5660fd5123a-operator-scripts\") pod \"keystone-db-create-69mx9\" (UID: \"68d511e8-9536-451b-bf43-e5660fd5123a\") " pod="openstack/keystone-db-create-69mx9"
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: I0319 12:27:01.870061 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ca05-account-create-update-cz8p9" event={"ID":"39d30f3c-9ff6-486e-b733-db361f4896c1","Type":"ContainerDied","Data":"4645dec3c339794a200e7d7b3903a5cb498ec1d56726c4c48591d9fcf204fecd"}
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: I0319 12:27:01.870109 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4645dec3c339794a200e7d7b3903a5cb498ec1d56726c4c48591d9fcf204fecd"
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: I0319 12:27:01.870188 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ca05-account-create-update-cz8p9"
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: I0319 12:27:01.871054 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-17c2-account-create-update-xg5kj"]
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: E0319 12:27:01.871506 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900c7116-e99b-4ca4-b0f3-15956fee5f34" containerName="mariadb-account-create-update"
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: I0319 12:27:01.871524 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="900c7116-e99b-4ca4-b0f3-15956fee5f34" containerName="mariadb-account-create-update"
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: E0319 12:27:01.871586 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d30f3c-9ff6-486e-b733-db361f4896c1" containerName="mariadb-account-create-update"
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: I0319 12:27:01.871594 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d30f3c-9ff6-486e-b733-db361f4896c1" containerName="mariadb-account-create-update"
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: I0319 12:27:01.871891 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="39d30f3c-9ff6-486e-b733-db361f4896c1" containerName="mariadb-account-create-update"
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: I0319 12:27:01.871911 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="900c7116-e99b-4ca4-b0f3-15956fee5f34" containerName="mariadb-account-create-update"
Mar 19 12:27:01.872758 master-0 kubenswrapper[29612]: I0319 12:27:01.872742 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-17c2-account-create-update-xg5kj"
Mar 19 12:27:01.878924 master-0 kubenswrapper[29612]: I0319 12:27:01.875531 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nmkvn" event={"ID":"900c7116-e99b-4ca4-b0f3-15956fee5f34","Type":"ContainerDied","Data":"356e82a03e7bd82630edd8a4dd96f8f8757f29e90995e7671d72bb92888757ea"}
Mar 19 12:27:01.878924 master-0 kubenswrapper[29612]: I0319 12:27:01.875579 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="356e82a03e7bd82630edd8a4dd96f8f8757f29e90995e7671d72bb92888757ea"
Mar 19 12:27:01.878924 master-0 kubenswrapper[29612]: I0319 12:27:01.875620 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nmkvn"
Mar 19 12:27:01.878924 master-0 kubenswrapper[29612]: I0319 12:27:01.876293 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 19 12:27:01.878924 master-0 kubenswrapper[29612]: I0319 12:27:01.877452 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tl6x5"
Mar 19 12:27:01.878924 master-0 kubenswrapper[29612]: I0319 12:27:01.877477 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tl6x5" event={"ID":"ddd50353-b1b0-4de2-a180-3461654949ab","Type":"ContainerDied","Data":"2b1683386e86ffbdc9bdeb81e7b1c87e1166d6a7a11d447c72363b8efb091c08"}
Mar 19 12:27:01.878924 master-0 kubenswrapper[29612]: I0319 12:27:01.877492 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b1683386e86ffbdc9bdeb81e7b1c87e1166d6a7a11d447c72363b8efb091c08"
Mar 19 12:27:01.914985 master-0 kubenswrapper[29612]: I0319 12:27:01.903269 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-17c2-account-create-update-xg5kj"]
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.920938 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hf2k\" (UniqueName: \"kubernetes.io/projected/39d30f3c-9ff6-486e-b733-db361f4896c1-kube-api-access-6hf2k\") pod \"39d30f3c-9ff6-486e-b733-db361f4896c1\" (UID: \"39d30f3c-9ff6-486e-b733-db361f4896c1\") "
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.921042 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/900c7116-e99b-4ca4-b0f3-15956fee5f34-operator-scripts\") pod \"900c7116-e99b-4ca4-b0f3-15956fee5f34\" (UID: \"900c7116-e99b-4ca4-b0f3-15956fee5f34\") "
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.921212 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39d30f3c-9ff6-486e-b733-db361f4896c1-operator-scripts\") pod \"39d30f3c-9ff6-486e-b733-db361f4896c1\" (UID: \"39d30f3c-9ff6-486e-b733-db361f4896c1\") "
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.921321 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45bhs\" (UniqueName: \"kubernetes.io/projected/900c7116-e99b-4ca4-b0f3-15956fee5f34-kube-api-access-45bhs\") pod \"900c7116-e99b-4ca4-b0f3-15956fee5f34\" (UID: \"900c7116-e99b-4ca4-b0f3-15956fee5f34\") "
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.922990 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngkgs\" (UniqueName: \"kubernetes.io/projected/68d511e8-9536-451b-bf43-e5660fd5123a-kube-api-access-ngkgs\") pod \"keystone-db-create-69mx9\" (UID: \"68d511e8-9536-451b-bf43-e5660fd5123a\") " pod="openstack/keystone-db-create-69mx9"
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.923872 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d511e8-9536-451b-bf43-e5660fd5123a-operator-scripts\") pod \"keystone-db-create-69mx9\" (UID: \"68d511e8-9536-451b-bf43-e5660fd5123a\") " pod="openstack/keystone-db-create-69mx9"
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.924032 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39d30f3c-9ff6-486e-b733-db361f4896c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39d30f3c-9ff6-486e-b733-db361f4896c1" (UID: "39d30f3c-9ff6-486e-b733-db361f4896c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.924098 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900c7116-e99b-4ca4-b0f3-15956fee5f34-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "900c7116-e99b-4ca4-b0f3-15956fee5f34" (UID: "900c7116-e99b-4ca4-b0f3-15956fee5f34"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.924275 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/900c7116-e99b-4ca4-b0f3-15956fee5f34-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.924297 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39d30f3c-9ff6-486e-b733-db361f4896c1-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:01.929844 master-0 kubenswrapper[29612]: I0319 12:27:01.925505 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d511e8-9536-451b-bf43-e5660fd5123a-operator-scripts\") pod \"keystone-db-create-69mx9\" (UID: \"68d511e8-9536-451b-bf43-e5660fd5123a\") " pod="openstack/keystone-db-create-69mx9"
Mar 19 12:27:01.946748 master-0 kubenswrapper[29612]: I0319 12:27:01.942961 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900c7116-e99b-4ca4-b0f3-15956fee5f34-kube-api-access-45bhs" (OuterVolumeSpecName: "kube-api-access-45bhs") pod "900c7116-e99b-4ca4-b0f3-15956fee5f34" (UID: "900c7116-e99b-4ca4-b0f3-15956fee5f34"). InnerVolumeSpecName "kube-api-access-45bhs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:27:01.964702 master-0 kubenswrapper[29612]: I0319 12:27:01.963083 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d30f3c-9ff6-486e-b733-db361f4896c1-kube-api-access-6hf2k" (OuterVolumeSpecName: "kube-api-access-6hf2k") pod "39d30f3c-9ff6-486e-b733-db361f4896c1" (UID: "39d30f3c-9ff6-486e-b733-db361f4896c1"). InnerVolumeSpecName "kube-api-access-6hf2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:27:01.969629 master-0 kubenswrapper[29612]: I0319 12:27:01.969358 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngkgs\" (UniqueName: \"kubernetes.io/projected/68d511e8-9536-451b-bf43-e5660fd5123a-kube-api-access-ngkgs\") pod \"keystone-db-create-69mx9\" (UID: \"68d511e8-9536-451b-bf43-e5660fd5123a\") " pod="openstack/keystone-db-create-69mx9"
Mar 19 12:27:01.994228 master-0 kubenswrapper[29612]: I0319 12:27:01.992286 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kb7ng"]
Mar 19 12:27:01.998141 master-0 kubenswrapper[29612]: I0319 12:27:01.994659 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kb7ng"
Mar 19 12:27:02.010860 master-0 kubenswrapper[29612]: I0319 12:27:02.010784 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kb7ng"]
Mar 19 12:27:02.014581 master-0 kubenswrapper[29612]: I0319 12:27:02.014535 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 19 12:27:02.027672 master-0 kubenswrapper[29612]: I0319 12:27:02.026312 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40581f33-2061-4f2d-b9b1-13396333b6a4-operator-scripts\") pod \"keystone-17c2-account-create-update-xg5kj\" (UID: \"40581f33-2061-4f2d-b9b1-13396333b6a4\") " pod="openstack/keystone-17c2-account-create-update-xg5kj"
Mar 19 12:27:02.027672 master-0 kubenswrapper[29612]: I0319 12:27:02.026581 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vdj9\" (UniqueName: \"kubernetes.io/projected/40581f33-2061-4f2d-b9b1-13396333b6a4-kube-api-access-8vdj9\") pod \"keystone-17c2-account-create-update-xg5kj\" (UID: \"40581f33-2061-4f2d-b9b1-13396333b6a4\") " pod="openstack/keystone-17c2-account-create-update-xg5kj"
Mar 19 12:27:02.030920 master-0 kubenswrapper[29612]: I0319 12:27:02.030793 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45bhs\" (UniqueName: \"kubernetes.io/projected/900c7116-e99b-4ca4-b0f3-15956fee5f34-kube-api-access-45bhs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:02.031247 master-0 kubenswrapper[29612]: I0319 12:27:02.031077 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hf2k\" (UniqueName: \"kubernetes.io/projected/39d30f3c-9ff6-486e-b733-db361f4896c1-kube-api-access-6hf2k\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:02.090033 master-0 kubenswrapper[29612]: I0319 12:27:02.089953 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c0f4-account-create-update-vlptl"]
Mar 19 12:27:02.092038 master-0 kubenswrapper[29612]: I0319 12:27:02.091977 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-69mx9"
Mar 19 12:27:02.093536 master-0 kubenswrapper[29612]: I0319 12:27:02.093505 29612 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-c0f4-account-create-update-vlptl" Mar 19 12:27:02.100114 master-0 kubenswrapper[29612]: I0319 12:27:02.099949 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 19 12:27:02.134042 master-0 kubenswrapper[29612]: I0319 12:27:02.133466 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40581f33-2061-4f2d-b9b1-13396333b6a4-operator-scripts\") pod \"keystone-17c2-account-create-update-xg5kj\" (UID: \"40581f33-2061-4f2d-b9b1-13396333b6a4\") " pod="openstack/keystone-17c2-account-create-update-xg5kj" Mar 19 12:27:02.134042 master-0 kubenswrapper[29612]: I0319 12:27:02.133612 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vdj9\" (UniqueName: \"kubernetes.io/projected/40581f33-2061-4f2d-b9b1-13396333b6a4-kube-api-access-8vdj9\") pod \"keystone-17c2-account-create-update-xg5kj\" (UID: \"40581f33-2061-4f2d-b9b1-13396333b6a4\") " pod="openstack/keystone-17c2-account-create-update-xg5kj" Mar 19 12:27:02.134042 master-0 kubenswrapper[29612]: I0319 12:27:02.133747 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndzrx\" (UniqueName: \"kubernetes.io/projected/d7c1e578-52c8-42f8-a392-a1913a2ee93e-kube-api-access-ndzrx\") pod \"placement-db-create-kb7ng\" (UID: \"d7c1e578-52c8-42f8-a392-a1913a2ee93e\") " pod="openstack/placement-db-create-kb7ng" Mar 19 12:27:02.134042 master-0 kubenswrapper[29612]: I0319 12:27:02.133777 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c1e578-52c8-42f8-a392-a1913a2ee93e-operator-scripts\") pod \"placement-db-create-kb7ng\" (UID: \"d7c1e578-52c8-42f8-a392-a1913a2ee93e\") " pod="openstack/placement-db-create-kb7ng" Mar 19 
12:27:02.136525 master-0 kubenswrapper[29612]: I0319 12:27:02.136492 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40581f33-2061-4f2d-b9b1-13396333b6a4-operator-scripts\") pod \"keystone-17c2-account-create-update-xg5kj\" (UID: \"40581f33-2061-4f2d-b9b1-13396333b6a4\") " pod="openstack/keystone-17c2-account-create-update-xg5kj" Mar 19 12:27:02.146298 master-0 kubenswrapper[29612]: I0319 12:27:02.146133 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c0f4-account-create-update-vlptl"] Mar 19 12:27:02.159161 master-0 kubenswrapper[29612]: I0319 12:27:02.159084 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vdj9\" (UniqueName: \"kubernetes.io/projected/40581f33-2061-4f2d-b9b1-13396333b6a4-kube-api-access-8vdj9\") pod \"keystone-17c2-account-create-update-xg5kj\" (UID: \"40581f33-2061-4f2d-b9b1-13396333b6a4\") " pod="openstack/keystone-17c2-account-create-update-xg5kj" Mar 19 12:27:02.214311 master-0 kubenswrapper[29612]: I0319 12:27:02.214261 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-17c2-account-create-update-xg5kj" Mar 19 12:27:02.235877 master-0 kubenswrapper[29612]: I0319 12:27:02.235744 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjpll\" (UniqueName: \"kubernetes.io/projected/12242f2c-55d1-422d-a775-5688b8dcb092-kube-api-access-hjpll\") pod \"placement-c0f4-account-create-update-vlptl\" (UID: \"12242f2c-55d1-422d-a775-5688b8dcb092\") " pod="openstack/placement-c0f4-account-create-update-vlptl" Mar 19 12:27:02.236279 master-0 kubenswrapper[29612]: I0319 12:27:02.236224 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12242f2c-55d1-422d-a775-5688b8dcb092-operator-scripts\") pod \"placement-c0f4-account-create-update-vlptl\" (UID: \"12242f2c-55d1-422d-a775-5688b8dcb092\") " pod="openstack/placement-c0f4-account-create-update-vlptl" Mar 19 12:27:02.236525 master-0 kubenswrapper[29612]: I0319 12:27:02.236503 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndzrx\" (UniqueName: \"kubernetes.io/projected/d7c1e578-52c8-42f8-a392-a1913a2ee93e-kube-api-access-ndzrx\") pod \"placement-db-create-kb7ng\" (UID: \"d7c1e578-52c8-42f8-a392-a1913a2ee93e\") " pod="openstack/placement-db-create-kb7ng" Mar 19 12:27:02.236604 master-0 kubenswrapper[29612]: I0319 12:27:02.236540 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c1e578-52c8-42f8-a392-a1913a2ee93e-operator-scripts\") pod \"placement-db-create-kb7ng\" (UID: \"d7c1e578-52c8-42f8-a392-a1913a2ee93e\") " pod="openstack/placement-db-create-kb7ng" Mar 19 12:27:02.237385 master-0 kubenswrapper[29612]: I0319 12:27:02.237346 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d7c1e578-52c8-42f8-a392-a1913a2ee93e-operator-scripts\") pod \"placement-db-create-kb7ng\" (UID: \"d7c1e578-52c8-42f8-a392-a1913a2ee93e\") " pod="openstack/placement-db-create-kb7ng" Mar 19 12:27:02.255539 master-0 kubenswrapper[29612]: I0319 12:27:02.255245 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndzrx\" (UniqueName: \"kubernetes.io/projected/d7c1e578-52c8-42f8-a392-a1913a2ee93e-kube-api-access-ndzrx\") pod \"placement-db-create-kb7ng\" (UID: \"d7c1e578-52c8-42f8-a392-a1913a2ee93e\") " pod="openstack/placement-db-create-kb7ng" Mar 19 12:27:02.336505 master-0 kubenswrapper[29612]: I0319 12:27:02.336433 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kb7ng" Mar 19 12:27:02.339190 master-0 kubenswrapper[29612]: I0319 12:27:02.338774 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12242f2c-55d1-422d-a775-5688b8dcb092-operator-scripts\") pod \"placement-c0f4-account-create-update-vlptl\" (UID: \"12242f2c-55d1-422d-a775-5688b8dcb092\") " pod="openstack/placement-c0f4-account-create-update-vlptl" Mar 19 12:27:02.339190 master-0 kubenswrapper[29612]: I0319 12:27:02.338961 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjpll\" (UniqueName: \"kubernetes.io/projected/12242f2c-55d1-422d-a775-5688b8dcb092-kube-api-access-hjpll\") pod \"placement-c0f4-account-create-update-vlptl\" (UID: \"12242f2c-55d1-422d-a775-5688b8dcb092\") " pod="openstack/placement-c0f4-account-create-update-vlptl" Mar 19 12:27:02.340270 master-0 kubenswrapper[29612]: I0319 12:27:02.340123 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12242f2c-55d1-422d-a775-5688b8dcb092-operator-scripts\") pod 
\"placement-c0f4-account-create-update-vlptl\" (UID: \"12242f2c-55d1-422d-a775-5688b8dcb092\") " pod="openstack/placement-c0f4-account-create-update-vlptl" Mar 19 12:27:02.366290 master-0 kubenswrapper[29612]: I0319 12:27:02.366146 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjpll\" (UniqueName: \"kubernetes.io/projected/12242f2c-55d1-422d-a775-5688b8dcb092-kube-api-access-hjpll\") pod \"placement-c0f4-account-create-update-vlptl\" (UID: \"12242f2c-55d1-422d-a775-5688b8dcb092\") " pod="openstack/placement-c0f4-account-create-update-vlptl" Mar 19 12:27:02.428740 master-0 kubenswrapper[29612]: I0319 12:27:02.428530 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c0f4-account-create-update-vlptl" Mar 19 12:27:02.620855 master-0 kubenswrapper[29612]: I0319 12:27:02.620815 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-69mx9"] Mar 19 12:27:02.769214 master-0 kubenswrapper[29612]: I0319 12:27:02.768919 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-17c2-account-create-update-xg5kj"] Mar 19 12:27:02.885857 master-0 kubenswrapper[29612]: I0319 12:27:02.885797 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kb7ng"] Mar 19 12:27:02.896667 master-0 kubenswrapper[29612]: I0319 12:27:02.893662 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-17c2-account-create-update-xg5kj" event={"ID":"40581f33-2061-4f2d-b9b1-13396333b6a4","Type":"ContainerStarted","Data":"72f51282bf750b6d4746e540c89256263a9a5f0f30472bdc149baed11b633224"} Mar 19 12:27:02.899651 master-0 kubenswrapper[29612]: I0319 12:27:02.899567 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-69mx9" 
event={"ID":"68d511e8-9536-451b-bf43-e5660fd5123a","Type":"ContainerStarted","Data":"68328a25efdc74fd0c62ffb7d6f01edc12680b7e6848ae041bd2b3c2c80fb763"} Mar 19 12:27:02.899726 master-0 kubenswrapper[29612]: I0319 12:27:02.899655 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-69mx9" event={"ID":"68d511e8-9536-451b-bf43-e5660fd5123a","Type":"ContainerStarted","Data":"dbc491299871137c3af87910a07d8279f683d2a7557367c06b508b4613da1d5e"} Mar 19 12:27:02.955055 master-0 kubenswrapper[29612]: I0319 12:27:02.954974 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-69mx9" podStartSLOduration=1.9549556030000002 podStartE2EDuration="1.954955603s" podCreationTimestamp="2026-03-19 12:27:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:27:02.92672941 +0000 UTC m=+1030.240898431" watchObservedRunningTime="2026-03-19 12:27:02.954955603 +0000 UTC m=+1030.269124624" Mar 19 12:27:03.049747 master-0 kubenswrapper[29612]: I0319 12:27:03.049062 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c0f4-account-create-update-vlptl"] Mar 19 12:27:03.061032 master-0 kubenswrapper[29612]: W0319 12:27:03.059978 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12242f2c_55d1_422d_a775_5688b8dcb092.slice/crio-4f3ac5e669bf02a676386a8506a9109d134ce2911c246b9b09b4b167d634614c WatchSource:0}: Error finding container 4f3ac5e669bf02a676386a8506a9109d134ce2911c246b9b09b4b167d634614c: Status 404 returned error can't find the container with id 4f3ac5e669bf02a676386a8506a9109d134ce2911c246b9b09b4b167d634614c Mar 19 12:27:03.917436 master-0 kubenswrapper[29612]: I0319 12:27:03.917386 29612 generic.go:334] "Generic (PLEG): container finished" podID="d7c1e578-52c8-42f8-a392-a1913a2ee93e" 
containerID="10ced7228a6cd4213fbbe0dd9fb136f6adae3a9234edf193798b6f211b5373e7" exitCode=0 Mar 19 12:27:03.917732 master-0 kubenswrapper[29612]: I0319 12:27:03.917466 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kb7ng" event={"ID":"d7c1e578-52c8-42f8-a392-a1913a2ee93e","Type":"ContainerDied","Data":"10ced7228a6cd4213fbbe0dd9fb136f6adae3a9234edf193798b6f211b5373e7"} Mar 19 12:27:03.917732 master-0 kubenswrapper[29612]: I0319 12:27:03.917500 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kb7ng" event={"ID":"d7c1e578-52c8-42f8-a392-a1913a2ee93e","Type":"ContainerStarted","Data":"62a048f2174455191bb2a14ff0c314dd45318ce32dfbf5ab5f58013ac4c458b9"} Mar 19 12:27:03.920904 master-0 kubenswrapper[29612]: I0319 12:27:03.920854 29612 generic.go:334] "Generic (PLEG): container finished" podID="12242f2c-55d1-422d-a775-5688b8dcb092" containerID="3923e303eb586757e5d02d43f6a54cbb32cc01575e765ed37602847d810d383c" exitCode=0 Mar 19 12:27:03.921056 master-0 kubenswrapper[29612]: I0319 12:27:03.920924 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c0f4-account-create-update-vlptl" event={"ID":"12242f2c-55d1-422d-a775-5688b8dcb092","Type":"ContainerDied","Data":"3923e303eb586757e5d02d43f6a54cbb32cc01575e765ed37602847d810d383c"} Mar 19 12:27:03.921056 master-0 kubenswrapper[29612]: I0319 12:27:03.920952 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c0f4-account-create-update-vlptl" event={"ID":"12242f2c-55d1-422d-a775-5688b8dcb092","Type":"ContainerStarted","Data":"4f3ac5e669bf02a676386a8506a9109d134ce2911c246b9b09b4b167d634614c"} Mar 19 12:27:03.924210 master-0 kubenswrapper[29612]: I0319 12:27:03.924151 29612 generic.go:334] "Generic (PLEG): container finished" podID="40581f33-2061-4f2d-b9b1-13396333b6a4" containerID="f6b4fba7b763c2ca76cb4bf0055f85e8b7b519e13383fec4c9c5fb142b31b6f2" exitCode=0 Mar 19 12:27:03.924394 master-0 
kubenswrapper[29612]: I0319 12:27:03.924260 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-17c2-account-create-update-xg5kj" event={"ID":"40581f33-2061-4f2d-b9b1-13396333b6a4","Type":"ContainerDied","Data":"f6b4fba7b763c2ca76cb4bf0055f85e8b7b519e13383fec4c9c5fb142b31b6f2"} Mar 19 12:27:03.926840 master-0 kubenswrapper[29612]: I0319 12:27:03.926782 29612 generic.go:334] "Generic (PLEG): container finished" podID="68d511e8-9536-451b-bf43-e5660fd5123a" containerID="68328a25efdc74fd0c62ffb7d6f01edc12680b7e6848ae041bd2b3c2c80fb763" exitCode=0 Mar 19 12:27:03.927051 master-0 kubenswrapper[29612]: I0319 12:27:03.927012 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-69mx9" event={"ID":"68d511e8-9536-451b-bf43-e5660fd5123a","Type":"ContainerDied","Data":"68328a25efdc74fd0c62ffb7d6f01edc12680b7e6848ae041bd2b3c2c80fb763"} Mar 19 12:27:05.440625 master-0 kubenswrapper[29612]: I0319 12:27:05.440588 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kb7ng" Mar 19 12:27:05.518482 master-0 kubenswrapper[29612]: I0319 12:27:05.518396 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-q8b25"] Mar 19 12:27:05.519347 master-0 kubenswrapper[29612]: E0319 12:27:05.519305 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c1e578-52c8-42f8-a392-a1913a2ee93e" containerName="mariadb-database-create" Mar 19 12:27:05.519347 master-0 kubenswrapper[29612]: I0319 12:27:05.519335 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c1e578-52c8-42f8-a392-a1913a2ee93e" containerName="mariadb-database-create" Mar 19 12:27:05.519590 master-0 kubenswrapper[29612]: I0319 12:27:05.519559 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c1e578-52c8-42f8-a392-a1913a2ee93e" containerName="mariadb-database-create" Mar 19 12:27:05.520413 master-0 kubenswrapper[29612]: I0319 12:27:05.520372 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-q8b25" Mar 19 12:27:05.520852 master-0 kubenswrapper[29612]: I0319 12:27:05.520816 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndzrx\" (UniqueName: \"kubernetes.io/projected/d7c1e578-52c8-42f8-a392-a1913a2ee93e-kube-api-access-ndzrx\") pod \"d7c1e578-52c8-42f8-a392-a1913a2ee93e\" (UID: \"d7c1e578-52c8-42f8-a392-a1913a2ee93e\") " Mar 19 12:27:05.520937 master-0 kubenswrapper[29612]: I0319 12:27:05.520924 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c1e578-52c8-42f8-a392-a1913a2ee93e-operator-scripts\") pod \"d7c1e578-52c8-42f8-a392-a1913a2ee93e\" (UID: \"d7c1e578-52c8-42f8-a392-a1913a2ee93e\") " Mar 19 12:27:05.522005 master-0 kubenswrapper[29612]: I0319 12:27:05.521940 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c1e578-52c8-42f8-a392-a1913a2ee93e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7c1e578-52c8-42f8-a392-a1913a2ee93e" (UID: "d7c1e578-52c8-42f8-a392-a1913a2ee93e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:05.527863 master-0 kubenswrapper[29612]: I0319 12:27:05.527535 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-q8b25"] Mar 19 12:27:05.528574 master-0 kubenswrapper[29612]: I0319 12:27:05.528512 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c1e578-52c8-42f8-a392-a1913a2ee93e-kube-api-access-ndzrx" (OuterVolumeSpecName: "kube-api-access-ndzrx") pod "d7c1e578-52c8-42f8-a392-a1913a2ee93e" (UID: "d7c1e578-52c8-42f8-a392-a1913a2ee93e"). InnerVolumeSpecName "kube-api-access-ndzrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:27:05.633698 master-0 kubenswrapper[29612]: I0319 12:27:05.633597 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dae27604-eeed-4c63-9655-46beecfe4168-operator-scripts\") pod \"glance-db-create-q8b25\" (UID: \"dae27604-eeed-4c63-9655-46beecfe4168\") " pod="openstack/glance-db-create-q8b25" Mar 19 12:27:05.633947 master-0 kubenswrapper[29612]: I0319 12:27:05.633758 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn7kq\" (UniqueName: \"kubernetes.io/projected/dae27604-eeed-4c63-9655-46beecfe4168-kube-api-access-jn7kq\") pod \"glance-db-create-q8b25\" (UID: \"dae27604-eeed-4c63-9655-46beecfe4168\") " pod="openstack/glance-db-create-q8b25" Mar 19 12:27:05.633947 master-0 kubenswrapper[29612]: I0319 12:27:05.633862 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndzrx\" (UniqueName: \"kubernetes.io/projected/d7c1e578-52c8-42f8-a392-a1913a2ee93e-kube-api-access-ndzrx\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:05.633947 master-0 kubenswrapper[29612]: I0319 12:27:05.633879 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c1e578-52c8-42f8-a392-a1913a2ee93e-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:05.736655 master-0 kubenswrapper[29612]: I0319 12:27:05.736041 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dae27604-eeed-4c63-9655-46beecfe4168-operator-scripts\") pod \"glance-db-create-q8b25\" (UID: \"dae27604-eeed-4c63-9655-46beecfe4168\") " pod="openstack/glance-db-create-q8b25" Mar 19 12:27:05.736655 master-0 kubenswrapper[29612]: I0319 12:27:05.736242 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jn7kq\" (UniqueName: \"kubernetes.io/projected/dae27604-eeed-4c63-9655-46beecfe4168-kube-api-access-jn7kq\") pod \"glance-db-create-q8b25\" (UID: \"dae27604-eeed-4c63-9655-46beecfe4168\") " pod="openstack/glance-db-create-q8b25" Mar 19 12:27:05.740649 master-0 kubenswrapper[29612]: I0319 12:27:05.737657 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dae27604-eeed-4c63-9655-46beecfe4168-operator-scripts\") pod \"glance-db-create-q8b25\" (UID: \"dae27604-eeed-4c63-9655-46beecfe4168\") " pod="openstack/glance-db-create-q8b25" Mar 19 12:27:05.759321 master-0 kubenswrapper[29612]: I0319 12:27:05.759272 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn7kq\" (UniqueName: \"kubernetes.io/projected/dae27604-eeed-4c63-9655-46beecfe4168-kube-api-access-jn7kq\") pod \"glance-db-create-q8b25\" (UID: \"dae27604-eeed-4c63-9655-46beecfe4168\") " pod="openstack/glance-db-create-q8b25" Mar 19 12:27:05.853004 master-0 kubenswrapper[29612]: I0319 12:27:05.852932 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-17c2-account-create-update-xg5kj" Mar 19 12:27:05.862567 master-0 kubenswrapper[29612]: I0319 12:27:05.862504 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c0f4-account-create-update-vlptl" Mar 19 12:27:05.867853 master-0 kubenswrapper[29612]: I0319 12:27:05.867806 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-q8b25" Mar 19 12:27:05.883919 master-0 kubenswrapper[29612]: I0319 12:27:05.881542 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-69mx9" Mar 19 12:27:05.939117 master-0 kubenswrapper[29612]: I0319 12:27:05.938571 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vdj9\" (UniqueName: \"kubernetes.io/projected/40581f33-2061-4f2d-b9b1-13396333b6a4-kube-api-access-8vdj9\") pod \"40581f33-2061-4f2d-b9b1-13396333b6a4\" (UID: \"40581f33-2061-4f2d-b9b1-13396333b6a4\") " Mar 19 12:27:05.939117 master-0 kubenswrapper[29612]: I0319 12:27:05.938657 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40581f33-2061-4f2d-b9b1-13396333b6a4-operator-scripts\") pod \"40581f33-2061-4f2d-b9b1-13396333b6a4\" (UID: \"40581f33-2061-4f2d-b9b1-13396333b6a4\") " Mar 19 12:27:05.939117 master-0 kubenswrapper[29612]: I0319 12:27:05.938705 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d511e8-9536-451b-bf43-e5660fd5123a-operator-scripts\") pod \"68d511e8-9536-451b-bf43-e5660fd5123a\" (UID: \"68d511e8-9536-451b-bf43-e5660fd5123a\") " Mar 19 12:27:05.939800 master-0 kubenswrapper[29612]: I0319 12:27:05.939769 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjpll\" (UniqueName: \"kubernetes.io/projected/12242f2c-55d1-422d-a775-5688b8dcb092-kube-api-access-hjpll\") pod \"12242f2c-55d1-422d-a775-5688b8dcb092\" (UID: \"12242f2c-55d1-422d-a775-5688b8dcb092\") " Mar 19 12:27:05.939956 master-0 kubenswrapper[29612]: I0319 12:27:05.939934 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngkgs\" (UniqueName: \"kubernetes.io/projected/68d511e8-9536-451b-bf43-e5660fd5123a-kube-api-access-ngkgs\") pod \"68d511e8-9536-451b-bf43-e5660fd5123a\" (UID: \"68d511e8-9536-451b-bf43-e5660fd5123a\") " Mar 19 12:27:05.940034 master-0 
kubenswrapper[29612]: I0319 12:27:05.940016 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12242f2c-55d1-422d-a775-5688b8dcb092-operator-scripts\") pod \"12242f2c-55d1-422d-a775-5688b8dcb092\" (UID: \"12242f2c-55d1-422d-a775-5688b8dcb092\") " Mar 19 12:27:05.940658 master-0 kubenswrapper[29612]: I0319 12:27:05.940593 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d511e8-9536-451b-bf43-e5660fd5123a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68d511e8-9536-451b-bf43-e5660fd5123a" (UID: "68d511e8-9536-451b-bf43-e5660fd5123a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:05.941226 master-0 kubenswrapper[29612]: I0319 12:27:05.941191 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40581f33-2061-4f2d-b9b1-13396333b6a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40581f33-2061-4f2d-b9b1-13396333b6a4" (UID: "40581f33-2061-4f2d-b9b1-13396333b6a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:05.941370 master-0 kubenswrapper[29612]: I0319 12:27:05.941337 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40581f33-2061-4f2d-b9b1-13396333b6a4-kube-api-access-8vdj9" (OuterVolumeSpecName: "kube-api-access-8vdj9") pod "40581f33-2061-4f2d-b9b1-13396333b6a4" (UID: "40581f33-2061-4f2d-b9b1-13396333b6a4"). InnerVolumeSpecName "kube-api-access-8vdj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:27:05.941764 master-0 kubenswrapper[29612]: I0319 12:27:05.941736 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vdj9\" (UniqueName: \"kubernetes.io/projected/40581f33-2061-4f2d-b9b1-13396333b6a4-kube-api-access-8vdj9\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:05.941840 master-0 kubenswrapper[29612]: I0319 12:27:05.941768 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40581f33-2061-4f2d-b9b1-13396333b6a4-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:05.941840 master-0 kubenswrapper[29612]: I0319 12:27:05.941783 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d511e8-9536-451b-bf43-e5660fd5123a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:05.942733 master-0 kubenswrapper[29612]: I0319 12:27:05.942701 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12242f2c-55d1-422d-a775-5688b8dcb092-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12242f2c-55d1-422d-a775-5688b8dcb092" (UID: "12242f2c-55d1-422d-a775-5688b8dcb092"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:05.943404 master-0 kubenswrapper[29612]: I0319 12:27:05.943369 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d511e8-9536-451b-bf43-e5660fd5123a-kube-api-access-ngkgs" (OuterVolumeSpecName: "kube-api-access-ngkgs") pod "68d511e8-9536-451b-bf43-e5660fd5123a" (UID: "68d511e8-9536-451b-bf43-e5660fd5123a"). InnerVolumeSpecName "kube-api-access-ngkgs". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:27:05.946870 master-0 kubenswrapper[29612]: I0319 12:27:05.946801 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12242f2c-55d1-422d-a775-5688b8dcb092-kube-api-access-hjpll" (OuterVolumeSpecName: "kube-api-access-hjpll") pod "12242f2c-55d1-422d-a775-5688b8dcb092" (UID: "12242f2c-55d1-422d-a775-5688b8dcb092"). InnerVolumeSpecName "kube-api-access-hjpll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:27:05.962788 master-0 kubenswrapper[29612]: I0319 12:27:05.961572 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c0f4-account-create-update-vlptl"
Mar 19 12:27:05.962788 master-0 kubenswrapper[29612]: I0319 12:27:05.961590 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c0f4-account-create-update-vlptl" event={"ID":"12242f2c-55d1-422d-a775-5688b8dcb092","Type":"ContainerDied","Data":"4f3ac5e669bf02a676386a8506a9109d134ce2911c246b9b09b4b167d634614c"}
Mar 19 12:27:05.962788 master-0 kubenswrapper[29612]: I0319 12:27:05.961656 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f3ac5e669bf02a676386a8506a9109d134ce2911c246b9b09b4b167d634614c"
Mar 19 12:27:05.967339 master-0 kubenswrapper[29612]: I0319 12:27:05.967288 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-17c2-account-create-update-xg5kj" event={"ID":"40581f33-2061-4f2d-b9b1-13396333b6a4","Type":"ContainerDied","Data":"72f51282bf750b6d4746e540c89256263a9a5f0f30472bdc149baed11b633224"}
Mar 19 12:27:05.967339 master-0 kubenswrapper[29612]: I0319 12:27:05.967320 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f51282bf750b6d4746e540c89256263a9a5f0f30472bdc149baed11b633224"
Mar 19 12:27:05.967587 master-0 kubenswrapper[29612]: I0319 12:27:05.967387 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-17c2-account-create-update-xg5kj"
Mar 19 12:27:05.969533 master-0 kubenswrapper[29612]: I0319 12:27:05.969491 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-69mx9" event={"ID":"68d511e8-9536-451b-bf43-e5660fd5123a","Type":"ContainerDied","Data":"dbc491299871137c3af87910a07d8279f683d2a7557367c06b508b4613da1d5e"}
Mar 19 12:27:05.969533 master-0 kubenswrapper[29612]: I0319 12:27:05.969517 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbc491299871137c3af87910a07d8279f683d2a7557367c06b508b4613da1d5e"
Mar 19 12:27:05.969776 master-0 kubenswrapper[29612]: I0319 12:27:05.969564 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-69mx9"
Mar 19 12:27:05.975130 master-0 kubenswrapper[29612]: I0319 12:27:05.975061 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kb7ng" event={"ID":"d7c1e578-52c8-42f8-a392-a1913a2ee93e","Type":"ContainerDied","Data":"62a048f2174455191bb2a14ff0c314dd45318ce32dfbf5ab5f58013ac4c458b9"}
Mar 19 12:27:05.975130 master-0 kubenswrapper[29612]: I0319 12:27:05.975124 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62a048f2174455191bb2a14ff0c314dd45318ce32dfbf5ab5f58013ac4c458b9"
Mar 19 12:27:05.975461 master-0 kubenswrapper[29612]: I0319 12:27:05.975211 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kb7ng"
Mar 19 12:27:06.043192 master-0 kubenswrapper[29612]: I0319 12:27:06.043120 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjpll\" (UniqueName: \"kubernetes.io/projected/12242f2c-55d1-422d-a775-5688b8dcb092-kube-api-access-hjpll\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:06.043192 master-0 kubenswrapper[29612]: I0319 12:27:06.043166 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngkgs\" (UniqueName: \"kubernetes.io/projected/68d511e8-9536-451b-bf43-e5660fd5123a-kube-api-access-ngkgs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:06.043192 master-0 kubenswrapper[29612]: I0319 12:27:06.043177 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12242f2c-55d1-422d-a775-5688b8dcb092-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:06.414765 master-0 kubenswrapper[29612]: W0319 12:27:06.414688 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddae27604_eeed_4c63_9655_46beecfe4168.slice/crio-fc9a4826a3890c741b42389fb4244278e784ca9dc9998a53fbc08431924e6ad3 WatchSource:0}: Error finding container fc9a4826a3890c741b42389fb4244278e784ca9dc9998a53fbc08431924e6ad3: Status 404 returned error can't find the container with id fc9a4826a3890c741b42389fb4244278e784ca9dc9998a53fbc08431924e6ad3
Mar 19 12:27:06.416894 master-0 kubenswrapper[29612]: I0319 12:27:06.416841 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-q8b25"]
Mar 19 12:27:06.987811 master-0 kubenswrapper[29612]: I0319 12:27:06.987742 29612 generic.go:334] "Generic (PLEG): container finished" podID="dae27604-eeed-4c63-9655-46beecfe4168" containerID="a8c719d8f4019912aca82739afdea74a1cb610df66db9772d57d93b5ae44b8a3" exitCode=0
Mar 19 12:27:06.987811 master-0 kubenswrapper[29612]: I0319 12:27:06.987809 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-q8b25" event={"ID":"dae27604-eeed-4c63-9655-46beecfe4168","Type":"ContainerDied","Data":"a8c719d8f4019912aca82739afdea74a1cb610df66db9772d57d93b5ae44b8a3"}
Mar 19 12:27:06.988417 master-0 kubenswrapper[29612]: I0319 12:27:06.987851 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-q8b25" event={"ID":"dae27604-eeed-4c63-9655-46beecfe4168","Type":"ContainerStarted","Data":"fc9a4826a3890c741b42389fb4244278e784ca9dc9998a53fbc08431924e6ad3"}
Mar 19 12:27:08.382518 master-0 kubenswrapper[29612]: I0319 12:27:08.382305 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-nmkvn"]
Mar 19 12:27:08.404982 master-0 kubenswrapper[29612]: I0319 12:27:08.401405 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-nmkvn"]
Mar 19 12:27:08.416453 master-0 kubenswrapper[29612]: I0319 12:27:08.416386 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-h45hg"]
Mar 19 12:27:08.416940 master-0 kubenswrapper[29612]: E0319 12:27:08.416910 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40581f33-2061-4f2d-b9b1-13396333b6a4" containerName="mariadb-account-create-update"
Mar 19 12:27:08.416940 master-0 kubenswrapper[29612]: I0319 12:27:08.416931 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="40581f33-2061-4f2d-b9b1-13396333b6a4" containerName="mariadb-account-create-update"
Mar 19 12:27:08.417055 master-0 kubenswrapper[29612]: E0319 12:27:08.416949 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d511e8-9536-451b-bf43-e5660fd5123a" containerName="mariadb-database-create"
Mar 19 12:27:08.417055 master-0 kubenswrapper[29612]: I0319 12:27:08.416957 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d511e8-9536-451b-bf43-e5660fd5123a" containerName="mariadb-database-create"
Mar 19 12:27:08.417055 master-0 kubenswrapper[29612]: E0319 12:27:08.416985 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12242f2c-55d1-422d-a775-5688b8dcb092" containerName="mariadb-account-create-update"
Mar 19 12:27:08.417055 master-0 kubenswrapper[29612]: I0319 12:27:08.416991 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="12242f2c-55d1-422d-a775-5688b8dcb092" containerName="mariadb-account-create-update"
Mar 19 12:27:08.417223 master-0 kubenswrapper[29612]: I0319 12:27:08.417195 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="12242f2c-55d1-422d-a775-5688b8dcb092" containerName="mariadb-account-create-update"
Mar 19 12:27:08.417267 master-0 kubenswrapper[29612]: I0319 12:27:08.417245 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="40581f33-2061-4f2d-b9b1-13396333b6a4" containerName="mariadb-account-create-update"
Mar 19 12:27:08.417267 master-0 kubenswrapper[29612]: I0319 12:27:08.417265 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d511e8-9536-451b-bf43-e5660fd5123a" containerName="mariadb-database-create"
Mar 19 12:27:08.417971 master-0 kubenswrapper[29612]: I0319 12:27:08.417940 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-h45hg"
Mar 19 12:27:08.430660 master-0 kubenswrapper[29612]: I0319 12:27:08.428177 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h45hg"]
Mar 19 12:27:08.434195 master-0 kubenswrapper[29612]: I0319 12:27:08.434138 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 19 12:27:08.446690 master-0 kubenswrapper[29612]: I0319 12:27:08.446587 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-operator-scripts\") pod \"root-account-create-update-h45hg\" (UID: \"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3\") " pod="openstack/root-account-create-update-h45hg"
Mar 19 12:27:08.447154 master-0 kubenswrapper[29612]: I0319 12:27:08.447100 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbxlf\" (UniqueName: \"kubernetes.io/projected/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-kube-api-access-fbxlf\") pod \"root-account-create-update-h45hg\" (UID: \"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3\") " pod="openstack/root-account-create-update-h45hg"
Mar 19 12:27:08.519242 master-0 kubenswrapper[29612]: I0319 12:27:08.519183 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-q8b25"
Mar 19 12:27:08.552531 master-0 kubenswrapper[29612]: I0319 12:27:08.549213 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn7kq\" (UniqueName: \"kubernetes.io/projected/dae27604-eeed-4c63-9655-46beecfe4168-kube-api-access-jn7kq\") pod \"dae27604-eeed-4c63-9655-46beecfe4168\" (UID: \"dae27604-eeed-4c63-9655-46beecfe4168\") "
Mar 19 12:27:08.553130 master-0 kubenswrapper[29612]: I0319 12:27:08.553067 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae27604-eeed-4c63-9655-46beecfe4168-kube-api-access-jn7kq" (OuterVolumeSpecName: "kube-api-access-jn7kq") pod "dae27604-eeed-4c63-9655-46beecfe4168" (UID: "dae27604-eeed-4c63-9655-46beecfe4168"). InnerVolumeSpecName "kube-api-access-jn7kq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:27:08.553180 master-0 kubenswrapper[29612]: I0319 12:27:08.553141 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbxlf\" (UniqueName: \"kubernetes.io/projected/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-kube-api-access-fbxlf\") pod \"root-account-create-update-h45hg\" (UID: \"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3\") " pod="openstack/root-account-create-update-h45hg"
Mar 19 12:27:08.553256 master-0 kubenswrapper[29612]: I0319 12:27:08.553233 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-operator-scripts\") pod \"root-account-create-update-h45hg\" (UID: \"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3\") " pod="openstack/root-account-create-update-h45hg"
Mar 19 12:27:08.554015 master-0 kubenswrapper[29612]: I0319 12:27:08.553971 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-operator-scripts\") pod \"root-account-create-update-h45hg\" (UID: \"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3\") " pod="openstack/root-account-create-update-h45hg"
Mar 19 12:27:08.554782 master-0 kubenswrapper[29612]: I0319 12:27:08.554738 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn7kq\" (UniqueName: \"kubernetes.io/projected/dae27604-eeed-4c63-9655-46beecfe4168-kube-api-access-jn7kq\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:08.572306 master-0 kubenswrapper[29612]: I0319 12:27:08.572245 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbxlf\" (UniqueName: \"kubernetes.io/projected/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-kube-api-access-fbxlf\") pod \"root-account-create-update-h45hg\" (UID: \"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3\") " pod="openstack/root-account-create-update-h45hg"
Mar 19 12:27:08.656040 master-0 kubenswrapper[29612]: I0319 12:27:08.655939 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dae27604-eeed-4c63-9655-46beecfe4168-operator-scripts\") pod \"dae27604-eeed-4c63-9655-46beecfe4168\" (UID: \"dae27604-eeed-4c63-9655-46beecfe4168\") "
Mar 19 12:27:08.656628 master-0 kubenswrapper[29612]: I0319 12:27:08.656552 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae27604-eeed-4c63-9655-46beecfe4168-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dae27604-eeed-4c63-9655-46beecfe4168" (UID: "dae27604-eeed-4c63-9655-46beecfe4168"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:27:08.656916 master-0 kubenswrapper[29612]: I0319 12:27:08.656869 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dae27604-eeed-4c63-9655-46beecfe4168-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:08.819999 master-0 kubenswrapper[29612]: I0319 12:27:08.819921 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-h45hg"
Mar 19 12:27:08.921511 master-0 kubenswrapper[29612]: I0319 12:27:08.921430 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="900c7116-e99b-4ca4-b0f3-15956fee5f34" path="/var/lib/kubelet/pods/900c7116-e99b-4ca4-b0f3-15956fee5f34/volumes"
Mar 19 12:27:09.017421 master-0 kubenswrapper[29612]: I0319 12:27:09.014900 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-q8b25" event={"ID":"dae27604-eeed-4c63-9655-46beecfe4168","Type":"ContainerDied","Data":"fc9a4826a3890c741b42389fb4244278e784ca9dc9998a53fbc08431924e6ad3"}
Mar 19 12:27:09.017421 master-0 kubenswrapper[29612]: I0319 12:27:09.014947 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc9a4826a3890c741b42389fb4244278e784ca9dc9998a53fbc08431924e6ad3"
Mar 19 12:27:09.017421 master-0 kubenswrapper[29612]: I0319 12:27:09.015006 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-q8b25"
Mar 19 12:27:09.321224 master-0 kubenswrapper[29612]: I0319 12:27:09.321058 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h45hg"]
Mar 19 12:27:10.025155 master-0 kubenswrapper[29612]: I0319 12:27:10.025098 29612 generic.go:334] "Generic (PLEG): container finished" podID="d0737ffd-e31d-44b8-a26f-6d8c0f222bd3" containerID="d0e34573d1f49c245d0f7bd53aec0a9fc6f853283035dc3404d4679fa5713f0d" exitCode=0
Mar 19 12:27:10.025730 master-0 kubenswrapper[29612]: I0319 12:27:10.025157 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h45hg" event={"ID":"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3","Type":"ContainerDied","Data":"d0e34573d1f49c245d0f7bd53aec0a9fc6f853283035dc3404d4679fa5713f0d"}
Mar 19 12:27:10.025730 master-0 kubenswrapper[29612]: I0319 12:27:10.025189 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h45hg" event={"ID":"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3","Type":"ContainerStarted","Data":"de90632974ae0d7301c28845114f80bb03948ad99419cb06cb0c6a5ed7349063"}
Mar 19 12:27:10.787571 master-0 kubenswrapper[29612]: I0319 12:27:10.787500 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7vbjj"]
Mar 19 12:27:10.788216 master-0 kubenswrapper[29612]: E0319 12:27:10.788172 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae27604-eeed-4c63-9655-46beecfe4168" containerName="mariadb-database-create"
Mar 19 12:27:10.788216 master-0 kubenswrapper[29612]: I0319 12:27:10.788204 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae27604-eeed-4c63-9655-46beecfe4168" containerName="mariadb-database-create"
Mar 19 12:27:10.788605 master-0 kubenswrapper[29612]: I0319 12:27:10.788578 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae27604-eeed-4c63-9655-46beecfe4168" containerName="mariadb-database-create"
Mar 19 12:27:10.789577 master-0 kubenswrapper[29612]: I0319 12:27:10.789526 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.791915 master-0 kubenswrapper[29612]: I0319 12:27:10.791882 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3abc0-config-data"
Mar 19 12:27:10.805174 master-0 kubenswrapper[29612]: I0319 12:27:10.805083 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-combined-ca-bundle\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.805363 master-0 kubenswrapper[29612]: I0319 12:27:10.805246 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tftrv\" (UniqueName: \"kubernetes.io/projected/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-kube-api-access-tftrv\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.805363 master-0 kubenswrapper[29612]: I0319 12:27:10.805335 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-config-data\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.805505 master-0 kubenswrapper[29612]: I0319 12:27:10.805372 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-db-sync-config-data\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.907866 master-0 kubenswrapper[29612]: I0319 12:27:10.907053 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-combined-ca-bundle\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.908307 master-0 kubenswrapper[29612]: I0319 12:27:10.908283 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tftrv\" (UniqueName: \"kubernetes.io/projected/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-kube-api-access-tftrv\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.913617 master-0 kubenswrapper[29612]: I0319 12:27:10.908494 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-config-data\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.913617 master-0 kubenswrapper[29612]: I0319 12:27:10.909119 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-db-sync-config-data\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.921707 master-0 kubenswrapper[29612]: I0319 12:27:10.921536 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-config-data\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.927159 master-0 kubenswrapper[29612]: I0319 12:27:10.922703 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-combined-ca-bundle\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.927159 master-0 kubenswrapper[29612]: I0319 12:27:10.922828 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-db-sync-config-data\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:10.949912 master-0 kubenswrapper[29612]: I0319 12:27:10.949856 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7vbjj"]
Mar 19 12:27:10.951984 master-0 kubenswrapper[29612]: I0319 12:27:10.951932 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tftrv\" (UniqueName: \"kubernetes.io/projected/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-kube-api-access-tftrv\") pod \"glance-db-sync-7vbjj\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") " pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:11.109313 master-0 kubenswrapper[29612]: I0319 12:27:11.109166 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:11.492018 master-0 kubenswrapper[29612]: I0319 12:27:11.491957 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-h45hg"
Mar 19 12:27:11.627524 master-0 kubenswrapper[29612]: I0319 12:27:11.627445 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-operator-scripts\") pod \"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3\" (UID: \"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3\") "
Mar 19 12:27:11.627761 master-0 kubenswrapper[29612]: I0319 12:27:11.627610 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbxlf\" (UniqueName: \"kubernetes.io/projected/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-kube-api-access-fbxlf\") pod \"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3\" (UID: \"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3\") "
Mar 19 12:27:11.629246 master-0 kubenswrapper[29612]: I0319 12:27:11.627898 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0737ffd-e31d-44b8-a26f-6d8c0f222bd3" (UID: "d0737ffd-e31d-44b8-a26f-6d8c0f222bd3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:27:11.629246 master-0 kubenswrapper[29612]: I0319 12:27:11.628071 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:11.631426 master-0 kubenswrapper[29612]: I0319 12:27:11.631393 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-kube-api-access-fbxlf" (OuterVolumeSpecName: "kube-api-access-fbxlf") pod "d0737ffd-e31d-44b8-a26f-6d8c0f222bd3" (UID: "d0737ffd-e31d-44b8-a26f-6d8c0f222bd3"). InnerVolumeSpecName "kube-api-access-fbxlf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:27:11.700706 master-0 kubenswrapper[29612]: I0319 12:27:11.700545 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7vbjj"]
Mar 19 12:27:11.702320 master-0 kubenswrapper[29612]: W0319 12:27:11.702268 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16a901b3_60ab_4fe0_ab8b_49d5b7c30b54.slice/crio-58733214d62e6e6a42b71277b5a7ed4a76028cc679769aed8b32530247284762 WatchSource:0}: Error finding container 58733214d62e6e6a42b71277b5a7ed4a76028cc679769aed8b32530247284762: Status 404 returned error can't find the container with id 58733214d62e6e6a42b71277b5a7ed4a76028cc679769aed8b32530247284762
Mar 19 12:27:11.731143 master-0 kubenswrapper[29612]: I0319 12:27:11.731056 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbxlf\" (UniqueName: \"kubernetes.io/projected/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3-kube-api-access-fbxlf\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:12.046321 master-0 kubenswrapper[29612]: I0319 12:27:12.046259 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-h45hg"
Mar 19 12:27:12.046612 master-0 kubenswrapper[29612]: I0319 12:27:12.046257 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h45hg" event={"ID":"d0737ffd-e31d-44b8-a26f-6d8c0f222bd3","Type":"ContainerDied","Data":"de90632974ae0d7301c28845114f80bb03948ad99419cb06cb0c6a5ed7349063"}
Mar 19 12:27:12.046612 master-0 kubenswrapper[29612]: I0319 12:27:12.046419 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de90632974ae0d7301c28845114f80bb03948ad99419cb06cb0c6a5ed7349063"
Mar 19 12:27:12.047992 master-0 kubenswrapper[29612]: I0319 12:27:12.047926 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7vbjj" event={"ID":"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54","Type":"ContainerStarted","Data":"58733214d62e6e6a42b71277b5a7ed4a76028cc679769aed8b32530247284762"}
Mar 19 12:27:12.498103 master-0 kubenswrapper[29612]: I0319 12:27:12.498037 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 19 12:27:14.393620 master-0 kubenswrapper[29612]: I0319 12:27:14.392121 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:27:14.400490 master-0 kubenswrapper[29612]: I0319 12:27:14.400388 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6e8fcd00-2db9-4463-85a5-e9a9ae7ea931-etc-swift\") pod \"swift-storage-0\" (UID: \"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931\") " pod="openstack/swift-storage-0"
Mar 19 12:27:14.678572 master-0 kubenswrapper[29612]: I0319 12:27:14.672175 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 19 12:27:15.110162 master-0 kubenswrapper[29612]: I0319 12:27:15.110035 29612 generic.go:334] "Generic (PLEG): container finished" podID="58f5d134-96c4-49cc-890c-06251f80d2dd" containerID="7388821594d5d75b072d791cd49c86d3780d75e9772ce090b235857ae264f97f" exitCode=0
Mar 19 12:27:15.110162 master-0 kubenswrapper[29612]: I0319 12:27:15.110129 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"58f5d134-96c4-49cc-890c-06251f80d2dd","Type":"ContainerDied","Data":"7388821594d5d75b072d791cd49c86d3780d75e9772ce090b235857ae264f97f"}
Mar 19 12:27:15.114293 master-0 kubenswrapper[29612]: I0319 12:27:15.114249 29612 generic.go:334] "Generic (PLEG): container finished" podID="ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f" containerID="45e8823488586fa64a3e0ac2fe5a46d442bd682661bf3d10b0616d36e53f5f4c" exitCode=0
Mar 19 12:27:15.114381 master-0 kubenswrapper[29612]: I0319 12:27:15.114300 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f","Type":"ContainerDied","Data":"45e8823488586fa64a3e0ac2fe5a46d442bd682661bf3d10b0616d36e53f5f4c"}
Mar 19 12:27:15.403916 master-0 kubenswrapper[29612]: I0319 12:27:15.403832 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 19 12:27:15.414816 master-0 kubenswrapper[29612]: W0319 12:27:15.414761 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e8fcd00_2db9_4463_85a5_e9a9ae7ea931.slice/crio-94857446ec02bb80449c6e7f15587d337d55b783cc84755b0e2f4fcfea206e3d WatchSource:0}: Error finding container 94857446ec02bb80449c6e7f15587d337d55b783cc84755b0e2f4fcfea206e3d: Status 404 returned error can't find the container with id 94857446ec02bb80449c6e7f15587d337d55b783cc84755b0e2f4fcfea206e3d
Mar 19 12:27:15.631993 master-0 kubenswrapper[29612]: I0319 12:27:15.631870 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zb6lv" podUID="a0c736a7-20fb-4ca6-9870-d5f978444dcc" containerName="ovn-controller" probeResult="failure" output=<
Mar 19 12:27:15.631993 master-0 kubenswrapper[29612]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 19 12:27:15.631993 master-0 kubenswrapper[29612]: >
Mar 19 12:27:15.685030 master-0 kubenswrapper[29612]: I0319 12:27:15.684910 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:27:15.702465 master-0 kubenswrapper[29612]: I0319 12:27:15.702411 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8gh4l"
Mar 19 12:27:16.128990 master-0 kubenswrapper[29612]: I0319 12:27:16.128870 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"58f5d134-96c4-49cc-890c-06251f80d2dd","Type":"ContainerStarted","Data":"c72190c94d2c93d876a7cde640b3381d52654a69376523acfd428013e00dd974"}
Mar 19 12:27:16.129246 master-0 kubenswrapper[29612]: I0319 12:27:16.129133 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 12:27:16.130705 master-0 kubenswrapper[29612]: I0319 12:27:16.130605 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"94857446ec02bb80449c6e7f15587d337d55b783cc84755b0e2f4fcfea206e3d"}
Mar 19 12:27:16.133581 master-0 kubenswrapper[29612]: I0319 12:27:16.133544 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ebde3b5c-2d1f-42dd-a8e7-79c545eebb7f","Type":"ContainerStarted","Data":"7d91d018015edda69b9e49aa7c1dc38137322e7051eab50fbafde3c050db019a"}
Mar 19 12:27:17.981469 master-0 kubenswrapper[29612]: I0319 12:27:17.981393 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=57.881881879 podStartE2EDuration="1m40.981370992s" podCreationTimestamp="2026-03-19 12:25:37 +0000 UTC" firstStartedPulling="2026-03-19 12:25:57.168488684 +0000 UTC m=+964.482657715" lastFinishedPulling="2026-03-19 12:26:40.267977807 +0000 UTC m=+1007.582146828" observedRunningTime="2026-03-19 12:27:17.973400476 +0000 UTC m=+1045.287569497" watchObservedRunningTime="2026-03-19 12:27:17.981370992 +0000 UTC m=+1045.295540013"
Mar 19 12:27:18.064518 master-0 kubenswrapper[29612]: I0319 12:27:18.064415 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=66.245306962 podStartE2EDuration="1m42.064389611s" podCreationTimestamp="2026-03-19 12:25:36 +0000 UTC" firstStartedPulling="2026-03-19 12:25:55.316672841 +0000 UTC m=+962.630841862" lastFinishedPulling="2026-03-19 12:26:31.13575549 +0000 UTC m=+998.449924511" observedRunningTime="2026-03-19 12:27:18.054960093 +0000 UTC m=+1045.369129134" watchObservedRunningTime="2026-03-19 12:27:18.064389611 +0000 UTC m=+1045.378558632"
Mar 19 12:27:18.257785 master-0 kubenswrapper[29612]: I0319 12:27:18.255779 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zb6lv-config-4zh6p"]
Mar 19 12:27:18.257785 master-0 kubenswrapper[29612]: E0319 12:27:18.256313 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0737ffd-e31d-44b8-a26f-6d8c0f222bd3" containerName="mariadb-account-create-update"
Mar 19 12:27:18.257785 master-0 kubenswrapper[29612]: I0319 12:27:18.256328 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0737ffd-e31d-44b8-a26f-6d8c0f222bd3" containerName="mariadb-account-create-update"
Mar 19 12:27:18.257785 master-0 kubenswrapper[29612]: I0319 12:27:18.256568 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0737ffd-e31d-44b8-a26f-6d8c0f222bd3" containerName="mariadb-account-create-update"
Mar 19 12:27:18.257785 master-0 kubenswrapper[29612]: I0319 12:27:18.257284 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.259919 master-0 kubenswrapper[29612]: I0319 12:27:18.259880 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 19 12:27:18.278161 master-0 kubenswrapper[29612]: I0319 12:27:18.277990 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zb6lv-config-4zh6p"]
Mar 19 12:27:18.316975 master-0 kubenswrapper[29612]: I0319 12:27:18.316826 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.316975 master-0 kubenswrapper[29612]: I0319 12:27:18.316904 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-scripts\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.316975 master-0 kubenswrapper[29612]: I0319 12:27:18.316954 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-additional-scripts\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.317278 master-0 kubenswrapper[29612]: I0319 12:27:18.317055 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run-ovn\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.317278 master-0 kubenswrapper[29612]: I0319 12:27:18.317116 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-log-ovn\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.317278 master-0 kubenswrapper[29612]: I0319 12:27:18.317161 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs4sz\" (UniqueName: \"kubernetes.io/projected/d34f5f68-f79d-4439-a5b5-20e97d724894-kube-api-access-hs4sz\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.419967 master-0 kubenswrapper[29612]: I0319 12:27:18.419842 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run-ovn\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.419967 master-0 kubenswrapper[29612]: I0319 12:27:18.419955 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-log-ovn\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.420533 master-0 kubenswrapper[29612]: I0319 12:27:18.420038 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs4sz\" (UniqueName: \"kubernetes.io/projected/d34f5f68-f79d-4439-a5b5-20e97d724894-kube-api-access-hs4sz\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.420533 master-0 kubenswrapper[29612]: I0319 12:27:18.420150 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.420533 master-0 kubenswrapper[29612]: I0319 12:27:18.420193 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-scripts\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.420533 master-0 kubenswrapper[29612]: I0319 12:27:18.420245 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-additional-scripts\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p"
Mar 19 12:27:18.423525 master-0 kubenswrapper[29612]: I0319 12:27:18.422660 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run-ovn\")
pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p" Mar 19 12:27:18.423525 master-0 kubenswrapper[29612]: I0319 12:27:18.422745 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-log-ovn\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p" Mar 19 12:27:18.423525 master-0 kubenswrapper[29612]: I0319 12:27:18.423101 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p" Mar 19 12:27:18.426038 master-0 kubenswrapper[29612]: I0319 12:27:18.425617 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-scripts\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p" Mar 19 12:27:18.427021 master-0 kubenswrapper[29612]: I0319 12:27:18.426984 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-additional-scripts\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p" Mar 19 12:27:18.445170 master-0 kubenswrapper[29612]: I0319 12:27:18.445123 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs4sz\" (UniqueName: 
\"kubernetes.io/projected/d34f5f68-f79d-4439-a5b5-20e97d724894-kube-api-access-hs4sz\") pod \"ovn-controller-zb6lv-config-4zh6p\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " pod="openstack/ovn-controller-zb6lv-config-4zh6p" Mar 19 12:27:18.600437 master-0 kubenswrapper[29612]: I0319 12:27:18.599997 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zb6lv-config-4zh6p" Mar 19 12:27:20.622392 master-0 kubenswrapper[29612]: I0319 12:27:20.622160 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zb6lv" podUID="a0c736a7-20fb-4ca6-9870-d5f978444dcc" containerName="ovn-controller" probeResult="failure" output=< Mar 19 12:27:20.622392 master-0 kubenswrapper[29612]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 12:27:20.622392 master-0 kubenswrapper[29612]: > Mar 19 12:27:24.300662 master-0 kubenswrapper[29612]: I0319 12:27:24.300542 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 12:27:25.639863 master-0 kubenswrapper[29612]: I0319 12:27:25.639717 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zb6lv" podUID="a0c736a7-20fb-4ca6-9870-d5f978444dcc" containerName="ovn-controller" probeResult="failure" output=< Mar 19 12:27:25.639863 master-0 kubenswrapper[29612]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 12:27:25.639863 master-0 kubenswrapper[29612]: > Mar 19 12:27:26.653947 master-0 kubenswrapper[29612]: I0319 12:27:26.653861 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zb6lv-config-4zh6p"] Mar 19 12:27:27.310659 master-0 kubenswrapper[29612]: I0319 12:27:27.310106 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7vbjj" 
event={"ID":"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54","Type":"ContainerStarted","Data":"69f7eacc49fe4d7ed953c9332c2371d4cde490d77b8ed5f5dbd695f3a028c469"} Mar 19 12:27:27.318654 master-0 kubenswrapper[29612]: I0319 12:27:27.314970 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"970ecc9146877f35292da4865d5cd92db9a15ce9e7a7a4b13e2da57788b3d487"} Mar 19 12:27:27.318654 master-0 kubenswrapper[29612]: I0319 12:27:27.315031 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"568133cc08acc5354a874a5c47a8e7309dfb75deaba6ec22a312da294c4e316c"} Mar 19 12:27:27.318654 master-0 kubenswrapper[29612]: I0319 12:27:27.315051 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"4b9118feef778520367ca9548136888c816def589913150ee7ce9f183d498421"} Mar 19 12:27:27.318654 master-0 kubenswrapper[29612]: I0319 12:27:27.315066 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"c2db77d9bf7d5fd98eba3c2d6cb1167ceca0958a03775bf2c2b70a5dbd8c990c"} Mar 19 12:27:27.322648 master-0 kubenswrapper[29612]: I0319 12:27:27.319917 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zb6lv-config-4zh6p" event={"ID":"d34f5f68-f79d-4439-a5b5-20e97d724894","Type":"ContainerStarted","Data":"ac5ff4a93d1172f5ff001524375c5ce0d6da81faf18150685844f991f1d708dc"} Mar 19 12:27:27.322648 master-0 kubenswrapper[29612]: I0319 12:27:27.319963 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zb6lv-config-4zh6p" 
event={"ID":"d34f5f68-f79d-4439-a5b5-20e97d724894","Type":"ContainerStarted","Data":"9729ffc898d60d92b85057dbc8b33e415217361651cc440041421143ef2e6b2a"} Mar 19 12:27:27.370261 master-0 kubenswrapper[29612]: I0319 12:27:27.370175 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zb6lv-config-4zh6p" podStartSLOduration=9.370152262 podStartE2EDuration="9.370152262s" podCreationTimestamp="2026-03-19 12:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:27:27.368567157 +0000 UTC m=+1054.682736168" watchObservedRunningTime="2026-03-19 12:27:27.370152262 +0000 UTC m=+1054.684321293" Mar 19 12:27:27.371727 master-0 kubenswrapper[29612]: I0319 12:27:27.371686 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7vbjj" podStartSLOduration=3.080958829 podStartE2EDuration="17.371678735s" podCreationTimestamp="2026-03-19 12:27:10 +0000 UTC" firstStartedPulling="2026-03-19 12:27:11.704932917 +0000 UTC m=+1039.019101938" lastFinishedPulling="2026-03-19 12:27:25.995652813 +0000 UTC m=+1053.309821844" observedRunningTime="2026-03-19 12:27:27.343102913 +0000 UTC m=+1054.657271934" watchObservedRunningTime="2026-03-19 12:27:27.371678735 +0000 UTC m=+1054.685847776" Mar 19 12:27:28.334176 master-0 kubenswrapper[29612]: I0319 12:27:28.334105 29612 generic.go:334] "Generic (PLEG): container finished" podID="d34f5f68-f79d-4439-a5b5-20e97d724894" containerID="ac5ff4a93d1172f5ff001524375c5ce0d6da81faf18150685844f991f1d708dc" exitCode=0 Mar 19 12:27:28.334863 master-0 kubenswrapper[29612]: I0319 12:27:28.334181 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zb6lv-config-4zh6p" event={"ID":"d34f5f68-f79d-4439-a5b5-20e97d724894","Type":"ContainerDied","Data":"ac5ff4a93d1172f5ff001524375c5ce0d6da81faf18150685844f991f1d708dc"} Mar 19 12:27:28.516384 
master-0 kubenswrapper[29612]: I0319 12:27:28.516319 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 12:27:29.848619 master-0 kubenswrapper[29612]: I0319 12:27:29.848586 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zb6lv-config-4zh6p" Mar 19 12:27:29.891295 master-0 kubenswrapper[29612]: I0319 12:27:29.891223 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-scripts\") pod \"d34f5f68-f79d-4439-a5b5-20e97d724894\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " Mar 19 12:27:29.891537 master-0 kubenswrapper[29612]: I0319 12:27:29.891470 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run\") pod \"d34f5f68-f79d-4439-a5b5-20e97d724894\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " Mar 19 12:27:29.891627 master-0 kubenswrapper[29612]: I0319 12:27:29.891597 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-log-ovn\") pod \"d34f5f68-f79d-4439-a5b5-20e97d724894\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " Mar 19 12:27:29.891759 master-0 kubenswrapper[29612]: I0319 12:27:29.891593 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run" (OuterVolumeSpecName: "var-run") pod "d34f5f68-f79d-4439-a5b5-20e97d724894" (UID: "d34f5f68-f79d-4439-a5b5-20e97d724894"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:27:29.891811 master-0 kubenswrapper[29612]: I0319 12:27:29.891642 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d34f5f68-f79d-4439-a5b5-20e97d724894" (UID: "d34f5f68-f79d-4439-a5b5-20e97d724894"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:27:29.891811 master-0 kubenswrapper[29612]: I0319 12:27:29.891770 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run-ovn\") pod \"d34f5f68-f79d-4439-a5b5-20e97d724894\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " Mar 19 12:27:29.891907 master-0 kubenswrapper[29612]: I0319 12:27:29.891842 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d34f5f68-f79d-4439-a5b5-20e97d724894" (UID: "d34f5f68-f79d-4439-a5b5-20e97d724894"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:27:29.891907 master-0 kubenswrapper[29612]: I0319 12:27:29.891882 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs4sz\" (UniqueName: \"kubernetes.io/projected/d34f5f68-f79d-4439-a5b5-20e97d724894-kube-api-access-hs4sz\") pod \"d34f5f68-f79d-4439-a5b5-20e97d724894\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " Mar 19 12:27:29.891996 master-0 kubenswrapper[29612]: I0319 12:27:29.891925 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-additional-scripts\") pod \"d34f5f68-f79d-4439-a5b5-20e97d724894\" (UID: \"d34f5f68-f79d-4439-a5b5-20e97d724894\") " Mar 19 12:27:29.892531 master-0 kubenswrapper[29612]: I0319 12:27:29.892482 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-scripts" (OuterVolumeSpecName: "scripts") pod "d34f5f68-f79d-4439-a5b5-20e97d724894" (UID: "d34f5f68-f79d-4439-a5b5-20e97d724894"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:29.892941 master-0 kubenswrapper[29612]: I0319 12:27:29.892904 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d34f5f68-f79d-4439-a5b5-20e97d724894" (UID: "d34f5f68-f79d-4439-a5b5-20e97d724894"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:29.893238 master-0 kubenswrapper[29612]: I0319 12:27:29.893208 29612 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:29.893315 master-0 kubenswrapper[29612]: I0319 12:27:29.893237 29612 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:29.893315 master-0 kubenswrapper[29612]: I0319 12:27:29.893256 29612 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d34f5f68-f79d-4439-a5b5-20e97d724894-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:29.893315 master-0 kubenswrapper[29612]: I0319 12:27:29.893278 29612 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-additional-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:29.893315 master-0 kubenswrapper[29612]: I0319 12:27:29.893294 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d34f5f68-f79d-4439-a5b5-20e97d724894-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:29.896579 master-0 kubenswrapper[29612]: I0319 12:27:29.896535 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34f5f68-f79d-4439-a5b5-20e97d724894-kube-api-access-hs4sz" (OuterVolumeSpecName: "kube-api-access-hs4sz") pod "d34f5f68-f79d-4439-a5b5-20e97d724894" (UID: "d34f5f68-f79d-4439-a5b5-20e97d724894"). InnerVolumeSpecName "kube-api-access-hs4sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:27:30.014311 master-0 kubenswrapper[29612]: I0319 12:27:30.011456 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs4sz\" (UniqueName: \"kubernetes.io/projected/d34f5f68-f79d-4439-a5b5-20e97d724894-kube-api-access-hs4sz\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:30.357464 master-0 kubenswrapper[29612]: I0319 12:27:30.357399 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zb6lv-config-4zh6p" event={"ID":"d34f5f68-f79d-4439-a5b5-20e97d724894","Type":"ContainerDied","Data":"9729ffc898d60d92b85057dbc8b33e415217361651cc440041421143ef2e6b2a"} Mar 19 12:27:30.357464 master-0 kubenswrapper[29612]: I0319 12:27:30.357462 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9729ffc898d60d92b85057dbc8b33e415217361651cc440041421143ef2e6b2a" Mar 19 12:27:30.357727 master-0 kubenswrapper[29612]: I0319 12:27:30.357532 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zb6lv-config-4zh6p" Mar 19 12:27:30.392725 master-0 kubenswrapper[29612]: I0319 12:27:30.392506 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"b17045534317f1dbfa3cc5ef6e78b4e8ea7395c970fd650b2901a815b7f706ae"} Mar 19 12:27:30.392725 master-0 kubenswrapper[29612]: I0319 12:27:30.392583 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"9064f289399832b8146b306d168296697eb75ae98200d76b19d7d624fdf15b6e"} Mar 19 12:27:30.392725 master-0 kubenswrapper[29612]: I0319 12:27:30.392604 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"b012f4eb7fb652e876c321da7264239157c2556c8dbe5a1e7886e294370324e6"} Mar 19 12:27:30.392725 master-0 kubenswrapper[29612]: I0319 12:27:30.392619 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"6cd2aa973e79a0d9547429772af88d332dd0e3a52d6f626b03f619bb6e6619d9"} Mar 19 12:27:30.631174 master-0 kubenswrapper[29612]: I0319 12:27:30.631110 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zb6lv" Mar 19 12:27:31.032702 master-0 kubenswrapper[29612]: I0319 12:27:31.032615 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zb6lv-config-4zh6p"] Mar 19 12:27:31.044117 master-0 kubenswrapper[29612]: I0319 12:27:31.044057 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zb6lv-config-4zh6p"] Mar 19 12:27:32.955662 master-0 kubenswrapper[29612]: I0319 12:27:32.954918 29612 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="d34f5f68-f79d-4439-a5b5-20e97d724894" path="/var/lib/kubelet/pods/d34f5f68-f79d-4439-a5b5-20e97d724894/volumes" Mar 19 12:27:34.304122 master-0 kubenswrapper[29612]: I0319 12:27:34.303997 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 12:27:34.481652 master-0 kubenswrapper[29612]: I0319 12:27:34.481066 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"9a76cb71a680f9855e533e1049e3075f5e57b2d509091d349e9b400f0697437b"} Mar 19 12:27:34.481652 master-0 kubenswrapper[29612]: I0319 12:27:34.481152 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"8272a0ed291a30bc2817b29ef0e221cc16d5edaab4b0ed8fab056525269fdd23"} Mar 19 12:27:34.481652 master-0 kubenswrapper[29612]: I0319 12:27:34.481165 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"580493e23e295ef5f68c9bba5bf43027eccd130a043992156d1fdfd5e940a049"} Mar 19 12:27:34.481652 master-0 kubenswrapper[29612]: I0319 12:27:34.481176 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"737a2eabfc315d41d6009817382de516628699d15968bd4c3d5c3ea57c5ade9e"} Mar 19 12:27:34.481652 master-0 kubenswrapper[29612]: I0319 12:27:34.481187 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"63de88ba6cb23e150c55e5367cff2f7ac981e0da35c43f75f5f702b08d9140b8"} Mar 19 12:27:34.933742 master-0 kubenswrapper[29612]: I0319 12:27:34.933669 
29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-7sxqt"] Mar 19 12:27:34.934205 master-0 kubenswrapper[29612]: E0319 12:27:34.934174 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34f5f68-f79d-4439-a5b5-20e97d724894" containerName="ovn-config" Mar 19 12:27:34.934205 master-0 kubenswrapper[29612]: I0319 12:27:34.934204 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34f5f68-f79d-4439-a5b5-20e97d724894" containerName="ovn-config" Mar 19 12:27:34.934544 master-0 kubenswrapper[29612]: I0319 12:27:34.934520 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34f5f68-f79d-4439-a5b5-20e97d724894" containerName="ovn-config" Mar 19 12:27:34.935549 master-0 kubenswrapper[29612]: I0319 12:27:34.935507 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7sxqt" Mar 19 12:27:34.966698 master-0 kubenswrapper[29612]: I0319 12:27:34.966585 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7sxqt"] Mar 19 12:27:35.039159 master-0 kubenswrapper[29612]: I0319 12:27:35.039101 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppmfz\" (UniqueName: \"kubernetes.io/projected/08724d0d-69ad-4a95-98ae-0d0707bbaee8-kube-api-access-ppmfz\") pod \"cinder-db-create-7sxqt\" (UID: \"08724d0d-69ad-4a95-98ae-0d0707bbaee8\") " pod="openstack/cinder-db-create-7sxqt" Mar 19 12:27:35.039412 master-0 kubenswrapper[29612]: I0319 12:27:35.039249 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08724d0d-69ad-4a95-98ae-0d0707bbaee8-operator-scripts\") pod \"cinder-db-create-7sxqt\" (UID: \"08724d0d-69ad-4a95-98ae-0d0707bbaee8\") " pod="openstack/cinder-db-create-7sxqt" Mar 19 12:27:35.124529 master-0 kubenswrapper[29612]: I0319 12:27:35.124466 
29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a786-account-create-update-q6mp4"] Mar 19 12:27:35.126360 master-0 kubenswrapper[29612]: I0319 12:27:35.126307 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a786-account-create-update-q6mp4" Mar 19 12:27:35.129621 master-0 kubenswrapper[29612]: I0319 12:27:35.129521 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 19 12:27:35.141413 master-0 kubenswrapper[29612]: I0319 12:27:35.141052 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08724d0d-69ad-4a95-98ae-0d0707bbaee8-operator-scripts\") pod \"cinder-db-create-7sxqt\" (UID: \"08724d0d-69ad-4a95-98ae-0d0707bbaee8\") " pod="openstack/cinder-db-create-7sxqt" Mar 19 12:27:35.141413 master-0 kubenswrapper[29612]: I0319 12:27:35.141207 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppmfz\" (UniqueName: \"kubernetes.io/projected/08724d0d-69ad-4a95-98ae-0d0707bbaee8-kube-api-access-ppmfz\") pod \"cinder-db-create-7sxqt\" (UID: \"08724d0d-69ad-4a95-98ae-0d0707bbaee8\") " pod="openstack/cinder-db-create-7sxqt" Mar 19 12:27:35.142410 master-0 kubenswrapper[29612]: I0319 12:27:35.142168 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08724d0d-69ad-4a95-98ae-0d0707bbaee8-operator-scripts\") pod \"cinder-db-create-7sxqt\" (UID: \"08724d0d-69ad-4a95-98ae-0d0707bbaee8\") " pod="openstack/cinder-db-create-7sxqt" Mar 19 12:27:35.162747 master-0 kubenswrapper[29612]: I0319 12:27:35.162680 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a786-account-create-update-q6mp4"] Mar 19 12:27:35.186916 master-0 kubenswrapper[29612]: I0319 12:27:35.173918 29612 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ppmfz\" (UniqueName: \"kubernetes.io/projected/08724d0d-69ad-4a95-98ae-0d0707bbaee8-kube-api-access-ppmfz\") pod \"cinder-db-create-7sxqt\" (UID: \"08724d0d-69ad-4a95-98ae-0d0707bbaee8\") " pod="openstack/cinder-db-create-7sxqt" Mar 19 12:27:35.243166 master-0 kubenswrapper[29612]: I0319 12:27:35.242977 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqzdw\" (UniqueName: \"kubernetes.io/projected/0ccd0fb5-3e98-4209-ac70-c238a7610668-kube-api-access-fqzdw\") pod \"cinder-a786-account-create-update-q6mp4\" (UID: \"0ccd0fb5-3e98-4209-ac70-c238a7610668\") " pod="openstack/cinder-a786-account-create-update-q6mp4" Mar 19 12:27:35.243718 master-0 kubenswrapper[29612]: I0319 12:27:35.243698 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ccd0fb5-3e98-4209-ac70-c238a7610668-operator-scripts\") pod \"cinder-a786-account-create-update-q6mp4\" (UID: \"0ccd0fb5-3e98-4209-ac70-c238a7610668\") " pod="openstack/cinder-a786-account-create-update-q6mp4" Mar 19 12:27:35.264699 master-0 kubenswrapper[29612]: I0319 12:27:35.263072 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-7sxqt"
Mar 19 12:27:35.347761 master-0 kubenswrapper[29612]: I0319 12:27:35.347691 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqzdw\" (UniqueName: \"kubernetes.io/projected/0ccd0fb5-3e98-4209-ac70-c238a7610668-kube-api-access-fqzdw\") pod \"cinder-a786-account-create-update-q6mp4\" (UID: \"0ccd0fb5-3e98-4209-ac70-c238a7610668\") " pod="openstack/cinder-a786-account-create-update-q6mp4"
Mar 19 12:27:35.348304 master-0 kubenswrapper[29612]: I0319 12:27:35.347849 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ccd0fb5-3e98-4209-ac70-c238a7610668-operator-scripts\") pod \"cinder-a786-account-create-update-q6mp4\" (UID: \"0ccd0fb5-3e98-4209-ac70-c238a7610668\") " pod="openstack/cinder-a786-account-create-update-q6mp4"
Mar 19 12:27:35.349698 master-0 kubenswrapper[29612]: I0319 12:27:35.349400 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ccd0fb5-3e98-4209-ac70-c238a7610668-operator-scripts\") pod \"cinder-a786-account-create-update-q6mp4\" (UID: \"0ccd0fb5-3e98-4209-ac70-c238a7610668\") " pod="openstack/cinder-a786-account-create-update-q6mp4"
Mar 19 12:27:35.375982 master-0 kubenswrapper[29612]: I0319 12:27:35.375909 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-vjfrd"]
Mar 19 12:27:35.381691 master-0 kubenswrapper[29612]: I0319 12:27:35.381650 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqzdw\" (UniqueName: \"kubernetes.io/projected/0ccd0fb5-3e98-4209-ac70-c238a7610668-kube-api-access-fqzdw\") pod \"cinder-a786-account-create-update-q6mp4\" (UID: \"0ccd0fb5-3e98-4209-ac70-c238a7610668\") " pod="openstack/cinder-a786-account-create-update-q6mp4"
Mar 19 12:27:35.382314 master-0 kubenswrapper[29612]: I0319 12:27:35.382281 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vjfrd"
Mar 19 12:27:35.417157 master-0 kubenswrapper[29612]: I0319 12:27:35.417103 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vjfrd"]
Mar 19 12:27:35.447063 master-0 kubenswrapper[29612]: I0319 12:27:35.447010 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a786-account-create-update-q6mp4"
Mar 19 12:27:35.451437 master-0 kubenswrapper[29612]: I0319 12:27:35.450152 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffe770bd-975c-429f-9998-a4af416c5c1d-operator-scripts\") pod \"neutron-db-create-vjfrd\" (UID: \"ffe770bd-975c-429f-9998-a4af416c5c1d\") " pod="openstack/neutron-db-create-vjfrd"
Mar 19 12:27:35.451437 master-0 kubenswrapper[29612]: I0319 12:27:35.450306 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbqs9\" (UniqueName: \"kubernetes.io/projected/ffe770bd-975c-429f-9998-a4af416c5c1d-kube-api-access-gbqs9\") pod \"neutron-db-create-vjfrd\" (UID: \"ffe770bd-975c-429f-9998-a4af416c5c1d\") " pod="openstack/neutron-db-create-vjfrd"
Mar 19 12:27:35.523436 master-0 kubenswrapper[29612]: I0319 12:27:35.523090 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"8def9ac210acaa5132db72d5119e52c2156b3c38bdea48b42a7f650ca49d704f"}
Mar 19 12:27:35.523436 master-0 kubenswrapper[29612]: I0319 12:27:35.523150 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6e8fcd00-2db9-4463-85a5-e9a9ae7ea931","Type":"ContainerStarted","Data":"c211b7c640a16bdcb04b3b5741b82231dc2e69d5f6e32210af824eaf2169202e"}
Mar 19 12:27:35.552584 master-0 kubenswrapper[29612]: I0319 12:27:35.552544 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbqs9\" (UniqueName: \"kubernetes.io/projected/ffe770bd-975c-429f-9998-a4af416c5c1d-kube-api-access-gbqs9\") pod \"neutron-db-create-vjfrd\" (UID: \"ffe770bd-975c-429f-9998-a4af416c5c1d\") " pod="openstack/neutron-db-create-vjfrd"
Mar 19 12:27:35.552881 master-0 kubenswrapper[29612]: I0319 12:27:35.552864 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffe770bd-975c-429f-9998-a4af416c5c1d-operator-scripts\") pod \"neutron-db-create-vjfrd\" (UID: \"ffe770bd-975c-429f-9998-a4af416c5c1d\") " pod="openstack/neutron-db-create-vjfrd"
Mar 19 12:27:35.553686 master-0 kubenswrapper[29612]: I0319 12:27:35.553669 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffe770bd-975c-429f-9998-a4af416c5c1d-operator-scripts\") pod \"neutron-db-create-vjfrd\" (UID: \"ffe770bd-975c-429f-9998-a4af416c5c1d\") " pod="openstack/neutron-db-create-vjfrd"
Mar 19 12:27:35.704321 master-0 kubenswrapper[29612]: I0319 12:27:35.704214 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbqs9\" (UniqueName: \"kubernetes.io/projected/ffe770bd-975c-429f-9998-a4af416c5c1d-kube-api-access-gbqs9\") pod \"neutron-db-create-vjfrd\" (UID: \"ffe770bd-975c-429f-9998-a4af416c5c1d\") " pod="openstack/neutron-db-create-vjfrd"
Mar 19 12:27:35.742606 master-0 kubenswrapper[29612]: I0319 12:27:35.742334 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9127-account-create-update-b86lg"]
Mar 19 12:27:35.749801 master-0 kubenswrapper[29612]: I0319 12:27:35.747371 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9127-account-create-update-b86lg"
Mar 19 12:27:35.750128 master-0 kubenswrapper[29612]: I0319 12:27:35.749988 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 19 12:27:35.781998 master-0 kubenswrapper[29612]: I0319 12:27:35.781911 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9127-account-create-update-b86lg"]
Mar 19 12:27:35.807702 master-0 kubenswrapper[29612]: I0319 12:27:35.807611 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.986745615 podStartE2EDuration="55.807587244s" podCreationTimestamp="2026-03-19 12:26:40 +0000 UTC" firstStartedPulling="2026-03-19 12:27:15.418298739 +0000 UTC m=+1042.732467780" lastFinishedPulling="2026-03-19 12:27:33.239140388 +0000 UTC m=+1060.553309409" observedRunningTime="2026-03-19 12:27:35.724238866 +0000 UTC m=+1063.038407887" watchObservedRunningTime="2026-03-19 12:27:35.807587244 +0000 UTC m=+1063.121756265"
Mar 19 12:27:35.814835 master-0 kubenswrapper[29612]: I0319 12:27:35.814794 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vjfrd"
Mar 19 12:27:35.865672 master-0 kubenswrapper[29612]: I0319 12:27:35.864762 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c524a81-fbea-432b-856d-1688012185e2-operator-scripts\") pod \"neutron-9127-account-create-update-b86lg\" (UID: \"6c524a81-fbea-432b-856d-1688012185e2\") " pod="openstack/neutron-9127-account-create-update-b86lg"
Mar 19 12:27:35.871893 master-0 kubenswrapper[29612]: I0319 12:27:35.870139 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdm5h\" (UniqueName: \"kubernetes.io/projected/6c524a81-fbea-432b-856d-1688012185e2-kube-api-access-vdm5h\") pod \"neutron-9127-account-create-update-b86lg\" (UID: \"6c524a81-fbea-432b-856d-1688012185e2\") " pod="openstack/neutron-9127-account-create-update-b86lg"
Mar 19 12:27:35.875670 master-0 kubenswrapper[29612]: I0319 12:27:35.875280 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7sxqt"]
Mar 19 12:27:35.904910 master-0 kubenswrapper[29612]: I0319 12:27:35.902870 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-26vqt"]
Mar 19 12:27:35.904910 master-0 kubenswrapper[29612]: I0319 12:27:35.904754 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:35.907273 master-0 kubenswrapper[29612]: I0319 12:27:35.906840 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 19 12:27:35.907273 master-0 kubenswrapper[29612]: I0319 12:27:35.906846 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 19 12:27:35.925656 master-0 kubenswrapper[29612]: I0319 12:27:35.918980 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 19 12:27:35.938944 master-0 kubenswrapper[29612]: I0319 12:27:35.938481 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-26vqt"]
Mar 19 12:27:35.982570 master-0 kubenswrapper[29612]: I0319 12:27:35.982493 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdm5h\" (UniqueName: \"kubernetes.io/projected/6c524a81-fbea-432b-856d-1688012185e2-kube-api-access-vdm5h\") pod \"neutron-9127-account-create-update-b86lg\" (UID: \"6c524a81-fbea-432b-856d-1688012185e2\") " pod="openstack/neutron-9127-account-create-update-b86lg"
Mar 19 12:27:35.982789 master-0 kubenswrapper[29612]: I0319 12:27:35.982587 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-combined-ca-bundle\") pod \"keystone-db-sync-26vqt\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:35.982789 master-0 kubenswrapper[29612]: I0319 12:27:35.982757 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-config-data\") pod \"keystone-db-sync-26vqt\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:35.982866 master-0 kubenswrapper[29612]: I0319 12:27:35.982833 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt89q\" (UniqueName: \"kubernetes.io/projected/2ef05b5e-8998-4c2c-a732-84cc8345df7c-kube-api-access-xt89q\") pod \"keystone-db-sync-26vqt\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:35.982986 master-0 kubenswrapper[29612]: I0319 12:27:35.982952 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c524a81-fbea-432b-856d-1688012185e2-operator-scripts\") pod \"neutron-9127-account-create-update-b86lg\" (UID: \"6c524a81-fbea-432b-856d-1688012185e2\") " pod="openstack/neutron-9127-account-create-update-b86lg"
Mar 19 12:27:35.986245 master-0 kubenswrapper[29612]: I0319 12:27:35.986213 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c524a81-fbea-432b-856d-1688012185e2-operator-scripts\") pod \"neutron-9127-account-create-update-b86lg\" (UID: \"6c524a81-fbea-432b-856d-1688012185e2\") " pod="openstack/neutron-9127-account-create-update-b86lg"
Mar 19 12:27:36.031743 master-0 kubenswrapper[29612]: I0319 12:27:36.031365 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdm5h\" (UniqueName: \"kubernetes.io/projected/6c524a81-fbea-432b-856d-1688012185e2-kube-api-access-vdm5h\") pod \"neutron-9127-account-create-update-b86lg\" (UID: \"6c524a81-fbea-432b-856d-1688012185e2\") " pod="openstack/neutron-9127-account-create-update-b86lg"
Mar 19 12:27:36.088728 master-0 kubenswrapper[29612]: I0319 12:27:36.085711 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9127-account-create-update-b86lg"
Mar 19 12:27:36.088728 master-0 kubenswrapper[29612]: I0319 12:27:36.086884 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-config-data\") pod \"keystone-db-sync-26vqt\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:36.088728 master-0 kubenswrapper[29612]: I0319 12:27:36.086951 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt89q\" (UniqueName: \"kubernetes.io/projected/2ef05b5e-8998-4c2c-a732-84cc8345df7c-kube-api-access-xt89q\") pod \"keystone-db-sync-26vqt\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:36.088728 master-0 kubenswrapper[29612]: I0319 12:27:36.087103 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-combined-ca-bundle\") pod \"keystone-db-sync-26vqt\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:36.092710 master-0 kubenswrapper[29612]: I0319 12:27:36.090892 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-config-data\") pod \"keystone-db-sync-26vqt\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:36.092710 master-0 kubenswrapper[29612]: I0319 12:27:36.091434 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-combined-ca-bundle\") pod \"keystone-db-sync-26vqt\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:36.211059 master-0 kubenswrapper[29612]: I0319 12:27:36.210962 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt89q\" (UniqueName: \"kubernetes.io/projected/2ef05b5e-8998-4c2c-a732-84cc8345df7c-kube-api-access-xt89q\") pod \"keystone-db-sync-26vqt\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:36.215288 master-0 kubenswrapper[29612]: W0319 12:27:36.215229 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ccd0fb5_3e98_4209_ac70_c238a7610668.slice/crio-01dc9a4a48cd85c0a8591bccd5eb9e82d203d379e1852ab669d289afe7ba42b5 WatchSource:0}: Error finding container 01dc9a4a48cd85c0a8591bccd5eb9e82d203d379e1852ab669d289afe7ba42b5: Status 404 returned error can't find the container with id 01dc9a4a48cd85c0a8591bccd5eb9e82d203d379e1852ab669d289afe7ba42b5
Mar 19 12:27:36.238782 master-0 kubenswrapper[29612]: I0319 12:27:36.235387 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a786-account-create-update-q6mp4"]
Mar 19 12:27:36.245902 master-0 kubenswrapper[29612]: I0319 12:27:36.241127 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-26vqt"
Mar 19 12:27:36.461596 master-0 kubenswrapper[29612]: I0319 12:27:36.461541 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vjfrd"]
Mar 19 12:27:36.536339 master-0 kubenswrapper[29612]: I0319 12:27:36.536104 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7sxqt" event={"ID":"08724d0d-69ad-4a95-98ae-0d0707bbaee8","Type":"ContainerStarted","Data":"1a59905f71e35e381b7ee2ec16dc4ca9a2e2436f53d578e17009cd355daadedd"}
Mar 19 12:27:36.536339 master-0 kubenswrapper[29612]: I0319 12:27:36.536153 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7sxqt" event={"ID":"08724d0d-69ad-4a95-98ae-0d0707bbaee8","Type":"ContainerStarted","Data":"af7e618ac4bc1fe5ec2e4a79d56b07f4faa4e844f9e43a4633a0e423a9426367"}
Mar 19 12:27:36.539985 master-0 kubenswrapper[29612]: I0319 12:27:36.539890 29612 generic.go:334] "Generic (PLEG): container finished" podID="16a901b3-60ab-4fe0-ab8b-49d5b7c30b54" containerID="69f7eacc49fe4d7ed953c9332c2371d4cde490d77b8ed5f5dbd695f3a028c469" exitCode=0
Mar 19 12:27:36.539985 master-0 kubenswrapper[29612]: I0319 12:27:36.539952 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7vbjj" event={"ID":"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54","Type":"ContainerDied","Data":"69f7eacc49fe4d7ed953c9332c2371d4cde490d77b8ed5f5dbd695f3a028c469"}
Mar 19 12:27:36.541363 master-0 kubenswrapper[29612]: I0319 12:27:36.541332 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vjfrd" event={"ID":"ffe770bd-975c-429f-9998-a4af416c5c1d","Type":"ContainerStarted","Data":"51b0833c74071b3b14f4eb2658e9419eaf32fc331abc33d547741936b3d12077"}
Mar 19 12:27:36.544876 master-0 kubenswrapper[29612]: I0319 12:27:36.544841 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a786-account-create-update-q6mp4" event={"ID":"0ccd0fb5-3e98-4209-ac70-c238a7610668","Type":"ContainerStarted","Data":"47076e3f74fbbb246d96f5a88e4c23d83b00c7bcf966d21c99a1a6b54763360f"}
Mar 19 12:27:36.544876 master-0 kubenswrapper[29612]: I0319 12:27:36.544877 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a786-account-create-update-q6mp4" event={"ID":"0ccd0fb5-3e98-4209-ac70-c238a7610668","Type":"ContainerStarted","Data":"01dc9a4a48cd85c0a8591bccd5eb9e82d203d379e1852ab669d289afe7ba42b5"}
Mar 19 12:27:36.632712 master-0 kubenswrapper[29612]: I0319 12:27:36.622952 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-7sxqt" podStartSLOduration=2.622932074 podStartE2EDuration="2.622932074s" podCreationTimestamp="2026-03-19 12:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:27:36.578077239 +0000 UTC m=+1063.892246250" watchObservedRunningTime="2026-03-19 12:27:36.622932074 +0000 UTC m=+1063.937101095"
Mar 19 12:27:36.632712 master-0 kubenswrapper[29612]: I0319 12:27:36.626141 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9127-account-create-update-b86lg"]
Mar 19 12:27:36.751100 master-0 kubenswrapper[29612]: I0319 12:27:36.749115 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-857c69d577-2vlg7"]
Mar 19 12:27:36.751505 master-0 kubenswrapper[29612]: I0319 12:27:36.751440 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.760192 master-0 kubenswrapper[29612]: I0319 12:27:36.760128 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 19 12:27:36.769021 master-0 kubenswrapper[29612]: I0319 12:27:36.767323 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-857c69d577-2vlg7"]
Mar 19 12:27:36.810312 master-0 kubenswrapper[29612]: I0319 12:27:36.810004 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-26vqt"]
Mar 19 12:27:36.831548 master-0 kubenswrapper[29612]: I0319 12:27:36.831197 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-swift-storage-0\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.831548 master-0 kubenswrapper[29612]: I0319 12:27:36.831295 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-config\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.831834 master-0 kubenswrapper[29612]: I0319 12:27:36.831565 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjkl4\" (UniqueName: \"kubernetes.io/projected/99613eeb-1aae-43ea-a24a-6b800e4fed37-kube-api-access-kjkl4\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.831834 master-0 kubenswrapper[29612]: I0319 12:27:36.831660 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-svc\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.831834 master-0 kubenswrapper[29612]: I0319 12:27:36.831700 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-nb\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.831834 master-0 kubenswrapper[29612]: I0319 12:27:36.831792 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-sb\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.933457 master-0 kubenswrapper[29612]: I0319 12:27:36.933410 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjkl4\" (UniqueName: \"kubernetes.io/projected/99613eeb-1aae-43ea-a24a-6b800e4fed37-kube-api-access-kjkl4\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.933579 master-0 kubenswrapper[29612]: I0319 12:27:36.933478 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-svc\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.933579 master-0 kubenswrapper[29612]: I0319 12:27:36.933503 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-nb\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.933813 master-0 kubenswrapper[29612]: I0319 12:27:36.933775 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-sb\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.935282 master-0 kubenswrapper[29612]: I0319 12:27:36.933996 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-swift-storage-0\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.935282 master-0 kubenswrapper[29612]: I0319 12:27:36.934039 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-config\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.935282 master-0 kubenswrapper[29612]: I0319 12:27:36.934844 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-svc\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.935282 master-0 kubenswrapper[29612]: I0319 12:27:36.934879 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-config\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.935282 master-0 kubenswrapper[29612]: I0319 12:27:36.934933 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-swift-storage-0\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.935282 master-0 kubenswrapper[29612]: I0319 12:27:36.935243 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-sb\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.935524 master-0 kubenswrapper[29612]: I0319 12:27:36.935358 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-nb\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:36.957558 master-0 kubenswrapper[29612]: I0319 12:27:36.957015 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjkl4\" (UniqueName: \"kubernetes.io/projected/99613eeb-1aae-43ea-a24a-6b800e4fed37-kube-api-access-kjkl4\") pod \"dnsmasq-dns-857c69d577-2vlg7\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:37.111704 master-0 kubenswrapper[29612]: I0319 12:27:37.111628 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-857c69d577-2vlg7"
Mar 19 12:27:37.564592 master-0 kubenswrapper[29612]: I0319 12:27:37.564541 29612 generic.go:334] "Generic (PLEG): container finished" podID="0ccd0fb5-3e98-4209-ac70-c238a7610668" containerID="47076e3f74fbbb246d96f5a88e4c23d83b00c7bcf966d21c99a1a6b54763360f" exitCode=0
Mar 19 12:27:37.564592 master-0 kubenswrapper[29612]: I0319 12:27:37.564610 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a786-account-create-update-q6mp4" event={"ID":"0ccd0fb5-3e98-4209-ac70-c238a7610668","Type":"ContainerDied","Data":"47076e3f74fbbb246d96f5a88e4c23d83b00c7bcf966d21c99a1a6b54763360f"}
Mar 19 12:27:37.574797 master-0 kubenswrapper[29612]: I0319 12:27:37.568312 29612 generic.go:334] "Generic (PLEG): container finished" podID="08724d0d-69ad-4a95-98ae-0d0707bbaee8" containerID="1a59905f71e35e381b7ee2ec16dc4ca9a2e2436f53d578e17009cd355daadedd" exitCode=0
Mar 19 12:27:37.574797 master-0 kubenswrapper[29612]: I0319 12:27:37.568355 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7sxqt" event={"ID":"08724d0d-69ad-4a95-98ae-0d0707bbaee8","Type":"ContainerDied","Data":"1a59905f71e35e381b7ee2ec16dc4ca9a2e2436f53d578e17009cd355daadedd"}
Mar 19 12:27:37.574797 master-0 kubenswrapper[29612]: I0319 12:27:37.570355 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-26vqt" event={"ID":"2ef05b5e-8998-4c2c-a732-84cc8345df7c","Type":"ContainerStarted","Data":"5bbeb91bb5251a83cf2a08a1c720b8fad7de4d1b7ee7e949c3f0fe94fd1f3755"}
Mar 19 12:27:37.574797 master-0 kubenswrapper[29612]: I0319 12:27:37.570869 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-857c69d577-2vlg7"]
Mar 19 12:27:37.574797 master-0 kubenswrapper[29612]: I0319 12:27:37.573927 29612 generic.go:334] "Generic (PLEG): container finished" podID="6c524a81-fbea-432b-856d-1688012185e2" containerID="1b56718cae6a03d93e8a0de7b6c68f76a0efae2d93240c2ce9671a2bc917170f" exitCode=0
Mar 19 12:27:37.574797 master-0 kubenswrapper[29612]: I0319 12:27:37.573990 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9127-account-create-update-b86lg" event={"ID":"6c524a81-fbea-432b-856d-1688012185e2","Type":"ContainerDied","Data":"1b56718cae6a03d93e8a0de7b6c68f76a0efae2d93240c2ce9671a2bc917170f"}
Mar 19 12:27:37.574797 master-0 kubenswrapper[29612]: I0319 12:27:37.574015 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9127-account-create-update-b86lg" event={"ID":"6c524a81-fbea-432b-856d-1688012185e2","Type":"ContainerStarted","Data":"1678371d521e77effa2cba41846b4492327816734f6965ec6a564cd937c6c091"}
Mar 19 12:27:37.594344 master-0 kubenswrapper[29612]: I0319 12:27:37.594172 29612 generic.go:334] "Generic (PLEG): container finished" podID="ffe770bd-975c-429f-9998-a4af416c5c1d" containerID="dfec1fd2f12a1aa341116cba503de02b1017d97d721b665cfa4408fdab14caa6" exitCode=0
Mar 19 12:27:37.594344 master-0 kubenswrapper[29612]: I0319 12:27:37.594253 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vjfrd" event={"ID":"ffe770bd-975c-429f-9998-a4af416c5c1d","Type":"ContainerDied","Data":"dfec1fd2f12a1aa341116cba503de02b1017d97d721b665cfa4408fdab14caa6"}
Mar 19 12:27:38.136717 master-0 kubenswrapper[29612]: I0319 12:27:38.136666 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:38.287402 master-0 kubenswrapper[29612]: I0319 12:27:38.287348 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-config-data\") pod \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") "
Mar 19 12:27:38.287402 master-0 kubenswrapper[29612]: I0319 12:27:38.287415 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tftrv\" (UniqueName: \"kubernetes.io/projected/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-kube-api-access-tftrv\") pod \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") "
Mar 19 12:27:38.287771 master-0 kubenswrapper[29612]: I0319 12:27:38.287532 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-db-sync-config-data\") pod \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") "
Mar 19 12:27:38.287771 master-0 kubenswrapper[29612]: I0319 12:27:38.287647 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-combined-ca-bundle\") pod \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\" (UID: \"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54\") "
Mar 19 12:27:38.291517 master-0 kubenswrapper[29612]: I0319 12:27:38.291304 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-kube-api-access-tftrv" (OuterVolumeSpecName: "kube-api-access-tftrv") pod "16a901b3-60ab-4fe0-ab8b-49d5b7c30b54" (UID: "16a901b3-60ab-4fe0-ab8b-49d5b7c30b54"). InnerVolumeSpecName "kube-api-access-tftrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:27:38.296498 master-0 kubenswrapper[29612]: I0319 12:27:38.296445 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "16a901b3-60ab-4fe0-ab8b-49d5b7c30b54" (UID: "16a901b3-60ab-4fe0-ab8b-49d5b7c30b54"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:27:38.318889 master-0 kubenswrapper[29612]: I0319 12:27:38.318809 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16a901b3-60ab-4fe0-ab8b-49d5b7c30b54" (UID: "16a901b3-60ab-4fe0-ab8b-49d5b7c30b54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:27:38.369230 master-0 kubenswrapper[29612]: I0319 12:27:38.369094 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-config-data" (OuterVolumeSpecName: "config-data") pod "16a901b3-60ab-4fe0-ab8b-49d5b7c30b54" (UID: "16a901b3-60ab-4fe0-ab8b-49d5b7c30b54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:27:38.390568 master-0 kubenswrapper[29612]: I0319 12:27:38.390495 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:38.390698 master-0 kubenswrapper[29612]: I0319 12:27:38.390597 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tftrv\" (UniqueName: \"kubernetes.io/projected/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-kube-api-access-tftrv\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:38.390698 master-0 kubenswrapper[29612]: I0319 12:27:38.390624 29612 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-db-sync-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:38.390698 master-0 kubenswrapper[29612]: I0319 12:27:38.390652 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:38.610841 master-0 kubenswrapper[29612]: I0319 12:27:38.610783 29612 generic.go:334] "Generic (PLEG): container finished" podID="99613eeb-1aae-43ea-a24a-6b800e4fed37" containerID="a61364302f5bf365dd1688523771a5125afdbed22e0b4b91c61ee550ab76ff4a" exitCode=0
Mar 19 12:27:38.611442 master-0 kubenswrapper[29612]: I0319 12:27:38.610880 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857c69d577-2vlg7" event={"ID":"99613eeb-1aae-43ea-a24a-6b800e4fed37","Type":"ContainerDied","Data":"a61364302f5bf365dd1688523771a5125afdbed22e0b4b91c61ee550ab76ff4a"}
Mar 19 12:27:38.611442 master-0 kubenswrapper[29612]: I0319 12:27:38.610913 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857c69d577-2vlg7" event={"ID":"99613eeb-1aae-43ea-a24a-6b800e4fed37","Type":"ContainerStarted","Data":"bf7fc6f46c6e9882d9da7977616a734c185d75128c18d082bc76178f3a93c2b1"}
Mar 19 12:27:38.622748 master-0 kubenswrapper[29612]: I0319 12:27:38.615729 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7vbjj" event={"ID":"16a901b3-60ab-4fe0-ab8b-49d5b7c30b54","Type":"ContainerDied","Data":"58733214d62e6e6a42b71277b5a7ed4a76028cc679769aed8b32530247284762"}
Mar 19 12:27:38.622748 master-0 kubenswrapper[29612]: I0319 12:27:38.615770 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58733214d62e6e6a42b71277b5a7ed4a76028cc679769aed8b32530247284762"
Mar 19 12:27:38.622748 master-0 kubenswrapper[29612]: I0319 12:27:38.615773 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7vbjj"
Mar 19 12:27:39.222577 master-0 kubenswrapper[29612]: I0319 12:27:39.222159 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-857c69d577-2vlg7"]
Mar 19 12:27:39.241855 master-0 kubenswrapper[29612]: I0319 12:27:39.241802 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-849f55d89-wzvhq"]
Mar 19 12:27:39.242488 master-0 kubenswrapper[29612]: E0319 12:27:39.242451 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a901b3-60ab-4fe0-ab8b-49d5b7c30b54" containerName="glance-db-sync"
Mar 19 12:27:39.242488 master-0 kubenswrapper[29612]: I0319 12:27:39.242482 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a901b3-60ab-4fe0-ab8b-49d5b7c30b54" containerName="glance-db-sync"
Mar 19 12:27:39.242819 master-0 kubenswrapper[29612]: I0319 12:27:39.242788 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a901b3-60ab-4fe0-ab8b-49d5b7c30b54" containerName="glance-db-sync"
Mar 19 12:27:39.245095 master-0 kubenswrapper[29612]: I0319 12:27:39.245059 29612 util.go:30] "No sandbox for
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.267449 master-0 kubenswrapper[29612]: I0319 12:27:39.267390 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849f55d89-wzvhq"] Mar 19 12:27:39.340281 master-0 kubenswrapper[29612]: I0319 12:27:39.340235 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vjfrd" Mar 19 12:27:39.370894 master-0 kubenswrapper[29612]: I0319 12:27:39.367930 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-config\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.370894 master-0 kubenswrapper[29612]: I0319 12:27:39.368008 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqcjg\" (UniqueName: \"kubernetes.io/projected/fc878c70-3d43-4704-a687-4747aeec1855-kube-api-access-dqcjg\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.370894 master-0 kubenswrapper[29612]: I0319 12:27:39.368196 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-nb\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.370894 master-0 kubenswrapper[29612]: I0319 12:27:39.368276 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-svc\") 
pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.370894 master-0 kubenswrapper[29612]: I0319 12:27:39.368665 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-sb\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.370894 master-0 kubenswrapper[29612]: I0319 12:27:39.368772 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-swift-storage-0\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.472542 master-0 kubenswrapper[29612]: I0319 12:27:39.472497 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffe770bd-975c-429f-9998-a4af416c5c1d-operator-scripts\") pod \"ffe770bd-975c-429f-9998-a4af416c5c1d\" (UID: \"ffe770bd-975c-429f-9998-a4af416c5c1d\") " Mar 19 12:27:39.473427 master-0 kubenswrapper[29612]: I0319 12:27:39.473355 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbqs9\" (UniqueName: \"kubernetes.io/projected/ffe770bd-975c-429f-9998-a4af416c5c1d-kube-api-access-gbqs9\") pod \"ffe770bd-975c-429f-9998-a4af416c5c1d\" (UID: \"ffe770bd-975c-429f-9998-a4af416c5c1d\") " Mar 19 12:27:39.477977 master-0 kubenswrapper[29612]: I0319 12:27:39.474171 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffe770bd-975c-429f-9998-a4af416c5c1d-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "ffe770bd-975c-429f-9998-a4af416c5c1d" (UID: "ffe770bd-975c-429f-9998-a4af416c5c1d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:39.479363 master-0 kubenswrapper[29612]: I0319 12:27:39.479241 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-config\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.479702 master-0 kubenswrapper[29612]: I0319 12:27:39.479684 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqcjg\" (UniqueName: \"kubernetes.io/projected/fc878c70-3d43-4704-a687-4747aeec1855-kube-api-access-dqcjg\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.480224 master-0 kubenswrapper[29612]: I0319 12:27:39.480207 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-nb\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.480338 master-0 kubenswrapper[29612]: I0319 12:27:39.480309 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-svc\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.481073 master-0 kubenswrapper[29612]: I0319 12:27:39.480593 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-sb\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.481239 master-0 kubenswrapper[29612]: I0319 12:27:39.481210 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-swift-storage-0\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.481470 master-0 kubenswrapper[29612]: I0319 12:27:39.481456 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffe770bd-975c-429f-9998-a4af416c5c1d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:39.481539 master-0 kubenswrapper[29612]: I0319 12:27:39.481478 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe770bd-975c-429f-9998-a4af416c5c1d-kube-api-access-gbqs9" (OuterVolumeSpecName: "kube-api-access-gbqs9") pod "ffe770bd-975c-429f-9998-a4af416c5c1d" (UID: "ffe770bd-975c-429f-9998-a4af416c5c1d"). InnerVolumeSpecName "kube-api-access-gbqs9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:27:39.483056 master-0 kubenswrapper[29612]: I0319 12:27:39.482388 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-nb\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.483323 master-0 kubenswrapper[29612]: I0319 12:27:39.483278 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-config\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.483601 master-0 kubenswrapper[29612]: I0319 12:27:39.483583 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-svc\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.484418 master-0 kubenswrapper[29612]: I0319 12:27:39.484392 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-swift-storage-0\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.491077 master-0 kubenswrapper[29612]: I0319 12:27:39.491040 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-sb\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " 
pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.518671 master-0 kubenswrapper[29612]: I0319 12:27:39.517625 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqcjg\" (UniqueName: \"kubernetes.io/projected/fc878c70-3d43-4704-a687-4747aeec1855-kube-api-access-dqcjg\") pod \"dnsmasq-dns-849f55d89-wzvhq\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") " pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.585819 master-0 kubenswrapper[29612]: I0319 12:27:39.585759 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbqs9\" (UniqueName: \"kubernetes.io/projected/ffe770bd-975c-429f-9998-a4af416c5c1d-kube-api-access-gbqs9\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:39.622960 master-0 kubenswrapper[29612]: I0319 12:27:39.622770 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:39.662337 master-0 kubenswrapper[29612]: I0319 12:27:39.661275 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vjfrd" event={"ID":"ffe770bd-975c-429f-9998-a4af416c5c1d","Type":"ContainerDied","Data":"51b0833c74071b3b14f4eb2658e9419eaf32fc331abc33d547741936b3d12077"} Mar 19 12:27:39.662337 master-0 kubenswrapper[29612]: I0319 12:27:39.661331 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b0833c74071b3b14f4eb2658e9419eaf32fc331abc33d547741936b3d12077" Mar 19 12:27:39.662337 master-0 kubenswrapper[29612]: I0319 12:27:39.661412 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-vjfrd" Mar 19 12:27:39.703671 master-0 kubenswrapper[29612]: I0319 12:27:39.703598 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857c69d577-2vlg7" event={"ID":"99613eeb-1aae-43ea-a24a-6b800e4fed37","Type":"ContainerStarted","Data":"5c2589414ca5d5e233ed568917a176311f9e92ab398d7c49c7c938ce1d84910f"} Mar 19 12:27:39.708947 master-0 kubenswrapper[29612]: I0319 12:27:39.706420 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-857c69d577-2vlg7" Mar 19 12:27:39.747771 master-0 kubenswrapper[29612]: I0319 12:27:39.747297 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-857c69d577-2vlg7" podStartSLOduration=3.747275907 podStartE2EDuration="3.747275907s" podCreationTimestamp="2026-03-19 12:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:27:39.733451874 +0000 UTC m=+1067.047620915" watchObservedRunningTime="2026-03-19 12:27:39.747275907 +0000 UTC m=+1067.061444928" Mar 19 12:27:39.824624 master-0 kubenswrapper[29612]: I0319 12:27:39.824586 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9127-account-create-update-b86lg" Mar 19 12:27:39.840857 master-0 kubenswrapper[29612]: I0319 12:27:39.840733 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a786-account-create-update-q6mp4" Mar 19 12:27:39.849133 master-0 kubenswrapper[29612]: I0319 12:27:39.849083 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-7sxqt" Mar 19 12:27:39.905722 master-0 kubenswrapper[29612]: I0319 12:27:39.905649 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08724d0d-69ad-4a95-98ae-0d0707bbaee8-operator-scripts\") pod \"08724d0d-69ad-4a95-98ae-0d0707bbaee8\" (UID: \"08724d0d-69ad-4a95-98ae-0d0707bbaee8\") " Mar 19 12:27:39.905940 master-0 kubenswrapper[29612]: I0319 12:27:39.905745 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c524a81-fbea-432b-856d-1688012185e2-operator-scripts\") pod \"6c524a81-fbea-432b-856d-1688012185e2\" (UID: \"6c524a81-fbea-432b-856d-1688012185e2\") " Mar 19 12:27:39.905940 master-0 kubenswrapper[29612]: I0319 12:27:39.905772 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppmfz\" (UniqueName: \"kubernetes.io/projected/08724d0d-69ad-4a95-98ae-0d0707bbaee8-kube-api-access-ppmfz\") pod \"08724d0d-69ad-4a95-98ae-0d0707bbaee8\" (UID: \"08724d0d-69ad-4a95-98ae-0d0707bbaee8\") " Mar 19 12:27:39.909713 master-0 kubenswrapper[29612]: I0319 12:27:39.906127 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08724d0d-69ad-4a95-98ae-0d0707bbaee8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08724d0d-69ad-4a95-98ae-0d0707bbaee8" (UID: "08724d0d-69ad-4a95-98ae-0d0707bbaee8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:39.909713 master-0 kubenswrapper[29612]: I0319 12:27:39.906622 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c524a81-fbea-432b-856d-1688012185e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c524a81-fbea-432b-856d-1688012185e2" (UID: "6c524a81-fbea-432b-856d-1688012185e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:39.921463 master-0 kubenswrapper[29612]: I0319 12:27:39.913974 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08724d0d-69ad-4a95-98ae-0d0707bbaee8-kube-api-access-ppmfz" (OuterVolumeSpecName: "kube-api-access-ppmfz") pod "08724d0d-69ad-4a95-98ae-0d0707bbaee8" (UID: "08724d0d-69ad-4a95-98ae-0d0707bbaee8"). InnerVolumeSpecName "kube-api-access-ppmfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:27:40.008111 master-0 kubenswrapper[29612]: I0319 12:27:40.007435 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqzdw\" (UniqueName: \"kubernetes.io/projected/0ccd0fb5-3e98-4209-ac70-c238a7610668-kube-api-access-fqzdw\") pod \"0ccd0fb5-3e98-4209-ac70-c238a7610668\" (UID: \"0ccd0fb5-3e98-4209-ac70-c238a7610668\") " Mar 19 12:27:40.008111 master-0 kubenswrapper[29612]: I0319 12:27:40.007574 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ccd0fb5-3e98-4209-ac70-c238a7610668-operator-scripts\") pod \"0ccd0fb5-3e98-4209-ac70-c238a7610668\" (UID: \"0ccd0fb5-3e98-4209-ac70-c238a7610668\") " Mar 19 12:27:40.008111 master-0 kubenswrapper[29612]: I0319 12:27:40.007651 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdm5h\" (UniqueName: 
\"kubernetes.io/projected/6c524a81-fbea-432b-856d-1688012185e2-kube-api-access-vdm5h\") pod \"6c524a81-fbea-432b-856d-1688012185e2\" (UID: \"6c524a81-fbea-432b-856d-1688012185e2\") " Mar 19 12:27:40.008111 master-0 kubenswrapper[29612]: I0319 12:27:40.008110 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08724d0d-69ad-4a95-98ae-0d0707bbaee8-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:40.008111 master-0 kubenswrapper[29612]: I0319 12:27:40.008127 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c524a81-fbea-432b-856d-1688012185e2-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:40.008523 master-0 kubenswrapper[29612]: I0319 12:27:40.008137 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppmfz\" (UniqueName: \"kubernetes.io/projected/08724d0d-69ad-4a95-98ae-0d0707bbaee8-kube-api-access-ppmfz\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:40.008835 master-0 kubenswrapper[29612]: I0319 12:27:40.008691 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ccd0fb5-3e98-4209-ac70-c238a7610668-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ccd0fb5-3e98-4209-ac70-c238a7610668" (UID: "0ccd0fb5-3e98-4209-ac70-c238a7610668"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:40.018967 master-0 kubenswrapper[29612]: I0319 12:27:40.018880 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c524a81-fbea-432b-856d-1688012185e2-kube-api-access-vdm5h" (OuterVolumeSpecName: "kube-api-access-vdm5h") pod "6c524a81-fbea-432b-856d-1688012185e2" (UID: "6c524a81-fbea-432b-856d-1688012185e2"). InnerVolumeSpecName "kube-api-access-vdm5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:27:40.019564 master-0 kubenswrapper[29612]: I0319 12:27:40.019513 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ccd0fb5-3e98-4209-ac70-c238a7610668-kube-api-access-fqzdw" (OuterVolumeSpecName: "kube-api-access-fqzdw") pod "0ccd0fb5-3e98-4209-ac70-c238a7610668" (UID: "0ccd0fb5-3e98-4209-ac70-c238a7610668"). InnerVolumeSpecName "kube-api-access-fqzdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:27:40.110081 master-0 kubenswrapper[29612]: I0319 12:27:40.110026 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqzdw\" (UniqueName: \"kubernetes.io/projected/0ccd0fb5-3e98-4209-ac70-c238a7610668-kube-api-access-fqzdw\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:40.110081 master-0 kubenswrapper[29612]: I0319 12:27:40.110072 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ccd0fb5-3e98-4209-ac70-c238a7610668-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:40.110081 master-0 kubenswrapper[29612]: I0319 12:27:40.110083 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdm5h\" (UniqueName: \"kubernetes.io/projected/6c524a81-fbea-432b-856d-1688012185e2-kube-api-access-vdm5h\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:40.343662 master-0 kubenswrapper[29612]: I0319 12:27:40.336202 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849f55d89-wzvhq"] Mar 19 12:27:40.731551 master-0 kubenswrapper[29612]: I0319 12:27:40.731489 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a786-account-create-update-q6mp4" event={"ID":"0ccd0fb5-3e98-4209-ac70-c238a7610668","Type":"ContainerDied","Data":"01dc9a4a48cd85c0a8591bccd5eb9e82d203d379e1852ab669d289afe7ba42b5"} Mar 19 12:27:40.731551 master-0 kubenswrapper[29612]: I0319 
12:27:40.731540 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01dc9a4a48cd85c0a8591bccd5eb9e82d203d379e1852ab669d289afe7ba42b5" Mar 19 12:27:40.732153 master-0 kubenswrapper[29612]: I0319 12:27:40.731558 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a786-account-create-update-q6mp4" Mar 19 12:27:40.735179 master-0 kubenswrapper[29612]: I0319 12:27:40.735123 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7sxqt" event={"ID":"08724d0d-69ad-4a95-98ae-0d0707bbaee8","Type":"ContainerDied","Data":"af7e618ac4bc1fe5ec2e4a79d56b07f4faa4e844f9e43a4633a0e423a9426367"} Mar 19 12:27:40.735285 master-0 kubenswrapper[29612]: I0319 12:27:40.735182 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af7e618ac4bc1fe5ec2e4a79d56b07f4faa4e844f9e43a4633a0e423a9426367" Mar 19 12:27:40.735285 master-0 kubenswrapper[29612]: I0319 12:27:40.735275 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7sxqt" Mar 19 12:27:40.739352 master-0 kubenswrapper[29612]: I0319 12:27:40.738798 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-857c69d577-2vlg7" podUID="99613eeb-1aae-43ea-a24a-6b800e4fed37" containerName="dnsmasq-dns" containerID="cri-o://5c2589414ca5d5e233ed568917a176311f9e92ab398d7c49c7c938ce1d84910f" gracePeriod=10 Mar 19 12:27:40.739352 master-0 kubenswrapper[29612]: I0319 12:27:40.738944 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9127-account-create-update-b86lg" Mar 19 12:27:40.741384 master-0 kubenswrapper[29612]: I0319 12:27:40.740760 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9127-account-create-update-b86lg" event={"ID":"6c524a81-fbea-432b-856d-1688012185e2","Type":"ContainerDied","Data":"1678371d521e77effa2cba41846b4492327816734f6965ec6a564cd937c6c091"} Mar 19 12:27:40.741384 master-0 kubenswrapper[29612]: I0319 12:27:40.740792 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1678371d521e77effa2cba41846b4492327816734f6965ec6a564cd937c6c091" Mar 19 12:27:44.809449 master-0 kubenswrapper[29612]: I0319 12:27:44.809289 29612 generic.go:334] "Generic (PLEG): container finished" podID="99613eeb-1aae-43ea-a24a-6b800e4fed37" containerID="5c2589414ca5d5e233ed568917a176311f9e92ab398d7c49c7c938ce1d84910f" exitCode=0 Mar 19 12:27:44.809449 master-0 kubenswrapper[29612]: I0319 12:27:44.809373 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857c69d577-2vlg7" event={"ID":"99613eeb-1aae-43ea-a24a-6b800e4fed37","Type":"ContainerDied","Data":"5c2589414ca5d5e233ed568917a176311f9e92ab398d7c49c7c938ce1d84910f"} Mar 19 12:27:44.811503 master-0 kubenswrapper[29612]: I0319 12:27:44.811324 29612 generic.go:334] "Generic (PLEG): container finished" podID="fc878c70-3d43-4704-a687-4747aeec1855" containerID="3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9" exitCode=0 Mar 19 12:27:44.811599 master-0 kubenswrapper[29612]: I0319 12:27:44.811399 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" event={"ID":"fc878c70-3d43-4704-a687-4747aeec1855","Type":"ContainerDied","Data":"3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9"} Mar 19 12:27:44.811599 master-0 kubenswrapper[29612]: I0319 12:27:44.811563 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-849f55d89-wzvhq" event={"ID":"fc878c70-3d43-4704-a687-4747aeec1855","Type":"ContainerStarted","Data":"5433ac51b27cc59bba9d505c52df311dd7f455ab7af1ebfcc4b18f2df7597846"} Mar 19 12:27:44.974040 master-0 kubenswrapper[29612]: I0319 12:27:44.974002 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-857c69d577-2vlg7" Mar 19 12:27:45.158335 master-0 kubenswrapper[29612]: I0319 12:27:45.156492 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-nb\") pod \"99613eeb-1aae-43ea-a24a-6b800e4fed37\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " Mar 19 12:27:45.158335 master-0 kubenswrapper[29612]: I0319 12:27:45.156567 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-svc\") pod \"99613eeb-1aae-43ea-a24a-6b800e4fed37\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " Mar 19 12:27:45.158335 master-0 kubenswrapper[29612]: I0319 12:27:45.156604 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjkl4\" (UniqueName: \"kubernetes.io/projected/99613eeb-1aae-43ea-a24a-6b800e4fed37-kube-api-access-kjkl4\") pod \"99613eeb-1aae-43ea-a24a-6b800e4fed37\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " Mar 19 12:27:45.158335 master-0 kubenswrapper[29612]: I0319 12:27:45.156816 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-config\") pod \"99613eeb-1aae-43ea-a24a-6b800e4fed37\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " Mar 19 12:27:45.158335 master-0 kubenswrapper[29612]: I0319 12:27:45.156848 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-swift-storage-0\") pod \"99613eeb-1aae-43ea-a24a-6b800e4fed37\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " Mar 19 12:27:45.158335 master-0 kubenswrapper[29612]: I0319 12:27:45.156905 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-sb\") pod \"99613eeb-1aae-43ea-a24a-6b800e4fed37\" (UID: \"99613eeb-1aae-43ea-a24a-6b800e4fed37\") " Mar 19 12:27:45.167488 master-0 kubenswrapper[29612]: I0319 12:27:45.167332 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99613eeb-1aae-43ea-a24a-6b800e4fed37-kube-api-access-kjkl4" (OuterVolumeSpecName: "kube-api-access-kjkl4") pod "99613eeb-1aae-43ea-a24a-6b800e4fed37" (UID: "99613eeb-1aae-43ea-a24a-6b800e4fed37"). InnerVolumeSpecName "kube-api-access-kjkl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:27:45.232201 master-0 kubenswrapper[29612]: I0319 12:27:45.230138 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99613eeb-1aae-43ea-a24a-6b800e4fed37" (UID: "99613eeb-1aae-43ea-a24a-6b800e4fed37"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:45.241729 master-0 kubenswrapper[29612]: I0319 12:27:45.237256 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-config" (OuterVolumeSpecName: "config") pod "99613eeb-1aae-43ea-a24a-6b800e4fed37" (UID: "99613eeb-1aae-43ea-a24a-6b800e4fed37"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:45.243382 master-0 kubenswrapper[29612]: I0319 12:27:45.243317 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99613eeb-1aae-43ea-a24a-6b800e4fed37" (UID: "99613eeb-1aae-43ea-a24a-6b800e4fed37"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:45.245198 master-0 kubenswrapper[29612]: I0319 12:27:45.245146 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "99613eeb-1aae-43ea-a24a-6b800e4fed37" (UID: "99613eeb-1aae-43ea-a24a-6b800e4fed37"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:45.249229 master-0 kubenswrapper[29612]: I0319 12:27:45.249194 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99613eeb-1aae-43ea-a24a-6b800e4fed37" (UID: "99613eeb-1aae-43ea-a24a-6b800e4fed37"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:45.259556 master-0 kubenswrapper[29612]: I0319 12:27:45.259504 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:45.259556 master-0 kubenswrapper[29612]: I0319 12:27:45.259545 29612 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:45.259723 master-0 kubenswrapper[29612]: I0319 12:27:45.259560 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:45.259723 master-0 kubenswrapper[29612]: I0319 12:27:45.259571 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:45.259723 master-0 kubenswrapper[29612]: I0319 12:27:45.259581 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99613eeb-1aae-43ea-a24a-6b800e4fed37-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:45.259723 master-0 kubenswrapper[29612]: I0319 12:27:45.259590 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjkl4\" (UniqueName: \"kubernetes.io/projected/99613eeb-1aae-43ea-a24a-6b800e4fed37-kube-api-access-kjkl4\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:45.822887 master-0 kubenswrapper[29612]: I0319 12:27:45.822817 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" 
event={"ID":"fc878c70-3d43-4704-a687-4747aeec1855","Type":"ContainerStarted","Data":"27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069"} Mar 19 12:27:45.823554 master-0 kubenswrapper[29612]: I0319 12:27:45.823311 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:45.824391 master-0 kubenswrapper[29612]: I0319 12:27:45.824362 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-26vqt" event={"ID":"2ef05b5e-8998-4c2c-a732-84cc8345df7c","Type":"ContainerStarted","Data":"1b5dad74840942e4694828977c1332ffaebc778ea206c3f987a9d1131f795c5f"} Mar 19 12:27:45.826057 master-0 kubenswrapper[29612]: I0319 12:27:45.826027 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-857c69d577-2vlg7" event={"ID":"99613eeb-1aae-43ea-a24a-6b800e4fed37","Type":"ContainerDied","Data":"bf7fc6f46c6e9882d9da7977616a734c185d75128c18d082bc76178f3a93c2b1"} Mar 19 12:27:45.826184 master-0 kubenswrapper[29612]: I0319 12:27:45.826062 29612 scope.go:117] "RemoveContainer" containerID="5c2589414ca5d5e233ed568917a176311f9e92ab398d7c49c7c938ce1d84910f" Mar 19 12:27:45.826184 master-0 kubenswrapper[29612]: I0319 12:27:45.826070 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-857c69d577-2vlg7" Mar 19 12:27:45.856557 master-0 kubenswrapper[29612]: I0319 12:27:45.856496 29612 scope.go:117] "RemoveContainer" containerID="a61364302f5bf365dd1688523771a5125afdbed22e0b4b91c61ee550ab76ff4a" Mar 19 12:27:45.867041 master-0 kubenswrapper[29612]: I0319 12:27:45.866295 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" podStartSLOduration=6.866275357 podStartE2EDuration="6.866275357s" podCreationTimestamp="2026-03-19 12:27:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:27:45.851273941 +0000 UTC m=+1073.165442972" watchObservedRunningTime="2026-03-19 12:27:45.866275357 +0000 UTC m=+1073.180444388" Mar 19 12:27:45.887912 master-0 kubenswrapper[29612]: I0319 12:27:45.886228 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-857c69d577-2vlg7"] Mar 19 12:27:45.896185 master-0 kubenswrapper[29612]: I0319 12:27:45.896118 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-857c69d577-2vlg7"] Mar 19 12:27:45.907367 master-0 kubenswrapper[29612]: I0319 12:27:45.907160 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-26vqt" podStartSLOduration=2.772247312 podStartE2EDuration="10.907138478s" podCreationTimestamp="2026-03-19 12:27:35 +0000 UTC" firstStartedPulling="2026-03-19 12:27:36.837365237 +0000 UTC m=+1064.151534258" lastFinishedPulling="2026-03-19 12:27:44.972256403 +0000 UTC m=+1072.286425424" observedRunningTime="2026-03-19 12:27:45.90015004 +0000 UTC m=+1073.214319061" watchObservedRunningTime="2026-03-19 12:27:45.907138478 +0000 UTC m=+1073.221307499" Mar 19 12:27:46.923791 master-0 kubenswrapper[29612]: I0319 12:27:46.923138 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="99613eeb-1aae-43ea-a24a-6b800e4fed37" path="/var/lib/kubelet/pods/99613eeb-1aae-43ea-a24a-6b800e4fed37/volumes" Mar 19 12:27:49.886949 master-0 kubenswrapper[29612]: I0319 12:27:49.886865 29612 generic.go:334] "Generic (PLEG): container finished" podID="2ef05b5e-8998-4c2c-a732-84cc8345df7c" containerID="1b5dad74840942e4694828977c1332ffaebc778ea206c3f987a9d1131f795c5f" exitCode=0 Mar 19 12:27:49.886949 master-0 kubenswrapper[29612]: I0319 12:27:49.886945 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-26vqt" event={"ID":"2ef05b5e-8998-4c2c-a732-84cc8345df7c","Type":"ContainerDied","Data":"1b5dad74840942e4694828977c1332ffaebc778ea206c3f987a9d1131f795c5f"} Mar 19 12:27:51.385798 master-0 kubenswrapper[29612]: I0319 12:27:51.385732 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-26vqt" Mar 19 12:27:51.418093 master-0 kubenswrapper[29612]: I0319 12:27:51.418037 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-config-data\") pod \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " Mar 19 12:27:51.418270 master-0 kubenswrapper[29612]: I0319 12:27:51.418213 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt89q\" (UniqueName: \"kubernetes.io/projected/2ef05b5e-8998-4c2c-a732-84cc8345df7c-kube-api-access-xt89q\") pod \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\" (UID: \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " Mar 19 12:27:51.418352 master-0 kubenswrapper[29612]: I0319 12:27:51.418292 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-combined-ca-bundle\") pod \"2ef05b5e-8998-4c2c-a732-84cc8345df7c\" (UID: 
\"2ef05b5e-8998-4c2c-a732-84cc8345df7c\") " Mar 19 12:27:51.421925 master-0 kubenswrapper[29612]: I0319 12:27:51.421885 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef05b5e-8998-4c2c-a732-84cc8345df7c-kube-api-access-xt89q" (OuterVolumeSpecName: "kube-api-access-xt89q") pod "2ef05b5e-8998-4c2c-a732-84cc8345df7c" (UID: "2ef05b5e-8998-4c2c-a732-84cc8345df7c"). InnerVolumeSpecName "kube-api-access-xt89q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:27:51.452098 master-0 kubenswrapper[29612]: I0319 12:27:51.452041 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ef05b5e-8998-4c2c-a732-84cc8345df7c" (UID: "2ef05b5e-8998-4c2c-a732-84cc8345df7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:27:51.472774 master-0 kubenswrapper[29612]: I0319 12:27:51.472550 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-config-data" (OuterVolumeSpecName: "config-data") pod "2ef05b5e-8998-4c2c-a732-84cc8345df7c" (UID: "2ef05b5e-8998-4c2c-a732-84cc8345df7c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:27:51.521551 master-0 kubenswrapper[29612]: I0319 12:27:51.521487 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:51.521551 master-0 kubenswrapper[29612]: I0319 12:27:51.521537 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt89q\" (UniqueName: \"kubernetes.io/projected/2ef05b5e-8998-4c2c-a732-84cc8345df7c-kube-api-access-xt89q\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:51.521551 master-0 kubenswrapper[29612]: I0319 12:27:51.521550 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ef05b5e-8998-4c2c-a732-84cc8345df7c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:51.912282 master-0 kubenswrapper[29612]: I0319 12:27:51.912218 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-26vqt" event={"ID":"2ef05b5e-8998-4c2c-a732-84cc8345df7c","Type":"ContainerDied","Data":"5bbeb91bb5251a83cf2a08a1c720b8fad7de4d1b7ee7e949c3f0fe94fd1f3755"} Mar 19 12:27:51.912282 master-0 kubenswrapper[29612]: I0319 12:27:51.912272 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bbeb91bb5251a83cf2a08a1c720b8fad7de4d1b7ee7e949c3f0fe94fd1f3755" Mar 19 12:27:51.912574 master-0 kubenswrapper[29612]: I0319 12:27:51.912336 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-26vqt" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.786687 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mh8mj"] Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: E0319 12:27:53.787320 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ccd0fb5-3e98-4209-ac70-c238a7610668" containerName="mariadb-account-create-update" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787336 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ccd0fb5-3e98-4209-ac70-c238a7610668" containerName="mariadb-account-create-update" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: E0319 12:27:53.787373 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99613eeb-1aae-43ea-a24a-6b800e4fed37" containerName="dnsmasq-dns" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787381 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="99613eeb-1aae-43ea-a24a-6b800e4fed37" containerName="dnsmasq-dns" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: E0319 12:27:53.787405 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c524a81-fbea-432b-856d-1688012185e2" containerName="mariadb-account-create-update" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787415 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c524a81-fbea-432b-856d-1688012185e2" containerName="mariadb-account-create-update" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: E0319 12:27:53.787463 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99613eeb-1aae-43ea-a24a-6b800e4fed37" containerName="init" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787470 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="99613eeb-1aae-43ea-a24a-6b800e4fed37" containerName="init" Mar 19 12:27:53.789266 master-0 
kubenswrapper[29612]: E0319 12:27:53.787481 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef05b5e-8998-4c2c-a732-84cc8345df7c" containerName="keystone-db-sync" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787489 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef05b5e-8998-4c2c-a732-84cc8345df7c" containerName="keystone-db-sync" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: E0319 12:27:53.787505 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08724d0d-69ad-4a95-98ae-0d0707bbaee8" containerName="mariadb-database-create" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787512 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="08724d0d-69ad-4a95-98ae-0d0707bbaee8" containerName="mariadb-database-create" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: E0319 12:27:53.787539 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe770bd-975c-429f-9998-a4af416c5c1d" containerName="mariadb-database-create" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787547 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe770bd-975c-429f-9998-a4af416c5c1d" containerName="mariadb-database-create" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787787 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ccd0fb5-3e98-4209-ac70-c238a7610668" containerName="mariadb-account-create-update" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787829 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c524a81-fbea-432b-856d-1688012185e2" containerName="mariadb-account-create-update" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787841 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe770bd-975c-429f-9998-a4af416c5c1d" containerName="mariadb-database-create" Mar 19 12:27:53.789266 master-0 
kubenswrapper[29612]: I0319 12:27:53.787853 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="99613eeb-1aae-43ea-a24a-6b800e4fed37" containerName="dnsmasq-dns" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787877 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="08724d0d-69ad-4a95-98ae-0d0707bbaee8" containerName="mariadb-database-create" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.787895 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef05b5e-8998-4c2c-a732-84cc8345df7c" containerName="keystone-db-sync" Mar 19 12:27:53.789266 master-0 kubenswrapper[29612]: I0319 12:27:53.788734 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:53.797670 master-0 kubenswrapper[29612]: I0319 12:27:53.795108 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 12:27:53.822657 master-0 kubenswrapper[29612]: I0319 12:27:53.818722 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 12:27:53.822657 master-0 kubenswrapper[29612]: I0319 12:27:53.818912 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 12:27:53.822657 master-0 kubenswrapper[29612]: I0319 12:27:53.819155 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 12:27:53.855198 master-0 kubenswrapper[29612]: I0319 12:27:53.852356 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849f55d89-wzvhq"] Mar 19 12:27:53.855198 master-0 kubenswrapper[29612]: I0319 12:27:53.852768 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" podUID="fc878c70-3d43-4704-a687-4747aeec1855" containerName="dnsmasq-dns" 
containerID="cri-o://27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069" gracePeriod=10 Mar 19 12:27:53.855198 master-0 kubenswrapper[29612]: I0319 12:27:53.855043 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" Mar 19 12:27:53.905870 master-0 kubenswrapper[29612]: I0319 12:27:53.897436 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmzh8\" (UniqueName: \"kubernetes.io/projected/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-kube-api-access-fmzh8\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:53.905870 master-0 kubenswrapper[29612]: I0319 12:27:53.897700 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-config-data\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:53.905870 master-0 kubenswrapper[29612]: I0319 12:27:53.897771 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-credential-keys\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:53.905870 master-0 kubenswrapper[29612]: I0319 12:27:53.897888 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-fernet-keys\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:53.905870 master-0 
kubenswrapper[29612]: I0319 12:27:53.897912 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-scripts\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:53.905870 master-0 kubenswrapper[29612]: I0319 12:27:53.897931 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-combined-ca-bundle\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:53.988120 master-0 kubenswrapper[29612]: I0319 12:27:53.987104 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mh8mj"] Mar 19 12:27:54.045330 master-0 kubenswrapper[29612]: I0319 12:27:54.039527 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmzh8\" (UniqueName: \"kubernetes.io/projected/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-kube-api-access-fmzh8\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.045330 master-0 kubenswrapper[29612]: I0319 12:27:54.041050 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-config-data\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.045330 master-0 kubenswrapper[29612]: I0319 12:27:54.041145 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-credential-keys\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.045330 master-0 kubenswrapper[29612]: I0319 12:27:54.041692 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-fernet-keys\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.045330 master-0 kubenswrapper[29612]: I0319 12:27:54.041724 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-scripts\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.045330 master-0 kubenswrapper[29612]: I0319 12:27:54.041759 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-combined-ca-bundle\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.065002 master-0 kubenswrapper[29612]: I0319 12:27:54.049788 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-combined-ca-bundle\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.065002 master-0 kubenswrapper[29612]: I0319 12:27:54.049881 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59fcdff467-6j2bl"] Mar 19 12:27:54.088663 master-0 kubenswrapper[29612]: I0319 
12:27:54.080422 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59fcdff467-6j2bl"] Mar 19 12:27:54.088663 master-0 kubenswrapper[29612]: I0319 12:27:54.080555 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" Mar 19 12:27:54.105659 master-0 kubenswrapper[29612]: I0319 12:27:54.104832 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-scripts\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.105659 master-0 kubenswrapper[29612]: I0319 12:27:54.105131 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-fernet-keys\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.105659 master-0 kubenswrapper[29612]: I0319 12:27:54.105350 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-config-data\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.105659 master-0 kubenswrapper[29612]: I0319 12:27:54.105617 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-credential-keys\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.126374 master-0 kubenswrapper[29612]: I0319 12:27:54.125974 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fmzh8\" (UniqueName: \"kubernetes.io/projected/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-kube-api-access-fmzh8\") pod \"keystone-bootstrap-mh8mj\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") " pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.154694 master-0 kubenswrapper[29612]: I0319 12:27:54.153737 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-6mnv4"] Mar 19 12:27:54.193018 master-0 kubenswrapper[29612]: I0319 12:27:54.192949 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-mh455"] Mar 19 12:27:54.194767 master-0 kubenswrapper[29612]: I0319 12:27:54.194582 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-562cb-db-sync-dx9n4"] Mar 19 12:27:54.196494 master-0 kubenswrapper[29612]: I0319 12:27:54.195948 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-6mnv4" Mar 19 12:27:54.198915 master-0 kubenswrapper[29612]: I0319 12:27:54.198885 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mh455" Mar 19 12:27:54.200856 master-0 kubenswrapper[29612]: I0319 12:27:54.200832 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 12:27:54.201848 master-0 kubenswrapper[29612]: I0319 12:27:54.200995 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 12:27:54.204619 master-0 kubenswrapper[29612]: I0319 12:27:54.204442 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:27:54.227001 master-0 kubenswrapper[29612]: I0319 12:27:54.226943 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-6mnv4"] Mar 19 12:27:54.227001 master-0 kubenswrapper[29612]: I0319 12:27:54.226998 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mh455"] Mar 19 12:27:54.227001 master-0 kubenswrapper[29612]: I0319 12:27:54.227012 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-db-sync-dx9n4"] Mar 19 12:27:54.227268 master-0 kubenswrapper[29612]: I0319 12:27:54.227101 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-db-sync-dx9n4" Mar 19 12:27:54.245177 master-0 kubenswrapper[29612]: I0319 12:27:54.245124 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-config-data" Mar 19 12:27:54.245508 master-0 kubenswrapper[29612]: I0319 12:27:54.245178 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-8ba2-account-create-update-24xjd"] Mar 19 12:27:54.245818 master-0 kubenswrapper[29612]: I0319 12:27:54.245343 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-scripts" Mar 19 12:27:54.247104 master-0 kubenswrapper[29612]: I0319 12:27:54.247060 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-8ba2-account-create-update-24xjd" Mar 19 12:27:54.250133 master-0 kubenswrapper[29612]: I0319 12:27:54.250076 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-nb\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" Mar 19 12:27:54.250133 master-0 kubenswrapper[29612]: I0319 12:27:54.250132 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-sb\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" Mar 19 12:27:54.250322 master-0 kubenswrapper[29612]: I0319 12:27:54.250293 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-config\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" Mar 19 12:27:54.251293 master-0 kubenswrapper[29612]: I0319 12:27:54.251086 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Mar 19 12:27:54.255880 master-0 kubenswrapper[29612]: I0319 12:27:54.254794 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2p2r\" (UniqueName: \"kubernetes.io/projected/3e6837c4-0119-40f6-83e9-f0ea203b3781-kube-api-access-l2p2r\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" Mar 19 12:27:54.255880 master-0 kubenswrapper[29612]: 
I0319 12:27:54.255036 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-svc\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" Mar 19 12:27:54.255880 master-0 kubenswrapper[29612]: I0319 12:27:54.255126 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-swift-storage-0\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" Mar 19 12:27:54.340655 master-0 kubenswrapper[29612]: I0319 12:27:54.324557 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-8ba2-account-create-update-24xjd"] Mar 19 12:27:54.358159 master-0 kubenswrapper[29612]: I0319 12:27:54.358076 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkzgh\" (UniqueName: \"kubernetes.io/projected/9f8cc63c-1080-41d8-b789-bce87f57e603-kube-api-access-mkzgh\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4" Mar 19 12:27:54.358159 master-0 kubenswrapper[29612]: I0319 12:27:54.358159 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-config-data\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4" Mar 19 12:27:54.358389 master-0 kubenswrapper[29612]: I0319 12:27:54.358207 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-phlkz\" (UniqueName: \"kubernetes.io/projected/1d744992-9162-4633-9614-ed0c4ac81988-kube-api-access-phlkz\") pod \"neutron-db-sync-mh455\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") " pod="openstack/neutron-db-sync-mh455"
Mar 19 12:27:54.358389 master-0 kubenswrapper[29612]: I0319 12:27:54.358257 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27627ab5-718d-4b5c-ba01-9e7c7da19223-operator-scripts\") pod \"ironic-8ba2-account-create-update-24xjd\" (UID: \"27627ab5-718d-4b5c-ba01-9e7c7da19223\") " pod="openstack/ironic-8ba2-account-create-update-24xjd"
Mar 19 12:27:54.358389 master-0 kubenswrapper[29612]: I0319 12:27:54.358290 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-scripts\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.358389 master-0 kubenswrapper[29612]: I0319 12:27:54.358326 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f8cc63c-1080-41d8-b789-bce87f57e603-etc-machine-id\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.358389 master-0 kubenswrapper[29612]: I0319 12:27:54.358365 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-config\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.358548 master-0 kubenswrapper[29612]: I0319 12:27:54.358408 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2p2r\" (UniqueName: \"kubernetes.io/projected/3e6837c4-0119-40f6-83e9-f0ea203b3781-kube-api-access-l2p2r\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.358548 master-0 kubenswrapper[29612]: I0319 12:27:54.358436 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkxb8\" (UniqueName: \"kubernetes.io/projected/27627ab5-718d-4b5c-ba01-9e7c7da19223-kube-api-access-gkxb8\") pod \"ironic-8ba2-account-create-update-24xjd\" (UID: \"27627ab5-718d-4b5c-ba01-9e7c7da19223\") " pod="openstack/ironic-8ba2-account-create-update-24xjd"
Mar 19 12:27:54.358548 master-0 kubenswrapper[29612]: I0319 12:27:54.358504 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e893922b-55bb-4c15-84ed-4a005cffe0d2-operator-scripts\") pod \"ironic-db-create-6mnv4\" (UID: \"e893922b-55bb-4c15-84ed-4a005cffe0d2\") " pod="openstack/ironic-db-create-6mnv4"
Mar 19 12:27:54.358548 master-0 kubenswrapper[29612]: I0319 12:27:54.358544 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-svc\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.358697 master-0 kubenswrapper[29612]: I0319 12:27:54.358585 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-combined-ca-bundle\") pod \"neutron-db-sync-mh455\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") " pod="openstack/neutron-db-sync-mh455"
Mar 19 12:27:54.358697 master-0 kubenswrapper[29612]: I0319 12:27:54.358610 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-swift-storage-0\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.358758 master-0 kubenswrapper[29612]: I0319 12:27:54.358713 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-config\") pod \"neutron-db-sync-mh455\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") " pod="openstack/neutron-db-sync-mh455"
Mar 19 12:27:54.358758 master-0 kubenswrapper[29612]: I0319 12:27:54.358747 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-nb\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.358823 master-0 kubenswrapper[29612]: I0319 12:27:54.358780 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdql5\" (UniqueName: \"kubernetes.io/projected/e893922b-55bb-4c15-84ed-4a005cffe0d2-kube-api-access-mdql5\") pod \"ironic-db-create-6mnv4\" (UID: \"e893922b-55bb-4c15-84ed-4a005cffe0d2\") " pod="openstack/ironic-db-create-6mnv4"
Mar 19 12:27:54.358823 master-0 kubenswrapper[29612]: I0319 12:27:54.358807 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-sb\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.358882 master-0 kubenswrapper[29612]: I0319 12:27:54.358844 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-combined-ca-bundle\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.358882 master-0 kubenswrapper[29612]: I0319 12:27:54.358874 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-db-sync-config-data\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.361783 master-0 kubenswrapper[29612]: I0319 12:27:54.360495 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-config\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.361783 master-0 kubenswrapper[29612]: I0319 12:27:54.361682 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-svc\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.363842 master-0 kubenswrapper[29612]: I0319 12:27:54.362392 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-swift-storage-0\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.363842 master-0 kubenswrapper[29612]: I0319 12:27:54.363218 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-nb\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.364040 master-0 kubenswrapper[29612]: I0319 12:27:54.364009 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-sb\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.460758 master-0 kubenswrapper[29612]: I0319 12:27:54.460417 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phlkz\" (UniqueName: \"kubernetes.io/projected/1d744992-9162-4633-9614-ed0c4ac81988-kube-api-access-phlkz\") pod \"neutron-db-sync-mh455\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") " pod="openstack/neutron-db-sync-mh455"
Mar 19 12:27:54.460758 master-0 kubenswrapper[29612]: I0319 12:27:54.460492 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27627ab5-718d-4b5c-ba01-9e7c7da19223-operator-scripts\") pod \"ironic-8ba2-account-create-update-24xjd\" (UID: \"27627ab5-718d-4b5c-ba01-9e7c7da19223\") " pod="openstack/ironic-8ba2-account-create-update-24xjd"
Mar 19 12:27:54.460758 master-0 kubenswrapper[29612]: I0319 12:27:54.460528 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-scripts\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.460758 master-0 kubenswrapper[29612]: I0319 12:27:54.460564 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f8cc63c-1080-41d8-b789-bce87f57e603-etc-machine-id\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.460758 master-0 kubenswrapper[29612]: I0319 12:27:54.460614 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkxb8\" (UniqueName: \"kubernetes.io/projected/27627ab5-718d-4b5c-ba01-9e7c7da19223-kube-api-access-gkxb8\") pod \"ironic-8ba2-account-create-update-24xjd\" (UID: \"27627ab5-718d-4b5c-ba01-9e7c7da19223\") " pod="openstack/ironic-8ba2-account-create-update-24xjd"
Mar 19 12:27:54.460758 master-0 kubenswrapper[29612]: I0319 12:27:54.460704 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e893922b-55bb-4c15-84ed-4a005cffe0d2-operator-scripts\") pod \"ironic-db-create-6mnv4\" (UID: \"e893922b-55bb-4c15-84ed-4a005cffe0d2\") " pod="openstack/ironic-db-create-6mnv4"
Mar 19 12:27:54.461991 master-0 kubenswrapper[29612]: I0319 12:27:54.461274 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-combined-ca-bundle\") pod \"neutron-db-sync-mh455\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") " pod="openstack/neutron-db-sync-mh455"
Mar 19 12:27:54.461991 master-0 kubenswrapper[29612]: I0319 12:27:54.461309 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-config\") pod \"neutron-db-sync-mh455\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") " pod="openstack/neutron-db-sync-mh455"
Mar 19 12:27:54.461991 master-0 kubenswrapper[29612]: I0319 12:27:54.461345 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdql5\" (UniqueName: \"kubernetes.io/projected/e893922b-55bb-4c15-84ed-4a005cffe0d2-kube-api-access-mdql5\") pod \"ironic-db-create-6mnv4\" (UID: \"e893922b-55bb-4c15-84ed-4a005cffe0d2\") " pod="openstack/ironic-db-create-6mnv4"
Mar 19 12:27:54.461991 master-0 kubenswrapper[29612]: I0319 12:27:54.461384 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-combined-ca-bundle\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.461991 master-0 kubenswrapper[29612]: I0319 12:27:54.461410 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-db-sync-config-data\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.461991 master-0 kubenswrapper[29612]: I0319 12:27:54.461453 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkzgh\" (UniqueName: \"kubernetes.io/projected/9f8cc63c-1080-41d8-b789-bce87f57e603-kube-api-access-mkzgh\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.461991 master-0 kubenswrapper[29612]: I0319 12:27:54.461482 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-config-data\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.462601 master-0 kubenswrapper[29612]: I0319 12:27:54.462542 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e893922b-55bb-4c15-84ed-4a005cffe0d2-operator-scripts\") pod \"ironic-db-create-6mnv4\" (UID: \"e893922b-55bb-4c15-84ed-4a005cffe0d2\") " pod="openstack/ironic-db-create-6mnv4"
Mar 19 12:27:54.467199 master-0 kubenswrapper[29612]: I0319 12:27:54.466784 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27627ab5-718d-4b5c-ba01-9e7c7da19223-operator-scripts\") pod \"ironic-8ba2-account-create-update-24xjd\" (UID: \"27627ab5-718d-4b5c-ba01-9e7c7da19223\") " pod="openstack/ironic-8ba2-account-create-update-24xjd"
Mar 19 12:27:54.467861 master-0 kubenswrapper[29612]: I0319 12:27:54.467697 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f8cc63c-1080-41d8-b789-bce87f57e603-etc-machine-id\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.468739 master-0 kubenswrapper[29612]: I0319 12:27:54.468717 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-config\") pod \"neutron-db-sync-mh455\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") " pod="openstack/neutron-db-sync-mh455"
Mar 19 12:27:54.471416 master-0 kubenswrapper[29612]: I0319 12:27:54.471367 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-db-sync-config-data\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.471630 master-0 kubenswrapper[29612]: I0319 12:27:54.471582 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-combined-ca-bundle\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.473263 master-0 kubenswrapper[29612]: I0319 12:27:54.473223 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-config-data\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.499584 master-0 kubenswrapper[29612]: I0319 12:27:54.499508 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-scripts\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.500226 master-0 kubenswrapper[29612]: I0319 12:27:54.500030 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-combined-ca-bundle\") pod \"neutron-db-sync-mh455\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") " pod="openstack/neutron-db-sync-mh455"
Mar 19 12:27:54.562255 master-0 kubenswrapper[29612]: I0319 12:27:54.560529 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2p2r\" (UniqueName: \"kubernetes.io/projected/3e6837c4-0119-40f6-83e9-f0ea203b3781-kube-api-access-l2p2r\") pod \"dnsmasq-dns-59fcdff467-6j2bl\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") " pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.567908 master-0 kubenswrapper[29612]: I0319 12:27:54.567292 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phlkz\" (UniqueName: \"kubernetes.io/projected/1d744992-9162-4633-9614-ed0c4ac81988-kube-api-access-phlkz\") pod \"neutron-db-sync-mh455\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") " pod="openstack/neutron-db-sync-mh455"
Mar 19 12:27:54.568112 master-0 kubenswrapper[29612]: I0319 12:27:54.567947 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdql5\" (UniqueName: \"kubernetes.io/projected/e893922b-55bb-4c15-84ed-4a005cffe0d2-kube-api-access-mdql5\") pod \"ironic-db-create-6mnv4\" (UID: \"e893922b-55bb-4c15-84ed-4a005cffe0d2\") " pod="openstack/ironic-db-create-6mnv4"
Mar 19 12:27:54.568951 master-0 kubenswrapper[29612]: I0319 12:27:54.568864 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkzgh\" (UniqueName: \"kubernetes.io/projected/9f8cc63c-1080-41d8-b789-bce87f57e603-kube-api-access-mkzgh\") pod \"cinder-562cb-db-sync-dx9n4\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") " pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.584216 master-0 kubenswrapper[29612]: I0319 12:27:54.573532 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkxb8\" (UniqueName: \"kubernetes.io/projected/27627ab5-718d-4b5c-ba01-9e7c7da19223-kube-api-access-gkxb8\") pod \"ironic-8ba2-account-create-update-24xjd\" (UID: \"27627ab5-718d-4b5c-ba01-9e7c7da19223\") " pod="openstack/ironic-8ba2-account-create-update-24xjd"
Mar 19 12:27:54.680746 master-0 kubenswrapper[29612]: I0319 12:27:54.680658 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-h9cqc"]
Mar 19 12:27:54.683739 master-0 kubenswrapper[29612]: I0319 12:27:54.682565 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.713498 master-0 kubenswrapper[29612]: I0319 12:27:54.712925 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:27:54.725274 master-0 kubenswrapper[29612]: I0319 12:27:54.724489 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 19 12:27:54.725274 master-0 kubenswrapper[29612]: I0319 12:27:54.724689 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 19 12:27:54.725274 master-0 kubenswrapper[29612]: I0319 12:27:54.724874 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-6mnv4"
Mar 19 12:27:54.733852 master-0 kubenswrapper[29612]: I0319 12:27:54.733529 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mh455"
Mar 19 12:27:54.749484 master-0 kubenswrapper[29612]: I0319 12:27:54.748384 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:27:54.757447 master-0 kubenswrapper[29612]: I0319 12:27:54.757310 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-8ba2-account-create-update-24xjd"
Mar 19 12:27:54.780777 master-0 kubenswrapper[29612]: I0319 12:27:54.780725 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-config-data\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.780991 master-0 kubenswrapper[29612]: I0319 12:27:54.780963 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a976515-4dc5-45d4-b047-92e92de29df6-logs\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.781043 master-0 kubenswrapper[29612]: I0319 12:27:54.780997 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p5c4\" (UniqueName: \"kubernetes.io/projected/3a976515-4dc5-45d4-b047-92e92de29df6-kube-api-access-5p5c4\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.781102 master-0 kubenswrapper[29612]: I0319 12:27:54.781084 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-scripts\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.781162 master-0 kubenswrapper[29612]: I0319 12:27:54.781143 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-combined-ca-bundle\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.806390 master-0 kubenswrapper[29612]: I0319 12:27:54.805743 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59fcdff467-6j2bl"]
Mar 19 12:27:54.835248 master-0 kubenswrapper[29612]: I0319 12:27:54.834918 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849f55d89-wzvhq"
Mar 19 12:27:54.883482 master-0 kubenswrapper[29612]: I0319 12:27:54.883279 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h9cqc"]
Mar 19 12:27:54.886860 master-0 kubenswrapper[29612]: I0319 12:27:54.886793 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-config-data\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.887084 master-0 kubenswrapper[29612]: I0319 12:27:54.887007 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a976515-4dc5-45d4-b047-92e92de29df6-logs\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.887084 master-0 kubenswrapper[29612]: I0319 12:27:54.887038 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p5c4\" (UniqueName: \"kubernetes.io/projected/3a976515-4dc5-45d4-b047-92e92de29df6-kube-api-access-5p5c4\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.887769 master-0 kubenswrapper[29612]: I0319 12:27:54.887266 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-scripts\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.887769 master-0 kubenswrapper[29612]: I0319 12:27:54.887341 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-combined-ca-bundle\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.887769 master-0 kubenswrapper[29612]: I0319 12:27:54.887577 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a976515-4dc5-45d4-b047-92e92de29df6-logs\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.900407 master-0 kubenswrapper[29612]: I0319 12:27:54.900344 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-648f9ff569-fh2n6"]
Mar 19 12:27:54.900981 master-0 kubenswrapper[29612]: E0319 12:27:54.900879 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc878c70-3d43-4704-a687-4747aeec1855" containerName="init"
Mar 19 12:27:54.900981 master-0 kubenswrapper[29612]: I0319 12:27:54.900900 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc878c70-3d43-4704-a687-4747aeec1855" containerName="init"
Mar 19 12:27:54.900981 master-0 kubenswrapper[29612]: E0319 12:27:54.900949 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc878c70-3d43-4704-a687-4747aeec1855" containerName="dnsmasq-dns"
Mar 19 12:27:54.900981 master-0 kubenswrapper[29612]: I0319 12:27:54.900956 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc878c70-3d43-4704-a687-4747aeec1855" containerName="dnsmasq-dns"
Mar 19 12:27:54.901219 master-0 kubenswrapper[29612]: I0319 12:27:54.901129 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc878c70-3d43-4704-a687-4747aeec1855" containerName="dnsmasq-dns"
Mar 19 12:27:54.904449 master-0 kubenswrapper[29612]: I0319 12:27:54.902092 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:54.921793 master-0 kubenswrapper[29612]: I0319 12:27:54.915196 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-config-data\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.925483 master-0 kubenswrapper[29612]: I0319 12:27:54.924934 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-combined-ca-bundle\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.947820 master-0 kubenswrapper[29612]: I0319 12:27:54.939073 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p5c4\" (UniqueName: \"kubernetes.io/projected/3a976515-4dc5-45d4-b047-92e92de29df6-kube-api-access-5p5c4\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.958373 master-0 kubenswrapper[29612]: I0319 12:27:54.958211 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-scripts\") pod \"placement-db-sync-h9cqc\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:54.976920 master-0 kubenswrapper[29612]: I0319 12:27:54.976309 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-648f9ff569-fh2n6"]
Mar 19 12:27:54.976920 master-0 kubenswrapper[29612]: I0319 12:27:54.976350 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mh8mj"]
Mar 19 12:27:54.994408 master-0 kubenswrapper[29612]: I0319 12:27:54.994360 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-swift-storage-0\") pod \"fc878c70-3d43-4704-a687-4747aeec1855\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") "
Mar 19 12:27:54.994694 master-0 kubenswrapper[29612]: I0319 12:27:54.994678 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqcjg\" (UniqueName: \"kubernetes.io/projected/fc878c70-3d43-4704-a687-4747aeec1855-kube-api-access-dqcjg\") pod \"fc878c70-3d43-4704-a687-4747aeec1855\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") "
Mar 19 12:27:54.994835 master-0 kubenswrapper[29612]: I0319 12:27:54.994823 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-sb\") pod \"fc878c70-3d43-4704-a687-4747aeec1855\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") "
Mar 19 12:27:54.995019 master-0 kubenswrapper[29612]: I0319 12:27:54.995007 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-svc\") pod \"fc878c70-3d43-4704-a687-4747aeec1855\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") "
Mar 19 12:27:54.995122 master-0 kubenswrapper[29612]: I0319 12:27:54.995110 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-config\") pod \"fc878c70-3d43-4704-a687-4747aeec1855\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") "
Mar 19 12:27:54.997350 master-0 kubenswrapper[29612]: I0319 12:27:54.997332 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-nb\") pod \"fc878c70-3d43-4704-a687-4747aeec1855\" (UID: \"fc878c70-3d43-4704-a687-4747aeec1855\") "
Mar 19 12:27:54.997889 master-0 kubenswrapper[29612]: I0319 12:27:54.997753 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zt4x\" (UniqueName: \"kubernetes.io/projected/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-kube-api-access-5zt4x\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:54.997984 master-0 kubenswrapper[29612]: I0319 12:27:54.997970 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-swift-storage-0\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:54.998077 master-0 kubenswrapper[29612]: I0319 12:27:54.998064 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-config\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:54.998441 master-0 kubenswrapper[29612]: I0319 12:27:54.998424 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:54.998925 master-0 kubenswrapper[29612]: I0319 12:27:54.998910 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:54.999099 master-0 kubenswrapper[29612]: I0319 12:27:54.999084 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-svc\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:54.999360 master-0 kubenswrapper[29612]: I0319 12:27:54.998936 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc878c70-3d43-4704-a687-4747aeec1855-kube-api-access-dqcjg" (OuterVolumeSpecName: "kube-api-access-dqcjg") pod "fc878c70-3d43-4704-a687-4747aeec1855" (UID: "fc878c70-3d43-4704-a687-4747aeec1855"). InnerVolumeSpecName "kube-api-access-dqcjg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:27:55.043051 master-0 kubenswrapper[29612]: I0319 12:27:55.042286 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mh8mj" event={"ID":"97212ea2-7247-45e9-89e9-0bcf6d1acdf1","Type":"ContainerStarted","Data":"15c79f485ec4d90fd2ad4e8170edab01899f2ac9c9fd3d2a49b29b9a1e1505ea"}
Mar 19 12:27:55.050737 master-0 kubenswrapper[29612]: I0319 12:27:55.050626 29612 generic.go:334] "Generic (PLEG): container finished" podID="fc878c70-3d43-4704-a687-4747aeec1855" containerID="27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069" exitCode=0
Mar 19 12:27:55.050737 master-0 kubenswrapper[29612]: I0319 12:27:55.050708 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" event={"ID":"fc878c70-3d43-4704-a687-4747aeec1855","Type":"ContainerDied","Data":"27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069"}
Mar 19 12:27:55.050737 master-0 kubenswrapper[29612]: I0319 12:27:55.050726 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849f55d89-wzvhq"
Mar 19 12:27:55.051100 master-0 kubenswrapper[29612]: I0319 12:27:55.050750 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" event={"ID":"fc878c70-3d43-4704-a687-4747aeec1855","Type":"ContainerDied","Data":"5433ac51b27cc59bba9d505c52df311dd7f455ab7af1ebfcc4b18f2df7597846"}
Mar 19 12:27:55.051100 master-0 kubenswrapper[29612]: I0319 12:27:55.050775 29612 scope.go:117] "RemoveContainer" containerID="27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069"
Mar 19 12:27:55.097910 master-0 kubenswrapper[29612]: I0319 12:27:55.096570 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h9cqc"
Mar 19 12:27:55.103911 master-0 kubenswrapper[29612]: I0319 12:27:55.101236 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:55.103911 master-0 kubenswrapper[29612]: I0319 12:27:55.101337 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-svc\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:55.103911 master-0 kubenswrapper[29612]: I0319 12:27:55.102570 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-swift-storage-0\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:55.103911 master-0 kubenswrapper[29612]: I0319 12:27:55.102600 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zt4x\" (UniqueName: \"kubernetes.io/projected/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-kube-api-access-5zt4x\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:55.103911 master-0 kubenswrapper[29612]: I0319 12:27:55.102660 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-config\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:55.103911 master-0 kubenswrapper[29612]: I0319 12:27:55.102894 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:55.103911 master-0 kubenswrapper[29612]: I0319 12:27:55.103020 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqcjg\" (UniqueName: \"kubernetes.io/projected/fc878c70-3d43-4704-a687-4747aeec1855-kube-api-access-dqcjg\") on node \"master-0\" DevicePath \"\""
Mar 19 12:27:55.107350 master-0 kubenswrapper[29612]: I0319 12:27:55.105280 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-nb\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:55.107350 master-0 kubenswrapper[29612]: I0319 12:27:55.105462 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-svc\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:55.107350 master-0 kubenswrapper[29612]: I0319 12:27:55.106067 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-swift-storage-0\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:27:55.107350
master-0 kubenswrapper[29612]: I0319 12:27:55.106580 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-sb\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" Mar 19 12:27:55.107350 master-0 kubenswrapper[29612]: I0319 12:27:55.106931 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-config\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" Mar 19 12:27:55.164046 master-0 kubenswrapper[29612]: I0319 12:27:55.157468 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc878c70-3d43-4704-a687-4747aeec1855" (UID: "fc878c70-3d43-4704-a687-4747aeec1855"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:55.194147 master-0 kubenswrapper[29612]: I0319 12:27:55.192470 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc878c70-3d43-4704-a687-4747aeec1855" (UID: "fc878c70-3d43-4704-a687-4747aeec1855"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:55.198172 master-0 kubenswrapper[29612]: I0319 12:27:55.198116 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zt4x\" (UniqueName: \"kubernetes.io/projected/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-kube-api-access-5zt4x\") pod \"dnsmasq-dns-648f9ff569-fh2n6\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" Mar 19 12:27:55.203360 master-0 kubenswrapper[29612]: I0319 12:27:55.202225 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc878c70-3d43-4704-a687-4747aeec1855" (UID: "fc878c70-3d43-4704-a687-4747aeec1855"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:55.207534 master-0 kubenswrapper[29612]: I0319 12:27:55.206594 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:55.207534 master-0 kubenswrapper[29612]: I0319 12:27:55.206642 29612 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:55.207534 master-0 kubenswrapper[29612]: I0319 12:27:55.206654 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:55.218374 master-0 kubenswrapper[29612]: I0319 12:27:55.217789 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-config" (OuterVolumeSpecName: "config") pod "fc878c70-3d43-4704-a687-4747aeec1855" (UID: "fc878c70-3d43-4704-a687-4747aeec1855"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:55.227475 master-0 kubenswrapper[29612]: I0319 12:27:55.227426 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc878c70-3d43-4704-a687-4747aeec1855" (UID: "fc878c70-3d43-4704-a687-4747aeec1855"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:27:55.267792 master-0 kubenswrapper[29612]: I0319 12:27:55.267320 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" Mar 19 12:27:55.318143 master-0 kubenswrapper[29612]: I0319 12:27:55.317998 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:55.319692 master-0 kubenswrapper[29612]: I0319 12:27:55.319676 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc878c70-3d43-4704-a687-4747aeec1855-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:27:55.547132 master-0 kubenswrapper[29612]: I0319 12:27:55.546812 29612 scope.go:117] "RemoveContainer" containerID="3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9" Mar 19 12:27:55.579248 master-0 kubenswrapper[29612]: I0319 12:27:55.579214 29612 scope.go:117] "RemoveContainer" containerID="27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069" Mar 19 12:27:55.580688 master-0 kubenswrapper[29612]: E0319 12:27:55.580605 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069\": container with ID starting with 27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069 not found: ID does not exist" containerID="27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069" Mar 19 12:27:55.580688 master-0 kubenswrapper[29612]: I0319 12:27:55.580665 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069"} err="failed to get container status \"27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069\": rpc error: code = NotFound desc = could not find container \"27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069\": container with ID starting with 27a0f0c84ad0837302f64c2222b406e9bbb1c5495f3e1044526c89c2438e4069 not found: ID does not exist" Mar 19 12:27:55.580874 master-0 kubenswrapper[29612]: I0319 12:27:55.580698 29612 scope.go:117] "RemoveContainer" containerID="3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9" Mar 19 12:27:55.581179 master-0 kubenswrapper[29612]: E0319 12:27:55.581158 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9\": container with ID starting with 3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9 not found: ID does not exist" containerID="3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9" Mar 19 12:27:55.581247 master-0 kubenswrapper[29612]: I0319 12:27:55.581180 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9"} err="failed to get container status \"3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9\": rpc error: code = NotFound desc = could not 
find container \"3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9\": container with ID starting with 3c755c24de453a4540e67a4ed188e1d956e8449145acac3a717b01e04ec359e9 not found: ID does not exist" Mar 19 12:27:56.064602 master-0 kubenswrapper[29612]: I0319 12:27:56.064538 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mh8mj" event={"ID":"97212ea2-7247-45e9-89e9-0bcf6d1acdf1","Type":"ContainerStarted","Data":"621df10343b235aa08e0998d8695d227dc1134913b8bac93661c221c6841cd76"} Mar 19 12:27:56.297240 master-0 kubenswrapper[29612]: I0319 12:27:56.295330 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3abc0-default-external-api-0"] Mar 19 12:27:56.298575 master-0 kubenswrapper[29612]: I0319 12:27:56.298527 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:56.303052 master-0 kubenswrapper[29612]: I0319 12:27:56.302285 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3abc0-default-external-config-data" Mar 19 12:27:56.306888 master-0 kubenswrapper[29612]: I0319 12:27:56.306494 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 12:27:56.306888 master-0 kubenswrapper[29612]: I0319 12:27:56.306514 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 19 12:27:58.302608 master-0 kubenswrapper[29612]: I0319 12:27:58.302547 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mh455"] Mar 19 12:27:58.331324 master-0 kubenswrapper[29612]: I0319 12:27:58.330372 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59fcdff467-6j2bl"] Mar 19 12:27:58.404730 master-0 kubenswrapper[29612]: I0319 12:27:58.393751 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-3abc0-default-external-api-0"] Mar 19 12:27:58.463805 master-0 kubenswrapper[29612]: I0319 12:27:58.445859 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-6mnv4"] Mar 19 12:27:58.482756 master-0 kubenswrapper[29612]: I0319 12:27:58.479346 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-db-sync-dx9n4"] Mar 19 12:27:58.551474 master-0 kubenswrapper[29612]: I0319 12:27:58.518523 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-scripts\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.551474 master-0 kubenswrapper[29612]: I0319 12:27:58.518711 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-public-tls-certs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.551474 master-0 kubenswrapper[29612]: I0319 12:27:58.518954 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-config-data\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.551474 master-0 kubenswrapper[29612]: I0319 12:27:58.519084 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b7z2\" (UniqueName: \"kubernetes.io/projected/a64bde84-ef1a-4c46-bd83-ea307dfd5430-kube-api-access-7b7z2\") pod 
\"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.551474 master-0 kubenswrapper[29612]: I0319 12:27:58.519138 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-httpd-run\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.551474 master-0 kubenswrapper[29612]: I0319 12:27:58.519267 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-logs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.551474 master-0 kubenswrapper[29612]: I0319 12:27:58.519345 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.551474 master-0 kubenswrapper[29612]: I0319 12:27:58.520515 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-combined-ca-bundle\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.576376 master-0 kubenswrapper[29612]: W0319 12:27:58.575158 29612 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a976515_4dc5_45d4_b047_92e92de29df6.slice/crio-272134e6b14534039e98f0ebc3d4621869c16a5e9d33dc7bcdb9de16c5f126c5 WatchSource:0}: Error finding container 272134e6b14534039e98f0ebc3d4621869c16a5e9d33dc7bcdb9de16c5f126c5: Status 404 returned error can't find the container with id 272134e6b14534039e98f0ebc3d4621869c16a5e9d33dc7bcdb9de16c5f126c5 Mar 19 12:27:58.578309 master-0 kubenswrapper[29612]: I0319 12:27:58.577945 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849f55d89-wzvhq"] Mar 19 12:27:58.584861 master-0 kubenswrapper[29612]: W0319 12:27:58.583379 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9264c8b_fb67_4594_b8d4_4f66d1c85d0a.slice/crio-9852272d8fe63e88d3ded453ef2199f16f18365d2b1cb29e3121468b1efbdb7a WatchSource:0}: Error finding container 9852272d8fe63e88d3ded453ef2199f16f18365d2b1cb29e3121468b1efbdb7a: Status 404 returned error can't find the container with id 9852272d8fe63e88d3ded453ef2199f16f18365d2b1cb29e3121468b1efbdb7a Mar 19 12:27:58.587866 master-0 kubenswrapper[29612]: I0319 12:27:58.587825 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-8ba2-account-create-update-24xjd"] Mar 19 12:27:58.601097 master-0 kubenswrapper[29612]: I0319 12:27:58.598099 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-648f9ff569-fh2n6"] Mar 19 12:27:58.607803 master-0 kubenswrapper[29612]: I0319 12:27:58.607747 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h9cqc"] Mar 19 12:27:58.622560 master-0 kubenswrapper[29612]: I0319 12:27:58.622513 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b7z2\" (UniqueName: \"kubernetes.io/projected/a64bde84-ef1a-4c46-bd83-ea307dfd5430-kube-api-access-7b7z2\") pod 
\"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.622704 master-0 kubenswrapper[29612]: I0319 12:27:58.622596 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-httpd-run\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.622704 master-0 kubenswrapper[29612]: I0319 12:27:58.622683 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-logs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.622802 master-0 kubenswrapper[29612]: I0319 12:27:58.622779 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-combined-ca-bundle\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.622857 master-0 kubenswrapper[29612]: I0319 12:27:58.622825 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-scripts\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.622975 master-0 kubenswrapper[29612]: I0319 12:27:58.622885 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-public-tls-certs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.623020 master-0 kubenswrapper[29612]: I0319 12:27:58.622976 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-config-data\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.623962 master-0 kubenswrapper[29612]: I0319 12:27:58.623927 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-logs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.625560 master-0 kubenswrapper[29612]: I0319 12:27:58.625514 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-httpd-run\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.630215 master-0 kubenswrapper[29612]: I0319 12:27:58.630184 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-config-data\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.632768 master-0 kubenswrapper[29612]: I0319 12:27:58.631665 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-combined-ca-bundle\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.632768 master-0 kubenswrapper[29612]: I0319 12:27:58.632343 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-public-tls-certs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.639911 master-0 kubenswrapper[29612]: I0319 12:27:58.639867 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-scripts\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.701434 master-0 kubenswrapper[29612]: I0319 12:27:58.701379 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-849f55d89-wzvhq"] Mar 19 12:27:58.921947 master-0 kubenswrapper[29612]: I0319 12:27:58.921886 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc878c70-3d43-4704-a687-4747aeec1855" path="/var/lib/kubelet/pods/fc878c70-3d43-4704-a687-4747aeec1855/volumes" Mar 19 12:27:58.929199 master-0 kubenswrapper[29612]: I0319 12:27:58.929129 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:58.931120 master-0 kubenswrapper[29612]: I0319 12:27:58.931078 29612 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 12:27:58.931185 master-0 kubenswrapper[29612]: I0319 12:27:58.931125 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/9a90be70eddb52d75ac120c1599a27499c5565c88a66c71224351799b28411da/globalmount\"" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:59.108333 master-0 kubenswrapper[29612]: I0319 12:27:59.108275 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h9cqc" event={"ID":"3a976515-4dc5-45d4-b047-92e92de29df6","Type":"ContainerStarted","Data":"272134e6b14534039e98f0ebc3d4621869c16a5e9d33dc7bcdb9de16c5f126c5"} Mar 19 12:27:59.109697 master-0 kubenswrapper[29612]: I0319 12:27:59.109663 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" event={"ID":"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a","Type":"ContainerStarted","Data":"9852272d8fe63e88d3ded453ef2199f16f18365d2b1cb29e3121468b1efbdb7a"} Mar 19 12:27:59.111828 master-0 kubenswrapper[29612]: I0319 12:27:59.111791 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" event={"ID":"3e6837c4-0119-40f6-83e9-f0ea203b3781","Type":"ContainerStarted","Data":"373dc90147542c20aef9378ef741eb424b0a7ad4938cf7524b61faac13d6d182"} Mar 19 12:27:59.111923 master-0 kubenswrapper[29612]: I0319 12:27:59.111823 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" 
event={"ID":"3e6837c4-0119-40f6-83e9-f0ea203b3781","Type":"ContainerStarted","Data":"1ee64c9948c6ea514f5f7f29f560d486e3c1022dc074823ffc86df8100221a78"} Mar 19 12:27:59.113726 master-0 kubenswrapper[29612]: I0319 12:27:59.113699 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-6mnv4" event={"ID":"e893922b-55bb-4c15-84ed-4a005cffe0d2","Type":"ContainerStarted","Data":"53a8fa5477f3887d7fac256a3a45376952c72677c063ab4bc49ccea022f5cb99"} Mar 19 12:27:59.113813 master-0 kubenswrapper[29612]: I0319 12:27:59.113729 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-6mnv4" event={"ID":"e893922b-55bb-4c15-84ed-4a005cffe0d2","Type":"ContainerStarted","Data":"af0be09d7cccb7717fc5c65dc6ec5f5c8a358d84e3805e9be6b8458be941fc5c"} Mar 19 12:27:59.115826 master-0 kubenswrapper[29612]: I0319 12:27:59.115731 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-8ba2-account-create-update-24xjd" event={"ID":"27627ab5-718d-4b5c-ba01-9e7c7da19223","Type":"ContainerStarted","Data":"ad85012193de3e4c5962806b908621481620c36f819c9ddba6221ca17ea13d7e"} Mar 19 12:27:59.116012 master-0 kubenswrapper[29612]: I0319 12:27:59.115993 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-8ba2-account-create-update-24xjd" event={"ID":"27627ab5-718d-4b5c-ba01-9e7c7da19223","Type":"ContainerStarted","Data":"14ee23c273bf6f457e52d497e80aff997500c29f516429aa4e26cc92c4be0235"} Mar 19 12:27:59.117793 master-0 kubenswrapper[29612]: I0319 12:27:59.117737 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mh455" event={"ID":"1d744992-9162-4633-9614-ed0c4ac81988","Type":"ContainerStarted","Data":"79cff36b6685fae615be3f2775f02a6c1d208b32ddea60b555a2d59e2b70797b"} Mar 19 12:27:59.117793 master-0 kubenswrapper[29612]: I0319 12:27:59.117768 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mh455" 
event={"ID":"1d744992-9162-4633-9614-ed0c4ac81988","Type":"ContainerStarted","Data":"74d4863262f99703a760e4d820baf31985f4ecfe76f447b2b6cf25047c4a24d1"} Mar 19 12:27:59.121628 master-0 kubenswrapper[29612]: I0319 12:27:59.121581 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-db-sync-dx9n4" event={"ID":"9f8cc63c-1080-41d8-b789-bce87f57e603","Type":"ContainerStarted","Data":"2076c8a27cd33b2af478f1703569fb4e289f59855949358bc7ce5083ac7c457f"} Mar 19 12:27:59.135140 master-0 kubenswrapper[29612]: I0319 12:27:59.135103 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b7z2\" (UniqueName: \"kubernetes.io/projected/a64bde84-ef1a-4c46-bd83-ea307dfd5430-kube-api-access-7b7z2\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:27:59.423524 master-0 kubenswrapper[29612]: I0319 12:27:59.423433 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mh8mj" podStartSLOduration=6.423416796 podStartE2EDuration="6.423416796s" podCreationTimestamp="2026-03-19 12:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:27:59.421881103 +0000 UTC m=+1086.736050124" watchObservedRunningTime="2026-03-19 12:27:59.423416796 +0000 UTC m=+1086.737585817" Mar 19 12:27:59.625356 master-0 kubenswrapper[29612]: I0319 12:27:59.625010 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-849f55d89-wzvhq" podUID="fc878c70-3d43-4704-a687-4747aeec1855" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.203:5353: i/o timeout" Mar 19 12:28:00.141114 master-0 kubenswrapper[29612]: I0319 12:28:00.140981 29612 generic.go:334] "Generic (PLEG): container finished" podID="e893922b-55bb-4c15-84ed-4a005cffe0d2" 
containerID="53a8fa5477f3887d7fac256a3a45376952c72677c063ab4bc49ccea022f5cb99" exitCode=0
Mar 19 12:28:00.141334 master-0 kubenswrapper[29612]: I0319 12:28:00.141154 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-6mnv4" event={"ID":"e893922b-55bb-4c15-84ed-4a005cffe0d2","Type":"ContainerDied","Data":"53a8fa5477f3887d7fac256a3a45376952c72677c063ab4bc49ccea022f5cb99"}
Mar 19 12:28:00.155584 master-0 kubenswrapper[29612]: I0319 12:28:00.145901 29612 generic.go:334] "Generic (PLEG): container finished" podID="27627ab5-718d-4b5c-ba01-9e7c7da19223" containerID="ad85012193de3e4c5962806b908621481620c36f819c9ddba6221ca17ea13d7e" exitCode=0
Mar 19 12:28:00.155584 master-0 kubenswrapper[29612]: I0319 12:28:00.146253 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-8ba2-account-create-update-24xjd" event={"ID":"27627ab5-718d-4b5c-ba01-9e7c7da19223","Type":"ContainerDied","Data":"ad85012193de3e4c5962806b908621481620c36f819c9ddba6221ca17ea13d7e"}
Mar 19 12:28:00.155584 master-0 kubenswrapper[29612]: I0319 12:28:00.148762 29612 generic.go:334] "Generic (PLEG): container finished" podID="d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" containerID="8e3834bd61a4bf9c10c929f982dcf0025334f699de6ca2c52ffe419a33fe9a02" exitCode=0
Mar 19 12:28:00.155584 master-0 kubenswrapper[29612]: I0319 12:28:00.148874 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" event={"ID":"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a","Type":"ContainerDied","Data":"8e3834bd61a4bf9c10c929f982dcf0025334f699de6ca2c52ffe419a33fe9a02"}
Mar 19 12:28:00.155584 master-0 kubenswrapper[29612]: I0319 12:28:00.152907 29612 generic.go:334] "Generic (PLEG): container finished" podID="3e6837c4-0119-40f6-83e9-f0ea203b3781" containerID="373dc90147542c20aef9378ef741eb424b0a7ad4938cf7524b61faac13d6d182" exitCode=0
Mar 19 12:28:00.155584 master-0 kubenswrapper[29612]: I0319 12:28:00.154248 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" event={"ID":"3e6837c4-0119-40f6-83e9-f0ea203b3781","Type":"ContainerDied","Data":"373dc90147542c20aef9378ef741eb424b0a7ad4938cf7524b61faac13d6d182"}
Mar 19 12:28:00.809978 master-0 kubenswrapper[29612]: I0319 12:28:00.802298 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:00.844910 master-0 kubenswrapper[29612]: I0319 12:28:00.843887 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:28:00.871148 master-0 kubenswrapper[29612]: I0319 12:28:00.871079 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-config\") pod \"3e6837c4-0119-40f6-83e9-f0ea203b3781\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") "
Mar 19 12:28:00.871380 master-0 kubenswrapper[29612]: I0319 12:28:00.871348 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-swift-storage-0\") pod \"3e6837c4-0119-40f6-83e9-f0ea203b3781\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") "
Mar 19 12:28:00.871443 master-0 kubenswrapper[29612]: I0319 12:28:00.871419 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-sb\") pod \"3e6837c4-0119-40f6-83e9-f0ea203b3781\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") "
Mar 19 12:28:00.871493 master-0 kubenswrapper[29612]: I0319 12:28:00.871479 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-svc\") pod \"3e6837c4-0119-40f6-83e9-f0ea203b3781\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") "
Mar 19 12:28:00.871664 master-0 kubenswrapper[29612]: I0319 12:28:00.871619 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2p2r\" (UniqueName: \"kubernetes.io/projected/3e6837c4-0119-40f6-83e9-f0ea203b3781-kube-api-access-l2p2r\") pod \"3e6837c4-0119-40f6-83e9-f0ea203b3781\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") "
Mar 19 12:28:00.871732 master-0 kubenswrapper[29612]: I0319 12:28:00.871681 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-nb\") pod \"3e6837c4-0119-40f6-83e9-f0ea203b3781\" (UID: \"3e6837c4-0119-40f6-83e9-f0ea203b3781\") "
Mar 19 12:28:00.899566 master-0 kubenswrapper[29612]: I0319 12:28:00.899466 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e6837c4-0119-40f6-83e9-f0ea203b3781-kube-api-access-l2p2r" (OuterVolumeSpecName: "kube-api-access-l2p2r") pod "3e6837c4-0119-40f6-83e9-f0ea203b3781" (UID: "3e6837c4-0119-40f6-83e9-f0ea203b3781"). InnerVolumeSpecName "kube-api-access-l2p2r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:28:00.901291 master-0 kubenswrapper[29612]: I0319 12:28:00.900294 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3e6837c4-0119-40f6-83e9-f0ea203b3781" (UID: "3e6837c4-0119-40f6-83e9-f0ea203b3781"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:28:00.901291 master-0 kubenswrapper[29612]: I0319 12:28:00.900915 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3e6837c4-0119-40f6-83e9-f0ea203b3781" (UID: "3e6837c4-0119-40f6-83e9-f0ea203b3781"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:28:00.902089 master-0 kubenswrapper[29612]: I0319 12:28:00.902031 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3e6837c4-0119-40f6-83e9-f0ea203b3781" (UID: "3e6837c4-0119-40f6-83e9-f0ea203b3781"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:28:00.904432 master-0 kubenswrapper[29612]: I0319 12:28:00.904380 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-config" (OuterVolumeSpecName: "config") pod "3e6837c4-0119-40f6-83e9-f0ea203b3781" (UID: "3e6837c4-0119-40f6-83e9-f0ea203b3781"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:28:00.905599 master-0 kubenswrapper[29612]: I0319 12:28:00.905558 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e6837c4-0119-40f6-83e9-f0ea203b3781" (UID: "3e6837c4-0119-40f6-83e9-f0ea203b3781"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:28:00.975616 master-0 kubenswrapper[29612]: I0319 12:28:00.975205 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2p2r\" (UniqueName: \"kubernetes.io/projected/3e6837c4-0119-40f6-83e9-f0ea203b3781-kube-api-access-l2p2r\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:00.975616 master-0 kubenswrapper[29612]: I0319 12:28:00.975260 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:00.975616 master-0 kubenswrapper[29612]: I0319 12:28:00.975276 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:00.975616 master-0 kubenswrapper[29612]: I0319 12:28:00.975292 29612 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:00.975616 master-0 kubenswrapper[29612]: I0319 12:28:00.975305 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:00.975616 master-0 kubenswrapper[29612]: I0319 12:28:00.975320 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e6837c4-0119-40f6-83e9-f0ea203b3781-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:00.978268 master-0 kubenswrapper[29612]: I0319 12:28:00.977818 29612 trace.go:236] Trace[372453201]: "Calculate volume metrics of swift for pod openstack/swift-storage-0" (19-Mar-2026 12:27:59.615) (total time: 1362ms):
Mar 19 12:28:00.978268 master-0 kubenswrapper[29612]: Trace[372453201]: [1.362170948s] [1.362170948s] END
Mar 19 12:28:01.167826 master-0 kubenswrapper[29612]: I0319 12:28:01.167774 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fcdff467-6j2bl"
Mar 19 12:28:01.169319 master-0 kubenswrapper[29612]: I0319 12:28:01.169287 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fcdff467-6j2bl" event={"ID":"3e6837c4-0119-40f6-83e9-f0ea203b3781","Type":"ContainerDied","Data":"1ee64c9948c6ea514f5f7f29f560d486e3c1022dc074823ffc86df8100221a78"}
Mar 19 12:28:01.169552 master-0 kubenswrapper[29612]: I0319 12:28:01.169337 29612 scope.go:117] "RemoveContainer" containerID="373dc90147542c20aef9378ef741eb424b0a7ad4938cf7524b61faac13d6d182"
Mar 19 12:28:01.817762 master-0 kubenswrapper[29612]: I0319 12:28:01.817697 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-8ba2-account-create-update-24xjd"
Mar 19 12:28:01.825890 master-0 kubenswrapper[29612]: I0319 12:28:01.825830 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-6mnv4"
Mar 19 12:28:02.185094 master-0 kubenswrapper[29612]: I0319 12:28:02.185033 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-6mnv4"
Mar 19 12:28:02.185336 master-0 kubenswrapper[29612]: I0319 12:28:02.185035 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-6mnv4" event={"ID":"e893922b-55bb-4c15-84ed-4a005cffe0d2","Type":"ContainerDied","Data":"af0be09d7cccb7717fc5c65dc6ec5f5c8a358d84e3805e9be6b8458be941fc5c"}
Mar 19 12:28:02.185336 master-0 kubenswrapper[29612]: I0319 12:28:02.185231 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af0be09d7cccb7717fc5c65dc6ec5f5c8a358d84e3805e9be6b8458be941fc5c"
Mar 19 12:28:02.187151 master-0 kubenswrapper[29612]: I0319 12:28:02.187126 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-8ba2-account-create-update-24xjd"
Mar 19 12:28:02.187238 master-0 kubenswrapper[29612]: I0319 12:28:02.187167 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-8ba2-account-create-update-24xjd" event={"ID":"27627ab5-718d-4b5c-ba01-9e7c7da19223","Type":"ContainerDied","Data":"14ee23c273bf6f457e52d497e80aff997500c29f516429aa4e26cc92c4be0235"}
Mar 19 12:28:02.187238 master-0 kubenswrapper[29612]: I0319 12:28:02.187221 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14ee23c273bf6f457e52d497e80aff997500c29f516429aa4e26cc92c4be0235"
Mar 19 12:28:02.189256 master-0 kubenswrapper[29612]: I0319 12:28:02.189210 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" event={"ID":"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a","Type":"ContainerStarted","Data":"4740809d17c3266968f5aeea0d8e57ca84ea0b54a6e25301d1bab684a7c607e5"}
Mar 19 12:28:03.008489 master-0 kubenswrapper[29612]: E0319 12:28:03.008412 29612 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97212ea2_7247_45e9_89e9_0bcf6d1acdf1.slice/crio-conmon-621df10343b235aa08e0998d8695d227dc1134913b8bac93661c221c6841cd76.scope\": RecentStats: unable to find data in memory cache]"
Mar 19 12:28:03.206611 master-0 kubenswrapper[29612]: I0319 12:28:03.206562 29612 generic.go:334] "Generic (PLEG): container finished" podID="97212ea2-7247-45e9-89e9-0bcf6d1acdf1" containerID="621df10343b235aa08e0998d8695d227dc1134913b8bac93661c221c6841cd76" exitCode=0
Mar 19 12:28:03.207650 master-0 kubenswrapper[29612]: I0319 12:28:03.206887 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mh8mj" event={"ID":"97212ea2-7247-45e9-89e9-0bcf6d1acdf1","Type":"ContainerDied","Data":"621df10343b235aa08e0998d8695d227dc1134913b8bac93661c221c6841cd76"}
Mar 19 12:28:03.209128 master-0 kubenswrapper[29612]: I0319 12:28:03.208080 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6"
Mar 19 12:28:03.225626 master-0 kubenswrapper[29612]: I0319 12:28:03.225135 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:03.685668 master-0 kubenswrapper[29612]: I0319 12:28:03.684823 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e893922b-55bb-4c15-84ed-4a005cffe0d2-operator-scripts\") pod \"e893922b-55bb-4c15-84ed-4a005cffe0d2\" (UID: \"e893922b-55bb-4c15-84ed-4a005cffe0d2\") "
Mar 19 12:28:03.685668 master-0 kubenswrapper[29612]: I0319 12:28:03.684955 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27627ab5-718d-4b5c-ba01-9e7c7da19223-operator-scripts\") pod \"27627ab5-718d-4b5c-ba01-9e7c7da19223\" (UID: \"27627ab5-718d-4b5c-ba01-9e7c7da19223\") "
Mar 19 12:28:03.685668 master-0 kubenswrapper[29612]: I0319 12:28:03.685045 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkxb8\" (UniqueName: \"kubernetes.io/projected/27627ab5-718d-4b5c-ba01-9e7c7da19223-kube-api-access-gkxb8\") pod \"27627ab5-718d-4b5c-ba01-9e7c7da19223\" (UID: \"27627ab5-718d-4b5c-ba01-9e7c7da19223\") "
Mar 19 12:28:03.685668 master-0 kubenswrapper[29612]: I0319 12:28:03.685075 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdql5\" (UniqueName: \"kubernetes.io/projected/e893922b-55bb-4c15-84ed-4a005cffe0d2-kube-api-access-mdql5\") pod \"e893922b-55bb-4c15-84ed-4a005cffe0d2\" (UID: \"e893922b-55bb-4c15-84ed-4a005cffe0d2\") "
Mar 19 12:28:03.686199 master-0 kubenswrapper[29612]: I0319 12:28:03.685708 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e893922b-55bb-4c15-84ed-4a005cffe0d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e893922b-55bb-4c15-84ed-4a005cffe0d2" (UID: "e893922b-55bb-4c15-84ed-4a005cffe0d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:28:03.686544 master-0 kubenswrapper[29612]: I0319 12:28:03.686462 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27627ab5-718d-4b5c-ba01-9e7c7da19223-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27627ab5-718d-4b5c-ba01-9e7c7da19223" (UID: "27627ab5-718d-4b5c-ba01-9e7c7da19223"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:28:03.688124 master-0 kubenswrapper[29612]: I0319 12:28:03.688040 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e893922b-55bb-4c15-84ed-4a005cffe0d2-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:03.691433 master-0 kubenswrapper[29612]: I0319 12:28:03.691375 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27627ab5-718d-4b5c-ba01-9e7c7da19223-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:03.691797 master-0 kubenswrapper[29612]: I0319 12:28:03.691760 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e893922b-55bb-4c15-84ed-4a005cffe0d2-kube-api-access-mdql5" (OuterVolumeSpecName: "kube-api-access-mdql5") pod "e893922b-55bb-4c15-84ed-4a005cffe0d2" (UID: "e893922b-55bb-4c15-84ed-4a005cffe0d2"). InnerVolumeSpecName "kube-api-access-mdql5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:28:03.692144 master-0 kubenswrapper[29612]: I0319 12:28:03.692096 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27627ab5-718d-4b5c-ba01-9e7c7da19223-kube-api-access-gkxb8" (OuterVolumeSpecName: "kube-api-access-gkxb8") pod "27627ab5-718d-4b5c-ba01-9e7c7da19223" (UID: "27627ab5-718d-4b5c-ba01-9e7c7da19223"). InnerVolumeSpecName "kube-api-access-gkxb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:28:03.793768 master-0 kubenswrapper[29612]: I0319 12:28:03.793688 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkxb8\" (UniqueName: \"kubernetes.io/projected/27627ab5-718d-4b5c-ba01-9e7c7da19223-kube-api-access-gkxb8\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:03.793768 master-0 kubenswrapper[29612]: I0319 12:28:03.793751 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdql5\" (UniqueName: \"kubernetes.io/projected/e893922b-55bb-4c15-84ed-4a005cffe0d2-kube-api-access-mdql5\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:03.933982 master-0 kubenswrapper[29612]: I0319 12:28:03.933873 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-mh455" podStartSLOduration=10.933846706 podStartE2EDuration="10.933846706s" podCreationTimestamp="2026-03-19 12:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:03.913291432 +0000 UTC m=+1091.227460483" watchObservedRunningTime="2026-03-19 12:28:03.933846706 +0000 UTC m=+1091.248015757"
Mar 19 12:28:04.287046 master-0 kubenswrapper[29612]: I0319 12:28:04.285143 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"]
Mar 19 12:28:04.287046 master-0 kubenswrapper[29612]: E0319 12:28:04.285873 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e893922b-55bb-4c15-84ed-4a005cffe0d2" containerName="mariadb-database-create"
Mar 19 12:28:04.287046 master-0 kubenswrapper[29612]: I0319 12:28:04.285893 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="e893922b-55bb-4c15-84ed-4a005cffe0d2" containerName="mariadb-database-create"
Mar 19 12:28:04.287046 master-0 kubenswrapper[29612]: E0319 12:28:04.285958 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27627ab5-718d-4b5c-ba01-9e7c7da19223" containerName="mariadb-account-create-update"
Mar 19 12:28:04.287046 master-0 kubenswrapper[29612]: I0319 12:28:04.285969 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="27627ab5-718d-4b5c-ba01-9e7c7da19223" containerName="mariadb-account-create-update"
Mar 19 12:28:04.287046 master-0 kubenswrapper[29612]: E0319 12:28:04.286020 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6837c4-0119-40f6-83e9-f0ea203b3781" containerName="init"
Mar 19 12:28:04.287046 master-0 kubenswrapper[29612]: I0319 12:28:04.286031 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6837c4-0119-40f6-83e9-f0ea203b3781" containerName="init"
Mar 19 12:28:04.287046 master-0 kubenswrapper[29612]: I0319 12:28:04.286345 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6837c4-0119-40f6-83e9-f0ea203b3781" containerName="init"
Mar 19 12:28:04.287046 master-0 kubenswrapper[29612]: I0319 12:28:04.286374 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="27627ab5-718d-4b5c-ba01-9e7c7da19223" containerName="mariadb-account-create-update"
Mar 19 12:28:04.287046 master-0 kubenswrapper[29612]: I0319 12:28:04.286440 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="e893922b-55bb-4c15-84ed-4a005cffe0d2" containerName="mariadb-database-create"
Mar 19 12:28:04.287807 master-0 kubenswrapper[29612]: I0319 12:28:04.287747 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:04.290336 master-0 kubenswrapper[29612]: I0319 12:28:04.290305 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3abc0-default-internal-config-data"
Mar 19 12:28:04.290909 master-0 kubenswrapper[29612]: I0319 12:28:04.290893 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 19 12:28:04.433575 master-0 kubenswrapper[29612]: I0319 12:28:04.433496 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"]
Mar 19 12:28:04.682938 master-0 kubenswrapper[29612]: I0319 12:28:04.681507 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59fcdff467-6j2bl"]
Mar 19 12:28:04.751734 master-0 kubenswrapper[29612]: I0319 12:28:04.751669 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59fcdff467-6j2bl"]
Mar 19 12:28:04.918554 master-0 kubenswrapper[29612]: I0319 12:28:04.918487 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e6837c4-0119-40f6-83e9-f0ea203b3781" path="/var/lib/kubelet/pods/3e6837c4-0119-40f6-83e9-f0ea203b3781/volumes"
Mar 19 12:28:04.966715 master-0 kubenswrapper[29612]: I0319 12:28:04.966536 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhfjn\" (UniqueName: \"kubernetes.io/projected/44851259-b61b-4e6a-a228-adc8c9a78f4e-kube-api-access-vhfjn\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:04.966715 master-0 kubenswrapper[29612]: I0319 12:28:04.966700 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-combined-ca-bundle\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:04.966959 master-0 kubenswrapper[29612]: I0319 12:28:04.966892 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-httpd-run\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:04.966995 master-0 kubenswrapper[29612]: I0319 12:28:04.966964 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-internal-tls-certs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:04.967043 master-0 kubenswrapper[29612]: I0319 12:28:04.967009 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-scripts\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:04.967043 master-0 kubenswrapper[29612]: I0319 12:28:04.967036 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-config-data\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:04.967392 master-0 kubenswrapper[29612]: I0319 12:28:04.967192 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:04.967392 master-0 kubenswrapper[29612]: I0319 12:28:04.967258 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-logs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.069928 master-0 kubenswrapper[29612]: I0319 12:28:05.069491 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-httpd-run\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.069928 master-0 kubenswrapper[29612]: I0319 12:28:05.069563 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-internal-tls-certs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.069928 master-0 kubenswrapper[29612]: I0319 12:28:05.069588 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-scripts\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.069928 master-0 kubenswrapper[29612]: I0319 12:28:05.069605 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-config-data\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.069928 master-0 kubenswrapper[29612]: I0319 12:28:05.069782 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-logs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.069928 master-0 kubenswrapper[29612]: I0319 12:28:05.069831 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhfjn\" (UniqueName: \"kubernetes.io/projected/44851259-b61b-4e6a-a228-adc8c9a78f4e-kube-api-access-vhfjn\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.069928 master-0 kubenswrapper[29612]: I0319 12:28:05.069890 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-combined-ca-bundle\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.078764 master-0 kubenswrapper[29612]: I0319 12:28:05.078713 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-combined-ca-bundle\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.079097 master-0 kubenswrapper[29612]: I0319 12:28:05.079070 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-logs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.080153 master-0 kubenswrapper[29612]: I0319 12:28:05.080124 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-config-data\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.080646 master-0 kubenswrapper[29612]: I0319 12:28:05.080601 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-httpd-run\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.090610 master-0 kubenswrapper[29612]: I0319 12:28:05.090560 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-internal-tls-certs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.093445 master-0 kubenswrapper[29612]: I0319 12:28:05.093394 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-scripts\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.172061 master-0 kubenswrapper[29612]: I0319 12:28:05.171982 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.174011 master-0 kubenswrapper[29612]: I0319 12:28:05.173970 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 12:28:05.174124 master-0 kubenswrapper[29612]: I0319 12:28:05.174049 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/d60c87c719cf4283aa56003c1e019d419c362ddaab3f9dd90c8b6d2b36b7086b/globalmount\"" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.276545 master-0 kubenswrapper[29612]: I0319 12:28:05.276484 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhfjn\" (UniqueName: \"kubernetes.io/projected/44851259-b61b-4e6a-a228-adc8c9a78f4e-kube-api-access-vhfjn\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:05.277064 master-0 kubenswrapper[29612]: I0319 12:28:05.277038 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"]
Mar 19 12:28:06.291204 master-0 kubenswrapper[29612]: W0319 12:28:06.291154 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64bde84_ef1a_4c46_bd83_ea307dfd5430.slice/crio-70e209647f7551d6272a3ac30e0859347eebcc99fa942f6aad342ec1b6efefa2 WatchSource:0}: Error finding container 70e209647f7551d6272a3ac30e0859347eebcc99fa942f6aad342ec1b6efefa2: Status 404 returned error can't find the container with id 70e209647f7551d6272a3ac30e0859347eebcc99fa942f6aad342ec1b6efefa2
Mar 19 12:28:06.352452 master-0 kubenswrapper[29612]: I0319 12:28:06.351680 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"]
Mar 19 12:28:06.352452 master-0 kubenswrapper[29612]: E0319 12:28:06.352404 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-3abc0-default-internal-api-0" podUID="44851259-b61b-4e6a-a228-adc8c9a78f4e"
Mar 19 12:28:06.385189 master-0 kubenswrapper[29612]: I0319 12:28:06.385124 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mh8mj"
Mar 19 12:28:06.499225 master-0 kubenswrapper[29612]: I0319 12:28:06.499172 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-combined-ca-bundle\") pod \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") "
Mar 19 12:28:06.499225 master-0 kubenswrapper[29612]: I0319 12:28:06.499243 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-config-data\") pod \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") "
Mar 19 12:28:06.499225 master-0 kubenswrapper[29612]: I0319 12:28:06.499267 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-credential-keys\") pod \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") "
Mar 19 12:28:06.499531 master-0 kubenswrapper[29612]: I0319 12:28:06.499311 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmzh8\" (UniqueName: \"kubernetes.io/projected/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-kube-api-access-fmzh8\") pod \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") "
Mar 19 12:28:06.499531 master-0 kubenswrapper[29612]: I0319 12:28:06.499386 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-scripts\") pod \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") "
Mar 19 12:28:06.499531 master-0 kubenswrapper[29612]: I0319 12:28:06.499484 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-fernet-keys\") pod \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\" (UID: \"97212ea2-7247-45e9-89e9-0bcf6d1acdf1\") "
Mar 19 12:28:06.502847 master-0 kubenswrapper[29612]: I0319 12:28:06.502802 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "97212ea2-7247-45e9-89e9-0bcf6d1acdf1" (UID: "97212ea2-7247-45e9-89e9-0bcf6d1acdf1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:06.503469 master-0 kubenswrapper[29612]: I0319 12:28:06.503418 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "97212ea2-7247-45e9-89e9-0bcf6d1acdf1" (UID: "97212ea2-7247-45e9-89e9-0bcf6d1acdf1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:06.504313 master-0 kubenswrapper[29612]: I0319 12:28:06.503954 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-scripts" (OuterVolumeSpecName: "scripts") pod "97212ea2-7247-45e9-89e9-0bcf6d1acdf1" (UID: "97212ea2-7247-45e9-89e9-0bcf6d1acdf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:06.505603 master-0 kubenswrapper[29612]: I0319 12:28:06.505526 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-kube-api-access-fmzh8" (OuterVolumeSpecName: "kube-api-access-fmzh8") pod "97212ea2-7247-45e9-89e9-0bcf6d1acdf1" (UID: "97212ea2-7247-45e9-89e9-0bcf6d1acdf1").
InnerVolumeSpecName "kube-api-access-fmzh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:28:06.530607 master-0 kubenswrapper[29612]: I0319 12:28:06.530547 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97212ea2-7247-45e9-89e9-0bcf6d1acdf1" (UID: "97212ea2-7247-45e9-89e9-0bcf6d1acdf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:06.533231 master-0 kubenswrapper[29612]: I0319 12:28:06.533161 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-config-data" (OuterVolumeSpecName: "config-data") pod "97212ea2-7247-45e9-89e9-0bcf6d1acdf1" (UID: "97212ea2-7247-45e9-89e9-0bcf6d1acdf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:06.602267 master-0 kubenswrapper[29612]: I0319 12:28:06.601956 29612 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:06.602267 master-0 kubenswrapper[29612]: I0319 12:28:06.601987 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:06.602267 master-0 kubenswrapper[29612]: I0319 12:28:06.601996 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:06.602267 master-0 kubenswrapper[29612]: I0319 12:28:06.602004 29612 reconciler_common.go:293] "Volume detached for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-credential-keys\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:06.602267 master-0 kubenswrapper[29612]: I0319 12:28:06.602013 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmzh8\" (UniqueName: \"kubernetes.io/projected/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-kube-api-access-fmzh8\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:06.602267 master-0 kubenswrapper[29612]: I0319 12:28:06.602025 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97212ea2-7247-45e9-89e9-0bcf6d1acdf1-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:06.879848 master-0 kubenswrapper[29612]: I0319 12:28:06.879679 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" podStartSLOduration=12.879659347 podStartE2EDuration="12.879659347s" podCreationTimestamp="2026-03-19 12:27:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:06.859038511 +0000 UTC m=+1094.173207542" watchObservedRunningTime="2026-03-19 12:28:06.879659347 +0000 UTC m=+1094.193828368" Mar 19 12:28:07.086669 master-0 kubenswrapper[29612]: I0319 12:28:07.086587 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:07.268898 master-0 kubenswrapper[29612]: I0319 12:28:07.268815 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mh8mj" 
event={"ID":"97212ea2-7247-45e9-89e9-0bcf6d1acdf1","Type":"ContainerDied","Data":"15c79f485ec4d90fd2ad4e8170edab01899f2ac9c9fd3d2a49b29b9a1e1505ea"} Mar 19 12:28:07.268898 master-0 kubenswrapper[29612]: I0319 12:28:07.268875 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15c79f485ec4d90fd2ad4e8170edab01899f2ac9c9fd3d2a49b29b9a1e1505ea" Mar 19 12:28:07.269211 master-0 kubenswrapper[29612]: I0319 12:28:07.268913 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mh8mj" Mar 19 12:28:07.272048 master-0 kubenswrapper[29612]: I0319 12:28:07.271667 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:07.272589 master-0 kubenswrapper[29612]: I0319 12:28:07.272558 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"a64bde84-ef1a-4c46-bd83-ea307dfd5430","Type":"ContainerStarted","Data":"f2ba96dbc94db110e57d4391401bed6845ddb34564b25c486a6feb5916b44459"} Mar 19 12:28:07.272707 master-0 kubenswrapper[29612]: I0319 12:28:07.272685 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"a64bde84-ef1a-4c46-bd83-ea307dfd5430","Type":"ContainerStarted","Data":"70e209647f7551d6272a3ac30e0859347eebcc99fa942f6aad342ec1b6efefa2"} Mar 19 12:28:07.285265 master-0 kubenswrapper[29612]: I0319 12:28:07.285222 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:07.434293 master-0 kubenswrapper[29612]: I0319 12:28:07.432987 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-logs\") pod \"44851259-b61b-4e6a-a228-adc8c9a78f4e\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " Mar 19 12:28:07.434293 master-0 kubenswrapper[29612]: I0319 12:28:07.433132 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhfjn\" (UniqueName: \"kubernetes.io/projected/44851259-b61b-4e6a-a228-adc8c9a78f4e-kube-api-access-vhfjn\") pod \"44851259-b61b-4e6a-a228-adc8c9a78f4e\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " Mar 19 12:28:07.434293 master-0 kubenswrapper[29612]: I0319 12:28:07.433216 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-internal-tls-certs\") pod \"44851259-b61b-4e6a-a228-adc8c9a78f4e\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " Mar 19 12:28:07.434293 master-0 kubenswrapper[29612]: I0319 12:28:07.433257 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-combined-ca-bundle\") pod \"44851259-b61b-4e6a-a228-adc8c9a78f4e\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " Mar 19 12:28:07.434293 master-0 kubenswrapper[29612]: I0319 12:28:07.433309 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-scripts\") pod \"44851259-b61b-4e6a-a228-adc8c9a78f4e\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " Mar 19 12:28:07.434293 master-0 kubenswrapper[29612]: I0319 12:28:07.433348 29612 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-httpd-run\") pod \"44851259-b61b-4e6a-a228-adc8c9a78f4e\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " Mar 19 12:28:07.434293 master-0 kubenswrapper[29612]: I0319 12:28:07.433380 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-config-data\") pod \"44851259-b61b-4e6a-a228-adc8c9a78f4e\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " Mar 19 12:28:07.434293 master-0 kubenswrapper[29612]: I0319 12:28:07.433539 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"44851259-b61b-4e6a-a228-adc8c9a78f4e\" (UID: \"44851259-b61b-4e6a-a228-adc8c9a78f4e\") " Mar 19 12:28:07.438213 master-0 kubenswrapper[29612]: I0319 12:28:07.437592 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "44851259-b61b-4e6a-a228-adc8c9a78f4e" (UID: "44851259-b61b-4e6a-a228-adc8c9a78f4e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:28:07.444837 master-0 kubenswrapper[29612]: I0319 12:28:07.444750 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-logs" (OuterVolumeSpecName: "logs") pod "44851259-b61b-4e6a-a228-adc8c9a78f4e" (UID: "44851259-b61b-4e6a-a228-adc8c9a78f4e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:28:07.449287 master-0 kubenswrapper[29612]: I0319 12:28:07.449177 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-scripts" (OuterVolumeSpecName: "scripts") pod "44851259-b61b-4e6a-a228-adc8c9a78f4e" (UID: "44851259-b61b-4e6a-a228-adc8c9a78f4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:07.457357 master-0 kubenswrapper[29612]: I0319 12:28:07.457266 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44851259-b61b-4e6a-a228-adc8c9a78f4e" (UID: "44851259-b61b-4e6a-a228-adc8c9a78f4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:07.457547 master-0 kubenswrapper[29612]: I0319 12:28:07.457398 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44851259-b61b-4e6a-a228-adc8c9a78f4e" (UID: "44851259-b61b-4e6a-a228-adc8c9a78f4e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:07.459981 master-0 kubenswrapper[29612]: I0319 12:28:07.457583 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-config-data" (OuterVolumeSpecName: "config-data") pod "44851259-b61b-4e6a-a228-adc8c9a78f4e" (UID: "44851259-b61b-4e6a-a228-adc8c9a78f4e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:07.463915 master-0 kubenswrapper[29612]: I0319 12:28:07.463845 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44851259-b61b-4e6a-a228-adc8c9a78f4e-kube-api-access-vhfjn" (OuterVolumeSpecName: "kube-api-access-vhfjn") pod "44851259-b61b-4e6a-a228-adc8c9a78f4e" (UID: "44851259-b61b-4e6a-a228-adc8c9a78f4e"). InnerVolumeSpecName "kube-api-access-vhfjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:28:07.468830 master-0 kubenswrapper[29612]: I0319 12:28:07.468743 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319" (OuterVolumeSpecName: "glance") pod "44851259-b61b-4e6a-a228-adc8c9a78f4e" (UID: "44851259-b61b-4e6a-a228-adc8c9a78f4e"). InnerVolumeSpecName "pvc-7c8db109-1510-46ed-a028-d218e99693f0". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 12:28:07.539248 master-0 kubenswrapper[29612]: I0319 12:28:07.539174 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhfjn\" (UniqueName: \"kubernetes.io/projected/44851259-b61b-4e6a-a228-adc8c9a78f4e-kube-api-access-vhfjn\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:07.539248 master-0 kubenswrapper[29612]: I0319 12:28:07.539222 29612 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:07.539248 master-0 kubenswrapper[29612]: I0319 12:28:07.539234 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:07.539248 master-0 kubenswrapper[29612]: I0319 12:28:07.539244 29612 reconciler_common.go:293] "Volume detached 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:07.539248 master-0 kubenswrapper[29612]: I0319 12:28:07.539253 29612 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:07.539248 master-0 kubenswrapper[29612]: I0319 12:28:07.539262 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44851259-b61b-4e6a-a228-adc8c9a78f4e-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:07.539836 master-0 kubenswrapper[29612]: I0319 12:28:07.539296 29612 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") on node \"master-0\" " Mar 19 12:28:07.539836 master-0 kubenswrapper[29612]: I0319 12:28:07.539307 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44851259-b61b-4e6a-a228-adc8c9a78f4e-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:07.577085 master-0 kubenswrapper[29612]: I0319 12:28:07.576974 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"] Mar 19 12:28:07.597195 master-0 kubenswrapper[29612]: I0319 12:28:07.597154 29612 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 12:28:07.597831 master-0 kubenswrapper[29612]: I0319 12:28:07.597743 29612 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7c8db109-1510-46ed-a028-d218e99693f0" (UniqueName: "kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319") on node "master-0" Mar 19 12:28:07.641563 master-0 kubenswrapper[29612]: I0319 12:28:07.641474 29612 reconciler_common.go:293] "Volume detached for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:08.282852 master-0 kubenswrapper[29612]: I0319 12:28:08.282729 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:10.279815 master-0 kubenswrapper[29612]: I0319 12:28:10.274992 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" Mar 19 12:28:12.367068 master-0 kubenswrapper[29612]: I0319 12:28:12.367015 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h9cqc" event={"ID":"3a976515-4dc5-45d4-b047-92e92de29df6","Type":"ContainerStarted","Data":"4131f6cb3c628d16ef964258b0626a8b9a7fb8240018c13b4575dc65b1912ca0"} Mar 19 12:28:12.369452 master-0 kubenswrapper[29612]: I0319 12:28:12.369410 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"a64bde84-ef1a-4c46-bd83-ea307dfd5430","Type":"ContainerStarted","Data":"68365ce8740422547da5c4a14f661eac16d4123f9e69f7afc1e3391339e0449d"} Mar 19 12:28:12.369598 master-0 kubenswrapper[29612]: I0319 12:28:12.369556 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-3abc0-default-external-api-0" podUID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" containerName="glance-log" 
containerID="cri-o://f2ba96dbc94db110e57d4391401bed6845ddb34564b25c486a6feb5916b44459" gracePeriod=30 Mar 19 12:28:12.373653 master-0 kubenswrapper[29612]: I0319 12:28:12.369660 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-3abc0-default-external-api-0" podUID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" containerName="glance-httpd" containerID="cri-o://68365ce8740422547da5c4a14f661eac16d4123f9e69f7afc1e3391339e0449d" gracePeriod=30 Mar 19 12:28:13.273862 master-0 kubenswrapper[29612]: I0319 12:28:13.269200 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"] Mar 19 12:28:13.369398 master-0 kubenswrapper[29612]: I0319 12:28:13.369308 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"] Mar 19 12:28:13.404292 master-0 kubenswrapper[29612]: I0319 12:28:13.404204 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mh8mj"] Mar 19 12:28:13.419118 master-0 kubenswrapper[29612]: I0319 12:28:13.418938 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mh8mj"] Mar 19 12:28:13.441496 master-0 kubenswrapper[29612]: I0319 12:28:13.441426 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"] Mar 19 12:28:13.443085 master-0 kubenswrapper[29612]: E0319 12:28:13.443057 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97212ea2-7247-45e9-89e9-0bcf6d1acdf1" containerName="keystone-bootstrap" Mar 19 12:28:13.443235 master-0 kubenswrapper[29612]: I0319 12:28:13.443219 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="97212ea2-7247-45e9-89e9-0bcf6d1acdf1" containerName="keystone-bootstrap" Mar 19 12:28:13.443668 master-0 kubenswrapper[29612]: I0319 12:28:13.443614 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="97212ea2-7247-45e9-89e9-0bcf6d1acdf1" 
containerName="keystone-bootstrap" Mar 19 12:28:13.445220 master-0 kubenswrapper[29612]: I0319 12:28:13.445171 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.458401 master-0 kubenswrapper[29612]: I0319 12:28:13.458343 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 12:28:13.458920 master-0 kubenswrapper[29612]: I0319 12:28:13.458877 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3abc0-default-internal-config-data" Mar 19 12:28:13.499987 master-0 kubenswrapper[29612]: I0319 12:28:13.499879 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"] Mar 19 12:28:13.522562 master-0 kubenswrapper[29612]: I0319 12:28:13.521483 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8lnqx"] Mar 19 12:28:13.523017 master-0 kubenswrapper[29612]: I0319 12:28:13.522976 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.532933 master-0 kubenswrapper[29612]: I0319 12:28:13.532872 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 12:28:13.533123 master-0 kubenswrapper[29612]: I0319 12:28:13.533091 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 12:28:13.535603 master-0 kubenswrapper[29612]: I0319 12:28:13.535530 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 12:28:13.537992 master-0 kubenswrapper[29612]: I0319 12:28:13.537028 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 12:28:13.571585 master-0 kubenswrapper[29612]: I0319 12:28:13.571510 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8lnqx"] Mar 19 12:28:13.583801 master-0 kubenswrapper[29612]: I0319 12:28:13.583715 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-internal-tls-certs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.584039 master-0 kubenswrapper[29612]: I0319 12:28:13.583852 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-config-data\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.584039 master-0 kubenswrapper[29612]: I0319 12:28:13.583924 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cpfrh\" (UniqueName: \"kubernetes.io/projected/535fe30f-e49c-4659-a342-978cf8d49d25-kube-api-access-cpfrh\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.584039 master-0 kubenswrapper[29612]: I0319 12:28:13.583976 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-logs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.584039 master-0 kubenswrapper[29612]: I0319 12:28:13.584018 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-combined-ca-bundle\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.584162 master-0 kubenswrapper[29612]: I0319 12:28:13.584090 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-httpd-run\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.584213 master-0 kubenswrapper[29612]: I0319 12:28:13.584177 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " 
pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.585952 master-0 kubenswrapper[29612]: I0319 12:28:13.585899 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-scripts\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.614564 master-0 kubenswrapper[29612]: I0319 12:28:13.614503 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b8649b7f9-5jc6x"] Mar 19 12:28:13.618675 master-0 kubenswrapper[29612]: I0319 12:28:13.614849 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" podUID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" containerName="dnsmasq-dns" containerID="cri-o://81709da4c380d1a7f298cd9c18ef65a4c5bfa7172b54ea6bd8526429aadb1cd5" gracePeriod=10 Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.716982 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-internal-tls-certs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717051 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-config-data\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717103 29612 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-cpfrh\" (UniqueName: \"kubernetes.io/projected/535fe30f-e49c-4659-a342-978cf8d49d25-kube-api-access-cpfrh\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717130 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-logs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717162 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-combined-ca-bundle\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717187 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-fernet-keys\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717223 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-httpd-run\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717253 29612 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88gjm\" (UniqueName: \"kubernetes.io/projected/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-kube-api-access-88gjm\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717270 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-config-data\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717295 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717313 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-scripts\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717364 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-credential-keys\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 
19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717396 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-scripts\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.717798 master-0 kubenswrapper[29612]: I0319 12:28:13.717430 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-combined-ca-bundle\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.718919 master-0 kubenswrapper[29612]: I0319 12:28:13.718891 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-httpd-run\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.722599 master-0 kubenswrapper[29612]: I0319 12:28:13.722548 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-config-data\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.723139 master-0 kubenswrapper[29612]: I0319 12:28:13.723099 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-logs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " 
pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.726737 master-0 kubenswrapper[29612]: I0319 12:28:13.726618 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-combined-ca-bundle\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.727838 master-0 kubenswrapper[29612]: I0319 12:28:13.727706 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-internal-tls-certs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.728967 master-0 kubenswrapper[29612]: I0319 12:28:13.728930 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 12:28:13.729033 master-0 kubenswrapper[29612]: I0319 12:28:13.728978 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/d60c87c719cf4283aa56003c1e019d419c362ddaab3f9dd90c8b6d2b36b7086b/globalmount\"" pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.729690 master-0 kubenswrapper[29612]: I0319 12:28:13.729665 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-w5dht"] Mar 19 12:28:13.731561 master-0 kubenswrapper[29612]: I0319 12:28:13.731302 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.731561 master-0 kubenswrapper[29612]: I0319 12:28:13.731471 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-scripts\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.736293 master-0 kubenswrapper[29612]: I0319 12:28:13.733558 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Mar 19 12:28:13.736293 master-0 kubenswrapper[29612]: I0319 12:28:13.733739 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Mar 19 12:28:13.736293 master-0 kubenswrapper[29612]: I0319 12:28:13.734291 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-h9cqc" podStartSLOduration=7.001336795 podStartE2EDuration="19.734273142s" podCreationTimestamp="2026-03-19 12:27:54 +0000 UTC" firstStartedPulling="2026-03-19 12:27:58.578117475 +0000 UTC m=+1085.892286496" lastFinishedPulling="2026-03-19 12:28:11.311053782 +0000 UTC m=+1098.625222843" observedRunningTime="2026-03-19 12:28:13.692403832 +0000 UTC m=+1101.006572863" watchObservedRunningTime="2026-03-19 12:28:13.734273142 +0000 UTC m=+1101.048442163" Mar 19 12:28:13.766678 master-0 kubenswrapper[29612]: I0319 12:28:13.765295 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-w5dht"] Mar 19 12:28:13.811659 master-0 kubenswrapper[29612]: I0319 12:28:13.811507 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpfrh\" (UniqueName: \"kubernetes.io/projected/535fe30f-e49c-4659-a342-978cf8d49d25-kube-api-access-cpfrh\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " 
pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:13.819455 master-0 kubenswrapper[29612]: I0319 12:28:13.819400 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6dd57bb3-eafc-41af-a7a0-794cd736adaa-etc-podinfo\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.819455 master-0 kubenswrapper[29612]: I0319 12:28:13.819467 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-combined-ca-bundle\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.819749 master-0 kubenswrapper[29612]: I0319 12:28:13.819511 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-combined-ca-bundle\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.819749 master-0 kubenswrapper[29612]: I0319 12:28:13.819540 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data-merged\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.819749 master-0 kubenswrapper[29612]: I0319 12:28:13.819606 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-fernet-keys\") pod \"keystone-bootstrap-8lnqx\" (UID: 
\"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.819749 master-0 kubenswrapper[29612]: I0319 12:28:13.819655 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.819749 master-0 kubenswrapper[29612]: I0319 12:28:13.819679 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4g76\" (UniqueName: \"kubernetes.io/projected/6dd57bb3-eafc-41af-a7a0-794cd736adaa-kube-api-access-g4g76\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.819749 master-0 kubenswrapper[29612]: I0319 12:28:13.819709 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88gjm\" (UniqueName: \"kubernetes.io/projected/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-kube-api-access-88gjm\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.819749 master-0 kubenswrapper[29612]: I0319 12:28:13.819730 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-config-data\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.820030 master-0 kubenswrapper[29612]: I0319 12:28:13.819758 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-scripts\") pod \"keystone-bootstrap-8lnqx\" (UID: 
\"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.820030 master-0 kubenswrapper[29612]: I0319 12:28:13.819803 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-credential-keys\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.820030 master-0 kubenswrapper[29612]: I0319 12:28:13.819823 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-scripts\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.825762 master-0 kubenswrapper[29612]: I0319 12:28:13.825723 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-combined-ca-bundle\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.828275 master-0 kubenswrapper[29612]: I0319 12:28:13.828203 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-scripts\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.830549 master-0 kubenswrapper[29612]: I0319 12:28:13.830501 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-fernet-keys\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " 
pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.830963 master-0 kubenswrapper[29612]: I0319 12:28:13.830906 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3abc0-default-external-api-0" podStartSLOduration=20.830873037 podStartE2EDuration="20.830873037s" podCreationTimestamp="2026-03-19 12:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:13.795866462 +0000 UTC m=+1101.110035483" watchObservedRunningTime="2026-03-19 12:28:13.830873037 +0000 UTC m=+1101.145042058" Mar 19 12:28:13.831338 master-0 kubenswrapper[29612]: I0319 12:28:13.831296 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-config-data\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.835296 master-0 kubenswrapper[29612]: I0319 12:28:13.835260 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-credential-keys\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.881449 master-0 kubenswrapper[29612]: I0319 12:28:13.881391 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88gjm\" (UniqueName: \"kubernetes.io/projected/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-kube-api-access-88gjm\") pod \"keystone-bootstrap-8lnqx\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") " pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.884379 master-0 kubenswrapper[29612]: I0319 12:28:13.884272 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8lnqx" Mar 19 12:28:13.924348 master-0 kubenswrapper[29612]: I0319 12:28:13.923319 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.924348 master-0 kubenswrapper[29612]: I0319 12:28:13.923397 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4g76\" (UniqueName: \"kubernetes.io/projected/6dd57bb3-eafc-41af-a7a0-794cd736adaa-kube-api-access-g4g76\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.924348 master-0 kubenswrapper[29612]: I0319 12:28:13.923583 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-scripts\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.924348 master-0 kubenswrapper[29612]: I0319 12:28:13.923630 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6dd57bb3-eafc-41af-a7a0-794cd736adaa-etc-podinfo\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.924348 master-0 kubenswrapper[29612]: I0319 12:28:13.923759 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-combined-ca-bundle\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 
12:28:13.924348 master-0 kubenswrapper[29612]: I0319 12:28:13.923801 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data-merged\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.925009 master-0 kubenswrapper[29612]: I0319 12:28:13.924983 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data-merged\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.929233 master-0 kubenswrapper[29612]: I0319 12:28:13.929187 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-scripts\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.930673 master-0 kubenswrapper[29612]: I0319 12:28:13.930604 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6dd57bb3-eafc-41af-a7a0-794cd736adaa-etc-podinfo\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.936936 master-0 kubenswrapper[29612]: I0319 12:28:13.936264 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.939843 master-0 kubenswrapper[29612]: I0319 12:28:13.939770 29612 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-combined-ca-bundle\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:13.957075 master-0 kubenswrapper[29612]: I0319 12:28:13.956904 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4g76\" (UniqueName: \"kubernetes.io/projected/6dd57bb3-eafc-41af-a7a0-794cd736adaa-kube-api-access-g4g76\") pod \"ironic-db-sync-w5dht\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:14.013133 master-0 kubenswrapper[29612]: I0319 12:28:14.011330 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:14.952033 master-0 kubenswrapper[29612]: I0319 12:28:14.951953 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44851259-b61b-4e6a-a228-adc8c9a78f4e" path="/var/lib/kubelet/pods/44851259-b61b-4e6a-a228-adc8c9a78f4e/volumes" Mar 19 12:28:14.952947 master-0 kubenswrapper[29612]: I0319 12:28:14.952907 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97212ea2-7247-45e9-89e9-0bcf6d1acdf1" path="/var/lib/kubelet/pods/97212ea2-7247-45e9-89e9-0bcf6d1acdf1/volumes" Mar 19 12:28:15.098351 master-0 kubenswrapper[29612]: I0319 12:28:15.098290 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:15.295584 master-0 kubenswrapper[29612]: I0319 12:28:15.295514 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:28:15.873202 master-0 kubenswrapper[29612]: I0319 12:28:15.873114 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" podUID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.183:5353: connect: connection refused" Mar 19 12:28:20.873179 master-0 kubenswrapper[29612]: I0319 12:28:20.873051 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" podUID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.183:5353: connect: connection refused" Mar 19 12:28:23.511542 master-0 kubenswrapper[29612]: I0319 12:28:23.511467 29612 generic.go:334] "Generic (PLEG): container finished" podID="3a976515-4dc5-45d4-b047-92e92de29df6" containerID="4131f6cb3c628d16ef964258b0626a8b9a7fb8240018c13b4575dc65b1912ca0" exitCode=0 Mar 19 12:28:23.512224 master-0 kubenswrapper[29612]: I0319 12:28:23.511570 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h9cqc" event={"ID":"3a976515-4dc5-45d4-b047-92e92de29df6","Type":"ContainerDied","Data":"4131f6cb3c628d16ef964258b0626a8b9a7fb8240018c13b4575dc65b1912ca0"} Mar 19 12:28:23.514602 master-0 kubenswrapper[29612]: I0319 12:28:23.514133 29612 generic.go:334] "Generic (PLEG): container finished" podID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" containerID="81709da4c380d1a7f298cd9c18ef65a4c5bfa7172b54ea6bd8526429aadb1cd5" exitCode=0 Mar 19 12:28:23.514602 master-0 kubenswrapper[29612]: I0319 12:28:23.514189 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" event={"ID":"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8","Type":"ContainerDied","Data":"81709da4c380d1a7f298cd9c18ef65a4c5bfa7172b54ea6bd8526429aadb1cd5"} Mar 19 12:28:23.516438 master-0 kubenswrapper[29612]: 
I0319 12:28:23.516029 29612 generic.go:334] "Generic (PLEG): container finished" podID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" containerID="68365ce8740422547da5c4a14f661eac16d4123f9e69f7afc1e3391339e0449d" exitCode=0 Mar 19 12:28:23.516438 master-0 kubenswrapper[29612]: I0319 12:28:23.516057 29612 generic.go:334] "Generic (PLEG): container finished" podID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" containerID="f2ba96dbc94db110e57d4391401bed6845ddb34564b25c486a6feb5916b44459" exitCode=143 Mar 19 12:28:23.516438 master-0 kubenswrapper[29612]: I0319 12:28:23.516076 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"a64bde84-ef1a-4c46-bd83-ea307dfd5430","Type":"ContainerDied","Data":"68365ce8740422547da5c4a14f661eac16d4123f9e69f7afc1e3391339e0449d"} Mar 19 12:28:23.516438 master-0 kubenswrapper[29612]: I0319 12:28:23.516095 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"a64bde84-ef1a-4c46-bd83-ea307dfd5430","Type":"ContainerDied","Data":"f2ba96dbc94db110e57d4391401bed6845ddb34564b25c486a6feb5916b44459"} Mar 19 12:28:24.128998 master-0 kubenswrapper[29612]: I0319 12:28:24.128924 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"] Mar 19 12:28:24.168790 master-0 kubenswrapper[29612]: I0319 12:28:24.168749 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:28:24.340691 master-0 kubenswrapper[29612]: I0319 12:28:24.340624 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-config\") pod \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " Mar 19 12:28:24.340691 master-0 kubenswrapper[29612]: I0319 12:28:24.340709 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-sb\") pod \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " Mar 19 12:28:24.341017 master-0 kubenswrapper[29612]: I0319 12:28:24.340826 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwnph\" (UniqueName: \"kubernetes.io/projected/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-kube-api-access-hwnph\") pod \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " Mar 19 12:28:24.341017 master-0 kubenswrapper[29612]: I0319 12:28:24.340891 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-nb\") pod \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " Mar 19 12:28:24.341017 master-0 kubenswrapper[29612]: I0319 12:28:24.340932 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-dns-svc\") pod \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\" (UID: \"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8\") " Mar 19 12:28:24.358773 master-0 kubenswrapper[29612]: I0319 12:28:24.358711 29612 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-kube-api-access-hwnph" (OuterVolumeSpecName: "kube-api-access-hwnph") pod "2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" (UID: "2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8"). InnerVolumeSpecName "kube-api-access-hwnph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:28:24.429920 master-0 kubenswrapper[29612]: I0319 12:28:24.428591 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" (UID: "2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:24.445333 master-0 kubenswrapper[29612]: I0319 12:28:24.445294 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwnph\" (UniqueName: \"kubernetes.io/projected/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-kube-api-access-hwnph\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:24.445333 master-0 kubenswrapper[29612]: I0319 12:28:24.445332 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:24.450078 master-0 kubenswrapper[29612]: I0319 12:28:24.450031 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-config" (OuterVolumeSpecName: "config") pod "2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" (UID: "2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:24.454214 master-0 kubenswrapper[29612]: I0319 12:28:24.453847 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" (UID: "2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:24.492487 master-0 kubenswrapper[29612]: W0319 12:28:24.492442 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a533ad_81e0_4e73_b4a9_3dcda2fbad28.slice/crio-278453619bf2b805f5897b30e86b9fb756ac65d41e7eaa035f5655c8bab23424 WatchSource:0}: Error finding container 278453619bf2b805f5897b30e86b9fb756ac65d41e7eaa035f5655c8bab23424: Status 404 returned error can't find the container with id 278453619bf2b805f5897b30e86b9fb756ac65d41e7eaa035f5655c8bab23424 Mar 19 12:28:24.499733 master-0 kubenswrapper[29612]: I0319 12:28:24.499679 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" (UID: "2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:24.501376 master-0 kubenswrapper[29612]: I0319 12:28:24.501322 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8lnqx"] Mar 19 12:28:24.506719 master-0 kubenswrapper[29612]: W0319 12:28:24.506681 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dd57bb3_eafc_41af_a7a0_794cd736adaa.slice/crio-34300180a75ac4b39061bad716c65295bada1c35603bd353b1c466b6a01953e0 WatchSource:0}: Error finding container 34300180a75ac4b39061bad716c65295bada1c35603bd353b1c466b6a01953e0: Status 404 returned error can't find the container with id 34300180a75ac4b39061bad716c65295bada1c35603bd353b1c466b6a01953e0 Mar 19 12:28:24.517010 master-0 kubenswrapper[29612]: I0319 12:28:24.516428 29612 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 12:28:24.528565 master-0 kubenswrapper[29612]: I0319 12:28:24.528520 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-w5dht"] Mar 19 12:28:24.531193 master-0 kubenswrapper[29612]: I0319 12:28:24.531136 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"a64bde84-ef1a-4c46-bd83-ea307dfd5430","Type":"ContainerDied","Data":"70e209647f7551d6272a3ac30e0859347eebcc99fa942f6aad342ec1b6efefa2"} Mar 19 12:28:24.531275 master-0 kubenswrapper[29612]: I0319 12:28:24.531203 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70e209647f7551d6272a3ac30e0859347eebcc99fa942f6aad342ec1b6efefa2" Mar 19 12:28:24.534358 master-0 kubenswrapper[29612]: I0319 12:28:24.534325 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" 
event={"ID":"2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8","Type":"ContainerDied","Data":"20adc537e0f04b9ff7df8d82b10e04c677fcc34b034cd6a94429b139f76ce202"} Mar 19 12:28:24.534453 master-0 kubenswrapper[29612]: I0319 12:28:24.534360 29612 scope.go:117] "RemoveContainer" containerID="81709da4c380d1a7f298cd9c18ef65a4c5bfa7172b54ea6bd8526429aadb1cd5" Mar 19 12:28:24.534501 master-0 kubenswrapper[29612]: I0319 12:28:24.534465 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b8649b7f9-5jc6x" Mar 19 12:28:24.537252 master-0 kubenswrapper[29612]: I0319 12:28:24.537172 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lnqx" event={"ID":"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28","Type":"ContainerStarted","Data":"278453619bf2b805f5897b30e86b9fb756ac65d41e7eaa035f5655c8bab23424"} Mar 19 12:28:24.539359 master-0 kubenswrapper[29612]: I0319 12:28:24.539043 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-internal-api-0" event={"ID":"535fe30f-e49c-4659-a342-978cf8d49d25","Type":"ContainerStarted","Data":"33e5f865b62f8232a156c76c5ff3d4aa2f8f3757778588e1beb58eda1424032b"} Mar 19 12:28:24.542155 master-0 kubenswrapper[29612]: I0319 12:28:24.542115 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-w5dht" event={"ID":"6dd57bb3-eafc-41af-a7a0-794cd736adaa","Type":"ContainerStarted","Data":"34300180a75ac4b39061bad716c65295bada1c35603bd353b1c466b6a01953e0"} Mar 19 12:28:24.547477 master-0 kubenswrapper[29612]: I0319 12:28:24.547444 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:24.547477 master-0 kubenswrapper[29612]: I0319 12:28:24.547474 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:24.547679 master-0 kubenswrapper[29612]: I0319 12:28:24.547486 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:24.666787 master-0 kubenswrapper[29612]: I0319 12:28:24.666743 29612 scope.go:117] "RemoveContainer" containerID="4caa14a221cfe68e308c97fc512647a358b5baa77698d372fe983c3e8aaf9cfb" Mar 19 12:28:24.682094 master-0 kubenswrapper[29612]: I0319 12:28:24.681935 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:24.716322 master-0 kubenswrapper[29612]: I0319 12:28:24.716236 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b8649b7f9-5jc6x"] Mar 19 12:28:24.741467 master-0 kubenswrapper[29612]: I0319 12:28:24.741390 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b8649b7f9-5jc6x"] Mar 19 12:28:24.867504 master-0 kubenswrapper[29612]: I0319 12:28:24.866360 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-config-data\") pod \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " Mar 19 12:28:24.867504 master-0 kubenswrapper[29612]: I0319 12:28:24.866537 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-httpd-run\") pod \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " Mar 19 12:28:24.867504 master-0 kubenswrapper[29612]: I0319 12:28:24.866572 29612 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-7b7z2\" (UniqueName: \"kubernetes.io/projected/a64bde84-ef1a-4c46-bd83-ea307dfd5430-kube-api-access-7b7z2\") pod \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " Mar 19 12:28:24.867504 master-0 kubenswrapper[29612]: I0319 12:28:24.866591 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-combined-ca-bundle\") pod \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " Mar 19 12:28:24.867504 master-0 kubenswrapper[29612]: I0319 12:28:24.866948 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-scripts\") pod \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " Mar 19 12:28:24.867504 master-0 kubenswrapper[29612]: I0319 12:28:24.867130 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " Mar 19 12:28:24.867504 master-0 kubenswrapper[29612]: I0319 12:28:24.867251 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-public-tls-certs\") pod \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " Mar 19 12:28:24.867504 master-0 kubenswrapper[29612]: I0319 12:28:24.867316 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-logs\") pod 
\"a64bde84-ef1a-4c46-bd83-ea307dfd5430\" (UID: \"a64bde84-ef1a-4c46-bd83-ea307dfd5430\") " Mar 19 12:28:24.868655 master-0 kubenswrapper[29612]: I0319 12:28:24.868251 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-logs" (OuterVolumeSpecName: "logs") pod "a64bde84-ef1a-4c46-bd83-ea307dfd5430" (UID: "a64bde84-ef1a-4c46-bd83-ea307dfd5430"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:28:24.868655 master-0 kubenswrapper[29612]: I0319 12:28:24.868457 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a64bde84-ef1a-4c46-bd83-ea307dfd5430" (UID: "a64bde84-ef1a-4c46-bd83-ea307dfd5430"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:28:24.872203 master-0 kubenswrapper[29612]: I0319 12:28:24.872111 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64bde84-ef1a-4c46-bd83-ea307dfd5430-kube-api-access-7b7z2" (OuterVolumeSpecName: "kube-api-access-7b7z2") pod "a64bde84-ef1a-4c46-bd83-ea307dfd5430" (UID: "a64bde84-ef1a-4c46-bd83-ea307dfd5430"). InnerVolumeSpecName "kube-api-access-7b7z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:28:24.873650 master-0 kubenswrapper[29612]: I0319 12:28:24.873573 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-scripts" (OuterVolumeSpecName: "scripts") pod "a64bde84-ef1a-4c46-bd83-ea307dfd5430" (UID: "a64bde84-ef1a-4c46-bd83-ea307dfd5430"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:24.923187 master-0 kubenswrapper[29612]: I0319 12:28:24.923129 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" path="/var/lib/kubelet/pods/2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8/volumes" Mar 19 12:28:24.970405 master-0 kubenswrapper[29612]: I0319 12:28:24.970260 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:24.970405 master-0 kubenswrapper[29612]: I0319 12:28:24.970310 29612 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a64bde84-ef1a-4c46-bd83-ea307dfd5430-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:24.970405 master-0 kubenswrapper[29612]: I0319 12:28:24.970324 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b7z2\" (UniqueName: \"kubernetes.io/projected/a64bde84-ef1a-4c46-bd83-ea307dfd5430-kube-api-access-7b7z2\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:24.970405 master-0 kubenswrapper[29612]: I0319 12:28:24.970334 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:25.244434 master-0 kubenswrapper[29612]: I0319 12:28:25.244009 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a64bde84-ef1a-4c46-bd83-ea307dfd5430" (UID: "a64bde84-ef1a-4c46-bd83-ea307dfd5430"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:25.263894 master-0 kubenswrapper[29612]: I0319 12:28:25.263854 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h9cqc" Mar 19 12:28:25.270095 master-0 kubenswrapper[29612]: I0319 12:28:25.269493 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a64bde84-ef1a-4c46-bd83-ea307dfd5430" (UID: "a64bde84-ef1a-4c46-bd83-ea307dfd5430"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:25.270095 master-0 kubenswrapper[29612]: I0319 12:28:25.269721 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81" (OuterVolumeSpecName: "glance") pod "a64bde84-ef1a-4c46-bd83-ea307dfd5430" (UID: "a64bde84-ef1a-4c46-bd83-ea307dfd5430"). InnerVolumeSpecName "pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 12:28:25.277606 master-0 kubenswrapper[29612]: I0319 12:28:25.274683 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p5c4\" (UniqueName: \"kubernetes.io/projected/3a976515-4dc5-45d4-b047-92e92de29df6-kube-api-access-5p5c4\") pod \"3a976515-4dc5-45d4-b047-92e92de29df6\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " Mar 19 12:28:25.277606 master-0 kubenswrapper[29612]: I0319 12:28:25.274815 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-config-data\") pod \"3a976515-4dc5-45d4-b047-92e92de29df6\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " Mar 19 12:28:25.277606 master-0 kubenswrapper[29612]: I0319 12:28:25.274854 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a976515-4dc5-45d4-b047-92e92de29df6-logs\") pod \"3a976515-4dc5-45d4-b047-92e92de29df6\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " Mar 19 12:28:25.277606 master-0 kubenswrapper[29612]: I0319 12:28:25.274944 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-scripts\") pod \"3a976515-4dc5-45d4-b047-92e92de29df6\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " Mar 19 12:28:25.277606 master-0 kubenswrapper[29612]: I0319 12:28:25.274999 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-combined-ca-bundle\") pod \"3a976515-4dc5-45d4-b047-92e92de29df6\" (UID: \"3a976515-4dc5-45d4-b047-92e92de29df6\") " Mar 19 12:28:25.277606 master-0 kubenswrapper[29612]: I0319 12:28:25.275391 29612 reconciler_common.go:293] "Volume detached for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:25.277606 master-0 kubenswrapper[29612]: I0319 12:28:25.275410 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:25.277606 master-0 kubenswrapper[29612]: I0319 12:28:25.275437 29612 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") on node \"master-0\" " Mar 19 12:28:25.277606 master-0 kubenswrapper[29612]: I0319 12:28:25.276772 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-config-data" (OuterVolumeSpecName: "config-data") pod "a64bde84-ef1a-4c46-bd83-ea307dfd5430" (UID: "a64bde84-ef1a-4c46-bd83-ea307dfd5430"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:25.277606 master-0 kubenswrapper[29612]: I0319 12:28:25.277031 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a976515-4dc5-45d4-b047-92e92de29df6-logs" (OuterVolumeSpecName: "logs") pod "3a976515-4dc5-45d4-b047-92e92de29df6" (UID: "3a976515-4dc5-45d4-b047-92e92de29df6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:28:25.279968 master-0 kubenswrapper[29612]: I0319 12:28:25.278905 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a976515-4dc5-45d4-b047-92e92de29df6-kube-api-access-5p5c4" (OuterVolumeSpecName: "kube-api-access-5p5c4") pod "3a976515-4dc5-45d4-b047-92e92de29df6" (UID: "3a976515-4dc5-45d4-b047-92e92de29df6"). 
InnerVolumeSpecName "kube-api-access-5p5c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:28:25.298881 master-0 kubenswrapper[29612]: I0319 12:28:25.298808 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-scripts" (OuterVolumeSpecName: "scripts") pod "3a976515-4dc5-45d4-b047-92e92de29df6" (UID: "3a976515-4dc5-45d4-b047-92e92de29df6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:25.314139 master-0 kubenswrapper[29612]: I0319 12:28:25.310686 29612 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 12:28:25.314139 master-0 kubenswrapper[29612]: I0319 12:28:25.310862 29612 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf" (UniqueName: "kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81") on node "master-0" Mar 19 12:28:25.331244 master-0 kubenswrapper[29612]: I0319 12:28:25.331189 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-config-data" (OuterVolumeSpecName: "config-data") pod "3a976515-4dc5-45d4-b047-92e92de29df6" (UID: "3a976515-4dc5-45d4-b047-92e92de29df6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:25.337721 master-0 kubenswrapper[29612]: I0319 12:28:25.337158 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a976515-4dc5-45d4-b047-92e92de29df6" (UID: "3a976515-4dc5-45d4-b047-92e92de29df6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:25.376851 master-0 kubenswrapper[29612]: I0319 12:28:25.376801 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p5c4\" (UniqueName: \"kubernetes.io/projected/3a976515-4dc5-45d4-b047-92e92de29df6-kube-api-access-5p5c4\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:25.376851 master-0 kubenswrapper[29612]: I0319 12:28:25.376843 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:25.376851 master-0 kubenswrapper[29612]: I0319 12:28:25.376854 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a976515-4dc5-45d4-b047-92e92de29df6-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:25.377107 master-0 kubenswrapper[29612]: I0319 12:28:25.376865 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a64bde84-ef1a-4c46-bd83-ea307dfd5430-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:25.377107 master-0 kubenswrapper[29612]: I0319 12:28:25.376873 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:25.377107 master-0 kubenswrapper[29612]: I0319 12:28:25.376882 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a976515-4dc5-45d4-b047-92e92de29df6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:25.377107 master-0 kubenswrapper[29612]: I0319 12:28:25.376890 29612 reconciler_common.go:293] "Volume detached for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:25.554984 master-0 kubenswrapper[29612]: I0319 12:28:25.554858 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-db-sync-dx9n4" event={"ID":"9f8cc63c-1080-41d8-b789-bce87f57e603","Type":"ContainerStarted","Data":"ae453dc2ffdbfbe42a2be37ef294607e2cf3448b1125e0bb91b31f7b7af37f65"} Mar 19 12:28:25.560732 master-0 kubenswrapper[29612]: I0319 12:28:25.560689 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h9cqc" event={"ID":"3a976515-4dc5-45d4-b047-92e92de29df6","Type":"ContainerDied","Data":"272134e6b14534039e98f0ebc3d4621869c16a5e9d33dc7bcdb9de16c5f126c5"} Mar 19 12:28:25.560997 master-0 kubenswrapper[29612]: I0319 12:28:25.560976 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="272134e6b14534039e98f0ebc3d4621869c16a5e9d33dc7bcdb9de16c5f126c5" Mar 19 12:28:25.561093 master-0 kubenswrapper[29612]: I0319 12:28:25.560774 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h9cqc" Mar 19 12:28:25.568432 master-0 kubenswrapper[29612]: I0319 12:28:25.568364 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lnqx" event={"ID":"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28","Type":"ContainerStarted","Data":"bd6a2673bbdbfb4e635f19ec1be9ad0d0cf036b7f77b5a7cbf792de9b09caf54"} Mar 19 12:28:25.578801 master-0 kubenswrapper[29612]: I0319 12:28:25.578200 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-internal-api-0" event={"ID":"535fe30f-e49c-4659-a342-978cf8d49d25","Type":"ContainerStarted","Data":"354b6849b85b9548735949c583ed8f9ab43a9f9009c0afe1b9d88606bace8530"} Mar 19 12:28:25.578801 master-0 kubenswrapper[29612]: I0319 12:28:25.578253 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.590078 master-0 kubenswrapper[29612]: I0319 12:28:25.587916 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-562cb-db-sync-dx9n4" podStartSLOduration=6.918109813 podStartE2EDuration="32.587898401s" podCreationTimestamp="2026-03-19 12:27:53 +0000 UTC" firstStartedPulling="2026-03-19 12:27:58.391796601 +0000 UTC m=+1085.705965622" lastFinishedPulling="2026-03-19 12:28:24.061585189 +0000 UTC m=+1111.375754210" observedRunningTime="2026-03-19 12:28:25.576936149 +0000 UTC m=+1112.891105170" watchObservedRunningTime="2026-03-19 12:28:25.587898401 +0000 UTC m=+1112.902067422" Mar 19 12:28:25.619496 master-0 kubenswrapper[29612]: I0319 12:28:25.619414 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8lnqx" podStartSLOduration=12.619393756000001 podStartE2EDuration="12.619393756s" podCreationTimestamp="2026-03-19 12:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:25.607533089 +0000 UTC m=+1112.921702120" watchObservedRunningTime="2026-03-19 12:28:25.619393756 +0000 UTC m=+1112.933562777" Mar 19 12:28:25.692997 master-0 kubenswrapper[29612]: I0319 12:28:25.692933 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"] Mar 19 12:28:25.711194 master-0 kubenswrapper[29612]: I0319 12:28:25.711154 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"] Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: I0319 12:28:25.758010 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3abc0-default-external-api-0"] Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: E0319 12:28:25.758476 29612 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a976515-4dc5-45d4-b047-92e92de29df6" containerName="placement-db-sync" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: I0319 12:28:25.758490 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a976515-4dc5-45d4-b047-92e92de29df6" containerName="placement-db-sync" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: E0319 12:28:25.758522 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" containerName="glance-httpd" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: I0319 12:28:25.758529 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" containerName="glance-httpd" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: E0319 12:28:25.758544 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" containerName="init" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: I0319 12:28:25.758552 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" containerName="init" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: E0319 12:28:25.758565 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" containerName="glance-log" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: I0319 12:28:25.758572 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" containerName="glance-log" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: E0319 12:28:25.758599 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" containerName="dnsmasq-dns" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: I0319 12:28:25.758605 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" containerName="dnsmasq-dns" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: 
I0319 12:28:25.758819 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" containerName="glance-httpd" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: I0319 12:28:25.758847 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="2669f8bc-1bcb-4b9f-9c39-056e07e1a5a8" containerName="dnsmasq-dns" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: I0319 12:28:25.758862 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a976515-4dc5-45d4-b047-92e92de29df6" containerName="placement-db-sync" Mar 19 12:28:25.759196 master-0 kubenswrapper[29612]: I0319 12:28:25.758888 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" containerName="glance-log" Mar 19 12:28:25.759947 master-0 kubenswrapper[29612]: I0319 12:28:25.759890 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.768761 master-0 kubenswrapper[29612]: I0319 12:28:25.766410 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 12:28:25.768761 master-0 kubenswrapper[29612]: I0319 12:28:25.766621 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3abc0-default-external-config-data" Mar 19 12:28:25.784229 master-0 kubenswrapper[29612]: I0319 12:28:25.783354 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"] Mar 19 12:28:25.889381 master-0 kubenswrapper[29612]: I0319 12:28:25.889313 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-scripts\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " 
pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.889381 master-0 kubenswrapper[29612]: I0319 12:28:25.889393 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-public-tls-certs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.889770 master-0 kubenswrapper[29612]: I0319 12:28:25.889468 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-combined-ca-bundle\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.889770 master-0 kubenswrapper[29612]: I0319 12:28:25.889492 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-config-data\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.889770 master-0 kubenswrapper[29612]: I0319 12:28:25.889538 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s82wt\" (UniqueName: \"kubernetes.io/projected/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-kube-api-access-s82wt\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.889770 master-0 kubenswrapper[29612]: I0319 12:28:25.889566 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-httpd-run\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.889770 master-0 kubenswrapper[29612]: I0319 12:28:25.889598 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-logs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.889770 master-0 kubenswrapper[29612]: I0319 12:28:25.889699 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.991923 master-0 kubenswrapper[29612]: I0319 12:28:25.991851 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.992663 master-0 kubenswrapper[29612]: I0319 12:28:25.992591 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-scripts\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.992803 master-0 kubenswrapper[29612]: I0319 
12:28:25.992780 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-public-tls-certs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.993012 master-0 kubenswrapper[29612]: I0319 12:28:25.992989 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-combined-ca-bundle\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.993127 master-0 kubenswrapper[29612]: I0319 12:28:25.993106 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-config-data\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.993276 master-0 kubenswrapper[29612]: I0319 12:28:25.993257 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s82wt\" (UniqueName: \"kubernetes.io/projected/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-kube-api-access-s82wt\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.993395 master-0 kubenswrapper[29612]: I0319 12:28:25.993377 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-httpd-run\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " 
pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.993510 master-0 kubenswrapper[29612]: I0319 12:28:25.993493 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-logs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.994336 master-0 kubenswrapper[29612]: I0319 12:28:25.994312 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-logs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.997678 master-0 kubenswrapper[29612]: I0319 12:28:25.997578 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-scripts\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:25.998492 master-0 kubenswrapper[29612]: I0319 12:28:25.998307 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 12:28:25.998492 master-0 kubenswrapper[29612]: I0319 12:28:25.998335 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/9a90be70eddb52d75ac120c1599a27499c5565c88a66c71224351799b28411da/globalmount\"" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:25.998492 master-0 kubenswrapper[29612]: I0319 12:28:25.998409 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-httpd-run\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:26.004570 master-0 kubenswrapper[29612]: I0319 12:28:26.004515 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-public-tls-certs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:26.005152 master-0 kubenswrapper[29612]: I0319 12:28:26.005108 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-config-data\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:26.005372 master-0 kubenswrapper[29612]: I0319 12:28:26.005324 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-combined-ca-bundle\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:26.018417 master-0 kubenswrapper[29612]: I0319 12:28:26.018355 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s82wt\" (UniqueName: \"kubernetes.io/projected/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-kube-api-access-s82wt\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:26.465824 master-0 kubenswrapper[29612]: I0319 12:28:26.461100 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-66dc6f86db-rpr5q"]
Mar 19 12:28:26.477383 master-0 kubenswrapper[29612]: I0319 12:28:26.468512 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.482612 master-0 kubenswrapper[29612]: I0319 12:28:26.482566 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 19 12:28:26.485439 master-0 kubenswrapper[29612]: I0319 12:28:26.485319 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 19 12:28:26.485827 master-0 kubenswrapper[29612]: I0319 12:28:26.485802 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 19 12:28:26.485956 master-0 kubenswrapper[29612]: I0319 12:28:26.485909 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 19 12:28:26.508330 master-0 kubenswrapper[29612]: I0319 12:28:26.507382 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-66dc6f86db-rpr5q"]
Mar 19 12:28:26.628035 master-0 kubenswrapper[29612]: I0319 12:28:26.627975 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcee29fd-64dc-454f-830c-d3bd3280a07a-logs\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.628591 master-0 kubenswrapper[29612]: I0319 12:28:26.628047 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-internal-tls-certs\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.628591 master-0 kubenswrapper[29612]: I0319 12:28:26.628125 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-config-data\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.628591 master-0 kubenswrapper[29612]: I0319 12:28:26.628161 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-scripts\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.628591 master-0 kubenswrapper[29612]: I0319 12:28:26.628214 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-combined-ca-bundle\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.628591 master-0 kubenswrapper[29612]: I0319 12:28:26.628395 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-public-tls-certs\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.628591 master-0 kubenswrapper[29612]: I0319 12:28:26.628431 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9r5x\" (UniqueName: \"kubernetes.io/projected/fcee29fd-64dc-454f-830c-d3bd3280a07a-kube-api-access-f9r5x\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.657336 master-0 kubenswrapper[29612]: I0319 12:28:26.656933 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-internal-api-0" event={"ID":"535fe30f-e49c-4659-a342-978cf8d49d25","Type":"ContainerStarted","Data":"b6b9739daa5abc26704e140533520ffa7ea17ad8ac5c63a19d594116782b5d8a"}
Mar 19 12:28:26.733068 master-0 kubenswrapper[29612]: I0319 12:28:26.733001 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-scripts\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.733301 master-0 kubenswrapper[29612]: I0319 12:28:26.733123 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-combined-ca-bundle\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.733301 master-0 kubenswrapper[29612]: I0319 12:28:26.733154 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-public-tls-certs\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.733301 master-0 kubenswrapper[29612]: I0319 12:28:26.733122 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3abc0-default-internal-api-0" podStartSLOduration=13.733100534 podStartE2EDuration="13.733100534s" podCreationTimestamp="2026-03-19 12:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:26.727109064 +0000 UTC m=+1114.041278085" watchObservedRunningTime="2026-03-19 12:28:26.733100534 +0000 UTC m=+1114.047269565"
Mar 19 12:28:26.743721 master-0 kubenswrapper[29612]: I0319 12:28:26.741964 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-combined-ca-bundle\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.743721 master-0 kubenswrapper[29612]: I0319 12:28:26.742039 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9r5x\" (UniqueName: \"kubernetes.io/projected/fcee29fd-64dc-454f-830c-d3bd3280a07a-kube-api-access-f9r5x\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.743721 master-0 kubenswrapper[29612]: I0319 12:28:26.742193 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcee29fd-64dc-454f-830c-d3bd3280a07a-logs\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.743721 master-0 kubenswrapper[29612]: I0319 12:28:26.742256 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-internal-tls-certs\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.743721 master-0 kubenswrapper[29612]: I0319 12:28:26.742402 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-config-data\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.747665 master-0 kubenswrapper[29612]: I0319 12:28:26.747186 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcee29fd-64dc-454f-830c-d3bd3280a07a-logs\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.751652 master-0 kubenswrapper[29612]: I0319 12:28:26.750919 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-scripts\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.755652 master-0 kubenswrapper[29612]: I0319 12:28:26.752471 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-config-data\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.777439 master-0 kubenswrapper[29612]: I0319 12:28:26.774190 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-public-tls-certs\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.777439 master-0 kubenswrapper[29612]: I0319 12:28:26.774327 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-internal-tls-certs\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.792501 master-0 kubenswrapper[29612]: I0319 12:28:26.792443 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9r5x\" (UniqueName: \"kubernetes.io/projected/fcee29fd-64dc-454f-830c-d3bd3280a07a-kube-api-access-f9r5x\") pod \"placement-66dc6f86db-rpr5q\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.840986 master-0 kubenswrapper[29612]: I0319 12:28:26.840400 29612 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:26.971658 master-0 kubenswrapper[29612]: I0319 12:28:26.962529 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a64bde84-ef1a-4c46-bd83-ea307dfd5430" path="/var/lib/kubelet/pods/a64bde84-ef1a-4c46-bd83-ea307dfd5430/volumes"
Mar 19 12:28:27.064195 master-0 kubenswrapper[29612]: I0319 12:28:27.064132 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:27.359017 master-0 kubenswrapper[29612]: I0319 12:28:27.356300 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-66dc6f86db-rpr5q"]
Mar 19 12:28:27.359017 master-0 kubenswrapper[29612]: I0319 12:28:27.357608 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:27.362211 master-0 kubenswrapper[29612]: W0319 12:28:27.360893 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcee29fd_64dc_454f_830c_d3bd3280a07a.slice/crio-6f0614e06d08d9c42c114c7926bd1a843f22224314cb8afef323681aefb94623 WatchSource:0}: Error finding container 6f0614e06d08d9c42c114c7926bd1a843f22224314cb8afef323681aefb94623: Status 404 returned error can't find the container with id 6f0614e06d08d9c42c114c7926bd1a843f22224314cb8afef323681aefb94623
Mar 19 12:28:27.670805 master-0 kubenswrapper[29612]: I0319 12:28:27.670751 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66dc6f86db-rpr5q" event={"ID":"fcee29fd-64dc-454f-830c-d3bd3280a07a","Type":"ContainerStarted","Data":"6f0614e06d08d9c42c114c7926bd1a843f22224314cb8afef323681aefb94623"}
Mar 19 12:28:28.034590 master-0 kubenswrapper[29612]: I0319 12:28:28.034480 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"]
Mar 19 12:28:28.046962 master-0 kubenswrapper[29612]: W0319 12:28:28.046907 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8a1e1d5_9787_4dfc_96fe_367b409dcf64.slice/crio-deb8f83677dcd746b5ab6a55acbe81d6bca351bc3e85058f6ca3e9393ffb2b60 WatchSource:0}: Error finding container deb8f83677dcd746b5ab6a55acbe81d6bca351bc3e85058f6ca3e9393ffb2b60: Status 404 returned error can't find the container with id deb8f83677dcd746b5ab6a55acbe81d6bca351bc3e85058f6ca3e9393ffb2b60
Mar 19 12:28:28.686809 master-0 kubenswrapper[29612]: I0319 12:28:28.686743 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66dc6f86db-rpr5q" event={"ID":"fcee29fd-64dc-454f-830c-d3bd3280a07a","Type":"ContainerStarted","Data":"3e50e7160b7df932582542cb9ca9ede83092cfd6f875e1b4dd83c766c098bee1"}
Mar 19 12:28:28.686809 master-0 kubenswrapper[29612]: I0319 12:28:28.686811 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66dc6f86db-rpr5q" event={"ID":"fcee29fd-64dc-454f-830c-d3bd3280a07a","Type":"ContainerStarted","Data":"c98228c403bfc9077719614a45ae48650c4de1acac68426a85b93ba6d1e3a33c"}
Mar 19 12:28:28.687451 master-0 kubenswrapper[29612]: I0319 12:28:28.686861 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:28.687451 master-0 kubenswrapper[29612]: I0319 12:28:28.686899 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:28.691083 master-0 kubenswrapper[29612]: I0319 12:28:28.691006 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"f8a1e1d5-9787-4dfc-96fe-367b409dcf64","Type":"ContainerStarted","Data":"bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188"}
Mar 19 12:28:28.691083 master-0 kubenswrapper[29612]: I0319 12:28:28.691084 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"f8a1e1d5-9787-4dfc-96fe-367b409dcf64","Type":"ContainerStarted","Data":"deb8f83677dcd746b5ab6a55acbe81d6bca351bc3e85058f6ca3e9393ffb2b60"}
Mar 19 12:28:28.693529 master-0 kubenswrapper[29612]: I0319 12:28:28.693481 29612 generic.go:334] "Generic (PLEG): container finished" podID="1d744992-9162-4633-9614-ed0c4ac81988" containerID="79cff36b6685fae615be3f2775f02a6c1d208b32ddea60b555a2d59e2b70797b" exitCode=0
Mar 19 12:28:28.693529 master-0 kubenswrapper[29612]: I0319 12:28:28.693517 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mh455" event={"ID":"1d744992-9162-4633-9614-ed0c4ac81988","Type":"ContainerDied","Data":"79cff36b6685fae615be3f2775f02a6c1d208b32ddea60b555a2d59e2b70797b"}
Mar 19 12:28:28.757064 master-0 kubenswrapper[29612]: I0319 12:28:28.756505 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-66dc6f86db-rpr5q" podStartSLOduration=2.7564763709999998 podStartE2EDuration="2.756476371s" podCreationTimestamp="2026-03-19 12:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:28.713047907 +0000 UTC m=+1116.027216928" watchObservedRunningTime="2026-03-19 12:28:28.756476371 +0000 UTC m=+1116.070645392"
Mar 19 12:28:31.734874 master-0 kubenswrapper[29612]: I0319 12:28:31.734807 29612 generic.go:334] "Generic (PLEG): container finished" podID="e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" containerID="bd6a2673bbdbfb4e635f19ec1be9ad0d0cf036b7f77b5a7cbf792de9b09caf54" exitCode=0
Mar 19 12:28:31.734874 master-0 kubenswrapper[29612]: I0319 12:28:31.734870 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lnqx" event={"ID":"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28","Type":"ContainerDied","Data":"bd6a2673bbdbfb4e635f19ec1be9ad0d0cf036b7f77b5a7cbf792de9b09caf54"}
Mar 19 12:28:32.601149 master-0 kubenswrapper[29612]: I0319 12:28:32.601101 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mh455"
Mar 19 12:28:32.679420 master-0 kubenswrapper[29612]: I0319 12:28:32.679116 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-config\") pod \"1d744992-9162-4633-9614-ed0c4ac81988\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") "
Mar 19 12:28:32.679420 master-0 kubenswrapper[29612]: I0319 12:28:32.679333 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phlkz\" (UniqueName: \"kubernetes.io/projected/1d744992-9162-4633-9614-ed0c4ac81988-kube-api-access-phlkz\") pod \"1d744992-9162-4633-9614-ed0c4ac81988\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") "
Mar 19 12:28:32.679420 master-0 kubenswrapper[29612]: I0319 12:28:32.679369 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-combined-ca-bundle\") pod \"1d744992-9162-4633-9614-ed0c4ac81988\" (UID: \"1d744992-9162-4633-9614-ed0c4ac81988\") "
Mar 19 12:28:32.686294 master-0 kubenswrapper[29612]: I0319 12:28:32.685495 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d744992-9162-4633-9614-ed0c4ac81988-kube-api-access-phlkz" (OuterVolumeSpecName: "kube-api-access-phlkz") pod "1d744992-9162-4633-9614-ed0c4ac81988" (UID: "1d744992-9162-4633-9614-ed0c4ac81988"). InnerVolumeSpecName "kube-api-access-phlkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:28:32.713356 master-0 kubenswrapper[29612]: I0319 12:28:32.713278 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-config" (OuterVolumeSpecName: "config") pod "1d744992-9162-4633-9614-ed0c4ac81988" (UID: "1d744992-9162-4633-9614-ed0c4ac81988"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:32.726973 master-0 kubenswrapper[29612]: I0319 12:28:32.726929 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d744992-9162-4633-9614-ed0c4ac81988" (UID: "1d744992-9162-4633-9614-ed0c4ac81988"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:32.781183 master-0 kubenswrapper[29612]: I0319 12:28:32.781135 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phlkz\" (UniqueName: \"kubernetes.io/projected/1d744992-9162-4633-9614-ed0c4ac81988-kube-api-access-phlkz\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:32.781183 master-0 kubenswrapper[29612]: I0319 12:28:32.781176 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:32.781183 master-0 kubenswrapper[29612]: I0319 12:28:32.781190 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1d744992-9162-4633-9614-ed0c4ac81988-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:32.783930 master-0 kubenswrapper[29612]: I0319 12:28:32.783785 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-w5dht" event={"ID":"6dd57bb3-eafc-41af-a7a0-794cd736adaa","Type":"ContainerStarted","Data":"094ec1b0c8b53cedaa70bfce9b1acce163669311c988220e4e3b8c52cadd03a1"}
Mar 19 12:28:32.793878 master-0 kubenswrapper[29612]: I0319 12:28:32.793835 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mh455"
Mar 19 12:28:32.794108 master-0 kubenswrapper[29612]: I0319 12:28:32.793829 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mh455" event={"ID":"1d744992-9162-4633-9614-ed0c4ac81988","Type":"ContainerDied","Data":"74d4863262f99703a760e4d820baf31985f4ecfe76f447b2b6cf25047c4a24d1"}
Mar 19 12:28:32.794108 master-0 kubenswrapper[29612]: I0319 12:28:32.794102 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74d4863262f99703a760e4d820baf31985f4ecfe76f447b2b6cf25047c4a24d1"
Mar 19 12:28:32.814573 master-0 kubenswrapper[29612]: I0319 12:28:32.813299 29612 generic.go:334] "Generic (PLEG): container finished" podID="9f8cc63c-1080-41d8-b789-bce87f57e603" containerID="ae453dc2ffdbfbe42a2be37ef294607e2cf3448b1125e0bb91b31f7b7af37f65" exitCode=0
Mar 19 12:28:32.814573 master-0 kubenswrapper[29612]: I0319 12:28:32.813683 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-db-sync-dx9n4" event={"ID":"9f8cc63c-1080-41d8-b789-bce87f57e603","Type":"ContainerDied","Data":"ae453dc2ffdbfbe42a2be37ef294607e2cf3448b1125e0bb91b31f7b7af37f65"}
Mar 19 12:28:33.043085 master-0 kubenswrapper[29612]: E0319 12:28:33.042864 29612 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d744992_9162_4633_9614_ed0c4ac81988.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d744992_9162_4633_9614_ed0c4ac81988.slice/crio-74d4863262f99703a760e4d820baf31985f4ecfe76f447b2b6cf25047c4a24d1\": RecentStats: unable to find data in memory cache]"
Mar 19 12:28:33.172095 master-0 kubenswrapper[29612]: I0319 12:28:33.172054 29612 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-bootstrap-8lnqx"
Mar 19 12:28:33.290068 master-0 kubenswrapper[29612]: I0319 12:28:33.290015 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88gjm\" (UniqueName: \"kubernetes.io/projected/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-kube-api-access-88gjm\") pod \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") "
Mar 19 12:28:33.290876 master-0 kubenswrapper[29612]: I0319 12:28:33.290075 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-credential-keys\") pod \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") "
Mar 19 12:28:33.290876 master-0 kubenswrapper[29612]: I0319 12:28:33.290181 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-config-data\") pod \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") "
Mar 19 12:28:33.290876 master-0 kubenswrapper[29612]: I0319 12:28:33.290253 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-combined-ca-bundle\") pod \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") "
Mar 19 12:28:33.290876 master-0 kubenswrapper[29612]: I0319 12:28:33.290352 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-scripts\") pod \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") "
Mar 19 12:28:33.290876 master-0 kubenswrapper[29612]: I0319 12:28:33.290414 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-fernet-keys\") pod \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\" (UID: \"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28\") "
Mar 19 12:28:33.294316 master-0 kubenswrapper[29612]: I0319 12:28:33.294189 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-scripts" (OuterVolumeSpecName: "scripts") pod "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" (UID: "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:33.294437 master-0 kubenswrapper[29612]: I0319 12:28:33.294317 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" (UID: "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:33.295101 master-0 kubenswrapper[29612]: I0319 12:28:33.295073 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-kube-api-access-88gjm" (OuterVolumeSpecName: "kube-api-access-88gjm") pod "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" (UID: "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28"). InnerVolumeSpecName "kube-api-access-88gjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:28:33.299230 master-0 kubenswrapper[29612]: I0319 12:28:33.299186 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" (UID: "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:33.325574 master-0 kubenswrapper[29612]: I0319 12:28:33.325508 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" (UID: "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:33.331861 master-0 kubenswrapper[29612]: I0319 12:28:33.330755 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-config-data" (OuterVolumeSpecName: "config-data") pod "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" (UID: "e5a533ad-81e0-4e73-b4a9-3dcda2fbad28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:33.394157 master-0 kubenswrapper[29612]: I0319 12:28:33.394099 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:33.394406 master-0 kubenswrapper[29612]: I0319 12:28:33.394389 29612 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-fernet-keys\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:33.394528 master-0 kubenswrapper[29612]: I0319 12:28:33.394512 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88gjm\" (UniqueName: \"kubernetes.io/projected/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-kube-api-access-88gjm\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:33.394677 master-0 kubenswrapper[29612]: I0319 12:28:33.394620 29612 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-credential-keys\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:33.394792 master-0 kubenswrapper[29612]: I0319 12:28:33.394776 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:33.394906 master-0 kubenswrapper[29612]: I0319 12:28:33.394891 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:33.925031 master-0 kubenswrapper[29612]: I0319 12:28:33.924967 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lnqx" event={"ID":"e5a533ad-81e0-4e73-b4a9-3dcda2fbad28","Type":"ContainerDied","Data":"278453619bf2b805f5897b30e86b9fb756ac65d41e7eaa035f5655c8bab23424"}
Mar 19 12:28:33.925031 master-0 kubenswrapper[29612]: I0319 12:28:33.925025 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="278453619bf2b805f5897b30e86b9fb756ac65d41e7eaa035f5655c8bab23424"
Mar 19 12:28:33.925781 master-0 kubenswrapper[29612]: I0319 12:28:33.925114 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8lnqx"
Mar 19 12:28:33.940357 master-0 kubenswrapper[29612]: I0319 12:28:33.936768 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"f8a1e1d5-9787-4dfc-96fe-367b409dcf64","Type":"ContainerStarted","Data":"5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d"}
Mar 19 12:28:33.983605 master-0 kubenswrapper[29612]: I0319 12:28:33.983533 29612 generic.go:334] "Generic (PLEG): container finished" podID="6dd57bb3-eafc-41af-a7a0-794cd736adaa" containerID="094ec1b0c8b53cedaa70bfce9b1acce163669311c988220e4e3b8c52cadd03a1" exitCode=0
Mar 19 12:28:33.989347 master-0 kubenswrapper[29612]: I0319 12:28:33.984882 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-w5dht" event={"ID":"6dd57bb3-eafc-41af-a7a0-794cd736adaa","Type":"ContainerDied","Data":"094ec1b0c8b53cedaa70bfce9b1acce163669311c988220e4e3b8c52cadd03a1"}
Mar 19 12:28:34.037670 master-0 kubenswrapper[29612]: I0319 12:28:34.017303 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77f9c5fb97-m9qdj"]
Mar 19 12:28:34.037670 master-0 kubenswrapper[29612]: E0319 12:28:34.019478 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" containerName="keystone-bootstrap"
Mar 19 12:28:34.037670 master-0 kubenswrapper[29612]: I0319 12:28:34.019556 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" containerName="keystone-bootstrap"
Mar 19 12:28:34.037670 master-0 kubenswrapper[29612]: E0319 12:28:34.019619 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d744992-9162-4633-9614-ed0c4ac81988" containerName="neutron-db-sync"
Mar 19 12:28:34.037670 master-0 kubenswrapper[29612]: I0319 12:28:34.019725 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d744992-9162-4633-9614-ed0c4ac81988" containerName="neutron-db-sync"
Mar 19 12:28:34.037670 master-0 kubenswrapper[29612]: I0319 12:28:34.020219 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" containerName="keystone-bootstrap"
Mar 19 12:28:34.037670 master-0 kubenswrapper[29612]: I0319 12:28:34.020258 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d744992-9162-4633-9614-ed0c4ac81988" containerName="neutron-db-sync"
Mar 19 12:28:34.037670 master-0 kubenswrapper[29612]: I0319 12:28:34.031105 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj"
Mar 19 12:28:34.066920 master-0 kubenswrapper[29612]: I0319 12:28:34.059470 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f9c5fb97-m9qdj"]
Mar 19 12:28:34.108119 master-0 kubenswrapper[29612]: I0319 12:28:34.100058 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3abc0-default-external-api-0" podStartSLOduration=9.100036757 podStartE2EDuration="9.100036757s" podCreationTimestamp="2026-03-19 12:28:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:33.998103331 +0000 UTC m=+1121.312272352" watchObservedRunningTime="2026-03-19 12:28:34.100036757 +0000 UTC m=+1121.414205778"
Mar 19 12:28:34.108119 master-0 kubenswrapper[29612]: I0319 12:28:34.107222 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5b858fd547-mflsp"]
Mar 19 12:28:34.110223 master-0 kubenswrapper[29612]: I0319 12:28:34.109772 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b858fd547-mflsp"
Mar 19 12:28:34.123766 master-0 kubenswrapper[29612]: I0319 12:28:34.112462 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 19 12:28:34.123766 master-0 kubenswrapper[29612]: I0319 12:28:34.112603 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 19 12:28:34.123766 master-0 kubenswrapper[29612]: I0319 12:28:34.122117 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-config\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj"
Mar 19 12:28:34.124254 master-0 kubenswrapper[29612]: I0319 12:28:34.124224 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 19 12:28:34.125772 master-0 kubenswrapper[29612]: I0319 12:28:34.125232 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-nb\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj"
Mar 19 12:28:34.126397 master-0 kubenswrapper[29612]: I0319 12:28:34.126371 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 19 12:28:34.127022 master-0 kubenswrapper[29612]: I0319 12:28:34.126965 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-svc\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") "
pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.127112 master-0 kubenswrapper[29612]: I0319 12:28:34.126754 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 19 12:28:34.127291 master-0 kubenswrapper[29612]: I0319 12:28:34.127111 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58q8t\" (UniqueName: \"kubernetes.io/projected/4a48162e-0a2e-4b9d-b470-0346024a7c7e-kube-api-access-58q8t\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.127520 master-0 kubenswrapper[29612]: I0319 12:28:34.127488 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-swift-storage-0\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.127840 master-0 kubenswrapper[29612]: I0319 12:28:34.127824 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-sb\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.144803 master-0 kubenswrapper[29612]: I0319 12:28:34.144761 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b858fd547-mflsp"] Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.229716 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-public-tls-certs\") pod 
\"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.229792 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-sb\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.229832 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-internal-tls-certs\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.229882 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-config\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.229921 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-config-data\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.229950 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-scripts\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.229971 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-nb\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.229994 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62p4v\" (UniqueName: \"kubernetes.io/projected/1dd11416-6337-4735-8eda-d42a42a56c99-kube-api-access-62p4v\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.230029 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-svc\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.230049 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-combined-ca-bundle\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.231574 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-sb\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.230077 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-fernet-keys\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.233024 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-credential-keys\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.233028 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-nb\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.233099 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58q8t\" (UniqueName: \"kubernetes.io/projected/4a48162e-0a2e-4b9d-b470-0346024a7c7e-kube-api-access-58q8t\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.233700 master-0 kubenswrapper[29612]: I0319 12:28:34.233150 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-swift-storage-0\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.236678 master-0 kubenswrapper[29612]: I0319 12:28:34.235800 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-config\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.237390 master-0 kubenswrapper[29612]: I0319 12:28:34.237365 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-svc\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.242704 master-0 kubenswrapper[29612]: I0319 12:28:34.242604 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-swift-storage-0\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.267681 master-0 kubenswrapper[29612]: I0319 12:28:34.264973 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58q8t\" (UniqueName: \"kubernetes.io/projected/4a48162e-0a2e-4b9d-b470-0346024a7c7e-kube-api-access-58q8t\") pod \"dnsmasq-dns-77f9c5fb97-m9qdj\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.283293 master-0 kubenswrapper[29612]: I0319 12:28:34.283223 29612 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/neutron-77bcfcb874-5q7wl"] Mar 19 12:28:34.285510 master-0 kubenswrapper[29612]: I0319 12:28:34.285457 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.291741 master-0 kubenswrapper[29612]: I0319 12:28:34.289501 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 12:28:34.291741 master-0 kubenswrapper[29612]: I0319 12:28:34.289933 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 19 12:28:34.291741 master-0 kubenswrapper[29612]: I0319 12:28:34.289987 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 12:28:34.299802 master-0 kubenswrapper[29612]: I0319 12:28:34.299732 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77bcfcb874-5q7wl"] Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.364936 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-config\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365042 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-httpd-config\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365083 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-public-tls-certs\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365166 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-internal-tls-certs\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365213 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-ovndb-tls-certs\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365287 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwtx\" (UniqueName: \"kubernetes.io/projected/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-kube-api-access-7vwtx\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365326 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-config-data\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365357 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-combined-ca-bundle\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365390 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-scripts\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365432 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62p4v\" (UniqueName: \"kubernetes.io/projected/1dd11416-6337-4735-8eda-d42a42a56c99-kube-api-access-62p4v\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365494 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-combined-ca-bundle\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365534 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-fernet-keys\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.365561 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-credential-keys\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.371506 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-internal-tls-certs\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.371783 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-fernet-keys\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.372083 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-credential-keys\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.374187 master-0 kubenswrapper[29612]: I0319 12:28:34.373083 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-config-data\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.381913 master-0 kubenswrapper[29612]: I0319 12:28:34.376793 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-public-tls-certs\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.381913 master-0 kubenswrapper[29612]: I0319 12:28:34.377226 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-combined-ca-bundle\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.385793 master-0 kubenswrapper[29612]: I0319 12:28:34.384113 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dd11416-6337-4735-8eda-d42a42a56c99-scripts\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.387878 master-0 kubenswrapper[29612]: I0319 12:28:34.385977 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62p4v\" (UniqueName: \"kubernetes.io/projected/1dd11416-6337-4735-8eda-d42a42a56c99-kube-api-access-62p4v\") pod \"keystone-5b858fd547-mflsp\" (UID: \"1dd11416-6337-4735-8eda-d42a42a56c99\") " pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.468347 master-0 kubenswrapper[29612]: I0319 12:28:34.468287 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-config\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.468751 master-0 kubenswrapper[29612]: I0319 12:28:34.468729 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-httpd-config\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.468989 master-0 kubenswrapper[29612]: I0319 12:28:34.468969 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-ovndb-tls-certs\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.469184 master-0 kubenswrapper[29612]: I0319 12:28:34.469161 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwtx\" (UniqueName: \"kubernetes.io/projected/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-kube-api-access-7vwtx\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.469320 master-0 kubenswrapper[29612]: I0319 12:28:34.469303 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-combined-ca-bundle\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.475897 master-0 kubenswrapper[29612]: I0319 12:28:34.475582 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-ovndb-tls-certs\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.480237 master-0 kubenswrapper[29612]: I0319 12:28:34.479829 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-combined-ca-bundle\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.482260 master-0 kubenswrapper[29612]: I0319 12:28:34.482208 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-httpd-config\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.488347 master-0 kubenswrapper[29612]: I0319 12:28:34.488203 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-config\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.491981 master-0 kubenswrapper[29612]: I0319 12:28:34.491928 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vwtx\" (UniqueName: \"kubernetes.io/projected/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-kube-api-access-7vwtx\") pod \"neutron-77bcfcb874-5q7wl\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:34.492830 master-0 kubenswrapper[29612]: I0319 12:28:34.492781 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:34.577173 master-0 kubenswrapper[29612]: I0319 12:28:34.577051 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:28:34.604005 master-0 kubenswrapper[29612]: I0319 12:28:34.603873 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b4b84764-w6p2r"] Mar 19 12:28:34.607294 master-0 kubenswrapper[29612]: I0319 12:28:34.607241 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b4b84764-w6p2r" Mar 19 12:28:34.622720 master-0 kubenswrapper[29612]: I0319 12:28:34.622200 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b4b84764-w6p2r"] Mar 19 12:28:34.682933 master-0 kubenswrapper[29612]: I0319 12:28:34.682873 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-db-sync-dx9n4" Mar 19 12:28:34.683149 master-0 kubenswrapper[29612]: I0319 12:28:34.682988 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-httpd-config\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r" Mar 19 12:28:34.683149 master-0 kubenswrapper[29612]: I0319 12:28:34.683060 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-ovndb-tls-certs\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r" Mar 19 12:28:34.683149 master-0 kubenswrapper[29612]: I0319 12:28:34.683096 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-combined-ca-bundle\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " 
pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.683294 master-0 kubenswrapper[29612]: I0319 12:28:34.683253 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jkv\" (UniqueName: \"kubernetes.io/projected/37c49776-7c55-42c9-9765-18af1ce57ce4-kube-api-access-88jkv\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.683294 master-0 kubenswrapper[29612]: I0319 12:28:34.683286 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-config\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.776825 master-0 kubenswrapper[29612]: I0319 12:28:34.776471 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77bcfcb874-5q7wl"
Mar 19 12:28:34.787380 master-0 kubenswrapper[29612]: I0319 12:28:34.787218 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-db-sync-config-data\") pod \"9f8cc63c-1080-41d8-b789-bce87f57e603\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") "
Mar 19 12:28:34.788185 master-0 kubenswrapper[29612]: I0319 12:28:34.787408 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-scripts\") pod \"9f8cc63c-1080-41d8-b789-bce87f57e603\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") "
Mar 19 12:28:34.788185 master-0 kubenswrapper[29612]: I0319 12:28:34.787519 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f8cc63c-1080-41d8-b789-bce87f57e603-etc-machine-id\") pod \"9f8cc63c-1080-41d8-b789-bce87f57e603\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") "
Mar 19 12:28:34.788185 master-0 kubenswrapper[29612]: I0319 12:28:34.787950 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f8cc63c-1080-41d8-b789-bce87f57e603-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9f8cc63c-1080-41d8-b789-bce87f57e603" (UID: "9f8cc63c-1080-41d8-b789-bce87f57e603"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:28:34.788185 master-0 kubenswrapper[29612]: I0319 12:28:34.788134 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-config-data\") pod \"9f8cc63c-1080-41d8-b789-bce87f57e603\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") "
Mar 19 12:28:34.788185 master-0 kubenswrapper[29612]: I0319 12:28:34.788161 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-combined-ca-bundle\") pod \"9f8cc63c-1080-41d8-b789-bce87f57e603\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") "
Mar 19 12:28:34.788385 master-0 kubenswrapper[29612]: I0319 12:28:34.788188 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkzgh\" (UniqueName: \"kubernetes.io/projected/9f8cc63c-1080-41d8-b789-bce87f57e603-kube-api-access-mkzgh\") pod \"9f8cc63c-1080-41d8-b789-bce87f57e603\" (UID: \"9f8cc63c-1080-41d8-b789-bce87f57e603\") "
Mar 19 12:28:34.789642 master-0 kubenswrapper[29612]: I0319 12:28:34.788574 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-httpd-config\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.789642 master-0 kubenswrapper[29612]: I0319 12:28:34.788623 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-ovndb-tls-certs\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.789642 master-0 kubenswrapper[29612]: I0319 12:28:34.788658 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-combined-ca-bundle\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.789642 master-0 kubenswrapper[29612]: I0319 12:28:34.788775 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88jkv\" (UniqueName: \"kubernetes.io/projected/37c49776-7c55-42c9-9765-18af1ce57ce4-kube-api-access-88jkv\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.789642 master-0 kubenswrapper[29612]: I0319 12:28:34.788802 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-config\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.789642 master-0 kubenswrapper[29612]: I0319 12:28:34.788850 29612 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f8cc63c-1080-41d8-b789-bce87f57e603-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:34.796393 master-0 kubenswrapper[29612]: I0319 12:28:34.796335 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-ovndb-tls-certs\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.797864 master-0 kubenswrapper[29612]: I0319 12:28:34.796872 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-scripts" (OuterVolumeSpecName: "scripts") pod "9f8cc63c-1080-41d8-b789-bce87f57e603" (UID: "9f8cc63c-1080-41d8-b789-bce87f57e603"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:34.804954 master-0 kubenswrapper[29612]: I0319 12:28:34.804913 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-config\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.806445 master-0 kubenswrapper[29612]: I0319 12:28:34.806415 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-combined-ca-bundle\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.808720 master-0 kubenswrapper[29612]: I0319 12:28:34.808686 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jkv\" (UniqueName: \"kubernetes.io/projected/37c49776-7c55-42c9-9765-18af1ce57ce4-kube-api-access-88jkv\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.810987 master-0 kubenswrapper[29612]: I0319 12:28:34.810923 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8cc63c-1080-41d8-b789-bce87f57e603-kube-api-access-mkzgh" (OuterVolumeSpecName: "kube-api-access-mkzgh") pod "9f8cc63c-1080-41d8-b789-bce87f57e603" (UID: "9f8cc63c-1080-41d8-b789-bce87f57e603"). InnerVolumeSpecName "kube-api-access-mkzgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:28:34.815815 master-0 kubenswrapper[29612]: I0319 12:28:34.815773 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-httpd-config\") pod \"neutron-5b4b84764-w6p2r\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:34.819981 master-0 kubenswrapper[29612]: I0319 12:28:34.819941 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9f8cc63c-1080-41d8-b789-bce87f57e603" (UID: "9f8cc63c-1080-41d8-b789-bce87f57e603"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:34.838038 master-0 kubenswrapper[29612]: I0319 12:28:34.837832 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f8cc63c-1080-41d8-b789-bce87f57e603" (UID: "9f8cc63c-1080-41d8-b789-bce87f57e603"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:34.901152 master-0 kubenswrapper[29612]: I0319 12:28:34.901045 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:34.901409 master-0 kubenswrapper[29612]: I0319 12:28:34.901361 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkzgh\" (UniqueName: \"kubernetes.io/projected/9f8cc63c-1080-41d8-b789-bce87f57e603-kube-api-access-mkzgh\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:34.901409 master-0 kubenswrapper[29612]: I0319 12:28:34.901380 29612 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-db-sync-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:34.901409 master-0 kubenswrapper[29612]: I0319 12:28:34.901391 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:34.927662 master-0 kubenswrapper[29612]: I0319 12:28:34.923753 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-config-data" (OuterVolumeSpecName: "config-data") pod "9f8cc63c-1080-41d8-b789-bce87f57e603" (UID: "9f8cc63c-1080-41d8-b789-bce87f57e603"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:34.957092 master-0 kubenswrapper[29612]: I0319 12:28:34.957029 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b4b84764-w6p2r"
Mar 19 12:28:35.004738 master-0 kubenswrapper[29612]: I0319 12:28:35.004672 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f8cc63c-1080-41d8-b789-bce87f57e603-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:35.059045 master-0 kubenswrapper[29612]: I0319 12:28:35.037994 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-w5dht" event={"ID":"6dd57bb3-eafc-41af-a7a0-794cd736adaa","Type":"ContainerStarted","Data":"d2fa3995bcd0e93d8313db25ff149ead5cd7e66149e130835cc71f793264989b"}
Mar 19 12:28:35.104862 master-0 kubenswrapper[29612]: I0319 12:28:35.089109 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-db-sync-dx9n4"
Mar 19 12:28:35.104862 master-0 kubenswrapper[29612]: I0319 12:28:35.090199 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-db-sync-dx9n4" event={"ID":"9f8cc63c-1080-41d8-b789-bce87f57e603","Type":"ContainerDied","Data":"2076c8a27cd33b2af478f1703569fb4e289f59855949358bc7ce5083ac7c457f"}
Mar 19 12:28:35.104862 master-0 kubenswrapper[29612]: I0319 12:28:35.090229 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2076c8a27cd33b2af478f1703569fb4e289f59855949358bc7ce5083ac7c457f"
Mar 19 12:28:35.296614 master-0 kubenswrapper[29612]: I0319 12:28:35.296554 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:35.296614 master-0 kubenswrapper[29612]: I0319 12:28:35.296622 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:35.365073 master-0 kubenswrapper[29612]: I0319 12:28:35.364575 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:35.365073 master-0 kubenswrapper[29612]: I0319 12:28:35.364923 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:36.103395 master-0 kubenswrapper[29612]: I0319 12:28:36.102857 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:36.104400 master-0 kubenswrapper[29612]: I0319 12:28:36.103869 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:36.226761 master-0 kubenswrapper[29612]: I0319 12:28:36.226398 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77f9c5fb97-m9qdj"]
Mar 19 12:28:36.237685 master-0 kubenswrapper[29612]: W0319 12:28:36.234316 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dd11416_6337_4735_8eda_d42a42a56c99.slice/crio-cfcb8328707ae8fa1be27ae997c845720cbb2bd9537ace8754289492a96d17ed WatchSource:0}: Error finding container cfcb8328707ae8fa1be27ae997c845720cbb2bd9537ace8754289492a96d17ed: Status 404 returned error can't find the container with id cfcb8328707ae8fa1be27ae997c845720cbb2bd9537ace8754289492a96d17ed
Mar 19 12:28:36.244271 master-0 kubenswrapper[29612]: I0319 12:28:36.243142 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b858fd547-mflsp"]
Mar 19 12:28:36.255671 master-0 kubenswrapper[29612]: I0319 12:28:36.253902 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-w5dht" podStartSLOduration=15.345310308 podStartE2EDuration="23.253878062s" podCreationTimestamp="2026-03-19 12:28:13 +0000 UTC" firstStartedPulling="2026-03-19 12:28:24.516357852 +0000 UTC m=+1111.830526873" lastFinishedPulling="2026-03-19 12:28:32.424925596 +0000 UTC m=+1119.739094627" observedRunningTime="2026-03-19 12:28:36.207260507 +0000 UTC m=+1123.521429558" watchObservedRunningTime="2026-03-19 12:28:36.253878062 +0000 UTC m=+1123.568047103"
Mar 19 12:28:36.992278 master-0 kubenswrapper[29612]: I0319 12:28:36.992199 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b4b84764-w6p2r"]
Mar 19 12:28:37.068687 master-0 kubenswrapper[29612]: I0319 12:28:37.064717 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-562cb-scheduler-0"]
Mar 19 12:28:37.068687 master-0 kubenswrapper[29612]: E0319 12:28:37.065547 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8cc63c-1080-41d8-b789-bce87f57e603" containerName="cinder-562cb-db-sync"
Mar 19 12:28:37.068687 master-0 kubenswrapper[29612]: I0319 12:28:37.065564 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8cc63c-1080-41d8-b789-bce87f57e603" containerName="cinder-562cb-db-sync"
Mar 19 12:28:37.068687 master-0 kubenswrapper[29612]: I0319 12:28:37.065822 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f8cc63c-1080-41d8-b789-bce87f57e603" containerName="cinder-562cb-db-sync"
Mar 19 12:28:37.091678 master-0 kubenswrapper[29612]: I0319 12:28:37.087807 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.112665 master-0 kubenswrapper[29612]: I0319 12:28:37.111485 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-562cb-backup-0"]
Mar 19 12:28:37.112665 master-0 kubenswrapper[29612]: I0319 12:28:37.112373 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-scheduler-config-data"
Mar 19 12:28:37.112665 master-0 kubenswrapper[29612]: I0319 12:28:37.112681 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-config-data"
Mar 19 12:28:37.113376 master-0 kubenswrapper[29612]: I0319 12:28:37.112838 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-scripts"
Mar 19 12:28:37.205679 master-0 kubenswrapper[29612]: I0319 12:28:37.204253 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-scheduler-0"]
Mar 19 12:28:37.205679 master-0 kubenswrapper[29612]: I0319 12:28:37.204450 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.213680 master-0 kubenswrapper[29612]: I0319 12:28:37.210333 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-backup-config-data"
Mar 19 12:28:37.234198 master-0 kubenswrapper[29612]: I0319 12:28:37.232603 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-562cb-volume-lvm-iscsi-0"]
Mar 19 12:28:37.234550 master-0 kubenswrapper[29612]: I0319 12:28:37.234511 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.267176 master-0 kubenswrapper[29612]: I0319 12:28:37.265851 29612 generic.go:334] "Generic (PLEG): container finished" podID="4a48162e-0a2e-4b9d-b470-0346024a7c7e" containerID="a6d967c396df37f31cfa704e7b953ff9e77b225cfdd5ab723e79e59e9135272b" exitCode=0
Mar 19 12:28:37.267176 master-0 kubenswrapper[29612]: I0319 12:28:37.265958 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" event={"ID":"4a48162e-0a2e-4b9d-b470-0346024a7c7e","Type":"ContainerDied","Data":"a6d967c396df37f31cfa704e7b953ff9e77b225cfdd5ab723e79e59e9135272b"}
Mar 19 12:28:37.267176 master-0 kubenswrapper[29612]: I0319 12:28:37.265989 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" event={"ID":"4a48162e-0a2e-4b9d-b470-0346024a7c7e","Type":"ContainerStarted","Data":"6ae78e2a2c77c6151af5e9175f0e388198ed302e9bdb4678364be87462882941"}
Mar 19 12:28:37.311675 master-0 kubenswrapper[29612]: I0319 12:28:37.309719 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv66k\" (UniqueName: \"kubernetes.io/projected/925bacb7-bb6c-4920-9e82-e20e8fad09c6-kube-api-access-xv66k\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.311675 master-0 kubenswrapper[29612]: I0319 12:28:37.309898 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.311675 master-0 kubenswrapper[29612]: I0319 12:28:37.309931 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data-custom\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.311675 master-0 kubenswrapper[29612]: I0319 12:28:37.309966 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/925bacb7-bb6c-4920-9e82-e20e8fad09c6-etc-machine-id\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.311675 master-0 kubenswrapper[29612]: I0319 12:28:37.310108 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-scripts\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.311675 master-0 kubenswrapper[29612]: I0319 12:28:37.310161 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-combined-ca-bundle\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.318686 master-0 kubenswrapper[29612]: I0319 12:28:37.317085 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-backup-0"]
Mar 19 12:28:37.318686 master-0 kubenswrapper[29612]: I0319 12:28:37.317846 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-volume-lvm-iscsi-config-data"
Mar 19 12:28:37.329675 master-0 kubenswrapper[29612]: I0319 12:28:37.327600 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b4b84764-w6p2r" event={"ID":"37c49776-7c55-42c9-9765-18af1ce57ce4","Type":"ContainerStarted","Data":"32c6bd9a55e502753c61c7a18babe7ad035c98b11835a9fa8cfe5f8fe5ae9610"}
Mar 19 12:28:37.367666 master-0 kubenswrapper[29612]: I0319 12:28:37.358788 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b858fd547-mflsp" event={"ID":"1dd11416-6337-4735-8eda-d42a42a56c99","Type":"ContainerStarted","Data":"4b59376364f3c1693e205bef8b30ff24f98826caff7ce67575e2419f1138b88a"}
Mar 19 12:28:37.367666 master-0 kubenswrapper[29612]: I0319 12:28:37.358850 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b858fd547-mflsp" event={"ID":"1dd11416-6337-4735-8eda-d42a42a56c99","Type":"ContainerStarted","Data":"cfcb8328707ae8fa1be27ae997c845720cbb2bd9537ace8754289492a96d17ed"}
Mar 19 12:28:37.367666 master-0 kubenswrapper[29612]: I0319 12:28:37.359374 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5b858fd547-mflsp"
Mar 19 12:28:37.379668 master-0 kubenswrapper[29612]: I0319 12:28:37.377372 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:37.379668 master-0 kubenswrapper[29612]: I0319 12:28:37.377427 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:37.387661 master-0 kubenswrapper[29612]: I0319 12:28:37.383767 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-volume-lvm-iscsi-0"]
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.415767 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-combined-ca-bundle\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.415947 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data-custom\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416025 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-scripts\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416062 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416110 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-scripts\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416132 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-iscsi\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416221 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-iscsi\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416267 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-dev\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416296 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416348 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-run\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416368 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-run\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416408 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416436 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpxjg\" (UniqueName: \"kubernetes.io/projected/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-kube-api-access-gpxjg\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416459 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2w9x\" (UniqueName: \"kubernetes.io/projected/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-kube-api-access-p2w9x\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416519 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-machine-id\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416556 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv66k\" (UniqueName: \"kubernetes.io/projected/925bacb7-bb6c-4920-9e82-e20e8fad09c6-kube-api-access-xv66k\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416596 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-brick\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416698 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-sys\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416765 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416791 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-nvme\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416837 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data-custom\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416872 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-brick\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416921 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/925bacb7-bb6c-4920-9e82-e20e8fad09c6-etc-machine-id\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.416959 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data-custom\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417017 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-scripts\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417075 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-lib-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417393 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-lib-modules\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417440 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-combined-ca-bundle\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417469 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-sys\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417486 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-lib-modules\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417533 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-combined-ca-bundle\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417563 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-lib-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417612 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-dev\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417652 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417678 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-nvme\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.417710 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-machine-id\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:37.420681 master-0 kubenswrapper[29612]: I0319 12:28:37.418307 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77bcfcb874-5q7wl"]
Mar 19 12:28:37.463222 master-0 kubenswrapper[29612]: I0319 12:28:37.427851 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/925bacb7-bb6c-4920-9e82-e20e8fad09c6-etc-machine-id\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.463222 master-0 kubenswrapper[29612]: I0319 12:28:37.428408 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data-custom\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.463222 master-0 kubenswrapper[29612]: I0319 12:28:37.431720 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-scripts\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:37.463222 master-0 kubenswrapper[29612]: I0319 12:28:37.450796 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-combined-ca-bundle\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:37.511761 master-0 kubenswrapper[29612]: I0319 12:28:37.510620 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:37.532049 master-0 kubenswrapper[29612]: I0319 12:28:37.531122 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538246 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-nvme\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538343 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-brick\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538399 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data-custom\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " 
pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538467 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-lib-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538507 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-lib-modules\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538539 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-combined-ca-bundle\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538573 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-sys\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538595 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-lib-modules\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " 
pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538622 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-lib-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538664 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-dev\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538683 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538699 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-nvme\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538739 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-machine-id\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 
12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.538760 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-combined-ca-bundle\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539006 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data-custom\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539133 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-scripts\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539224 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539256 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-scripts\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 
12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539280 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-iscsi\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539350 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-iscsi\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539383 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-dev\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539408 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-nvme\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539522 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539576 
29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-run\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.539725 master-0 kubenswrapper[29612]: I0319 12:28:37.539612 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-run\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.549309 master-0 kubenswrapper[29612]: I0319 12:28:37.540758 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.549309 master-0 kubenswrapper[29612]: I0319 12:28:37.540794 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpxjg\" (UniqueName: \"kubernetes.io/projected/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-kube-api-access-gpxjg\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.549309 master-0 kubenswrapper[29612]: I0319 12:28:37.540834 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2w9x\" (UniqueName: \"kubernetes.io/projected/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-kube-api-access-p2w9x\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.549309 master-0 kubenswrapper[29612]: I0319 12:28:37.540879 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-machine-id\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.549309 master-0 kubenswrapper[29612]: I0319 12:28:37.540956 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-brick\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.549309 master-0 kubenswrapper[29612]: I0319 12:28:37.541011 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-sys\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.549309 master-0 kubenswrapper[29612]: I0319 12:28:37.547398 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-dev\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.549309 master-0 kubenswrapper[29612]: I0319 12:28:37.548775 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-iscsi\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.549309 master-0 kubenswrapper[29612]: I0319 12:28:37.549272 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" 
(UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-lib-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.549309 master-0 kubenswrapper[29612]: I0319 12:28:37.549315 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-lib-modules\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.549791 master-0 kubenswrapper[29612]: I0319 12:28:37.549462 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-brick\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.551232 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-machine-id\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.551315 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-dev\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.551572 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-run\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.553456 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.553492 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-run\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.553890 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-brick\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.553925 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-sys\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.556250 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-nvme\") pod \"cinder-562cb-backup-0\" 
(UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.558306 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-sys\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.560559 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-lib-modules\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.560612 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-lib-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.560718 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.562219 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-iscsi\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " 
pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.562315 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-combined-ca-bundle\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.565696 master-0 kubenswrapper[29612]: I0319 12:28:37.562798 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-machine-id\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.573731 master-0 kubenswrapper[29612]: I0319 12:28:37.570194 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv66k\" (UniqueName: \"kubernetes.io/projected/925bacb7-bb6c-4920-9e82-e20e8fad09c6-kube-api-access-xv66k\") pod \"cinder-562cb-scheduler-0\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:37.577288 master-0 kubenswrapper[29612]: I0319 12:28:37.574681 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data-custom\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.589685 master-0 kubenswrapper[29612]: I0319 12:28:37.577912 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 
19 12:28:37.596691 master-0 kubenswrapper[29612]: I0319 12:28:37.594920 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.596691 master-0 kubenswrapper[29612]: I0319 12:28:37.595025 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-scripts\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.607696 master-0 kubenswrapper[29612]: I0319 12:28:37.603697 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f9c5fb97-m9qdj"] Mar 19 12:28:37.607696 master-0 kubenswrapper[29612]: I0319 12:28:37.605968 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-combined-ca-bundle\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.688720 master-0 kubenswrapper[29612]: I0319 12:28:37.685719 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data-custom\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.688720 master-0 kubenswrapper[29612]: I0319 12:28:37.685921 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-scripts\") pod 
\"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.688720 master-0 kubenswrapper[29612]: I0319 12:28:37.687414 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:37.692672 master-0 kubenswrapper[29612]: I0319 12:28:37.692198 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2w9x\" (UniqueName: \"kubernetes.io/projected/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-kube-api-access-p2w9x\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.699671 master-0 kubenswrapper[29612]: I0319 12:28:37.696542 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpxjg\" (UniqueName: \"kubernetes.io/projected/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-kube-api-access-gpxjg\") pod \"cinder-562cb-backup-0\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.707669 master-0 kubenswrapper[29612]: I0319 12:28:37.705813 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:37.751665 master-0 kubenswrapper[29612]: I0319 12:28:37.738705 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78975f67c-cdl4g"] Mar 19 12:28:37.751665 master-0 kubenswrapper[29612]: I0319 12:28:37.740682 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.777324 master-0 kubenswrapper[29612]: I0319 12:28:37.775152 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78975f67c-cdl4g"] Mar 19 12:28:37.823668 master-0 kubenswrapper[29612]: I0319 12:28:37.820706 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5b858fd547-mflsp" podStartSLOduration=4.820685046 podStartE2EDuration="4.820685046s" podCreationTimestamp="2026-03-19 12:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:37.69555248 +0000 UTC m=+1125.009721501" watchObservedRunningTime="2026-03-19 12:28:37.820685046 +0000 UTC m=+1125.134854067" Mar 19 12:28:37.837690 master-0 kubenswrapper[29612]: I0319 12:28:37.837074 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-562cb-api-0"] Mar 19 12:28:37.843899 master-0 kubenswrapper[29612]: I0319 12:28:37.839591 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-api-0" Mar 19 12:28:37.847675 master-0 kubenswrapper[29612]: I0319 12:28:37.847174 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-api-config-data" Mar 19 12:28:37.856324 master-0 kubenswrapper[29612]: I0319 12:28:37.853926 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:37.860714 master-0 kubenswrapper[29612]: I0319 12:28:37.859284 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-api-0"] Mar 19 12:28:37.860714 master-0 kubenswrapper[29612]: I0319 12:28:37.860558 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gc9m\" (UniqueName: \"kubernetes.io/projected/5681835f-5e92-40aa-a647-62b3914261ed-kube-api-access-4gc9m\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.860714 master-0 kubenswrapper[29612]: I0319 12:28:37.860646 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-sb\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.860951 master-0 kubenswrapper[29612]: I0319 12:28:37.860764 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-nb\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.860951 master-0 kubenswrapper[29612]: I0319 12:28:37.860819 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-swift-storage-0\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.860951 master-0 kubenswrapper[29612]: I0319 
12:28:37.860843 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-svc\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.860951 master-0 kubenswrapper[29612]: I0319 12:28:37.860870 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-config\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.938667 master-0 kubenswrapper[29612]: I0319 12:28:37.935977 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:37.983865 master-0 kubenswrapper[29612]: I0319 12:28:37.979803 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:37.983865 master-0 kubenswrapper[29612]: I0319 12:28:37.980871 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26343517-cc71-4e1b-822b-550f6962d65f-etc-machine-id\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:37.983865 master-0 kubenswrapper[29612]: I0319 12:28:37.980902 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-combined-ca-bundle\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:37.983865 master-0 kubenswrapper[29612]: I0319 12:28:37.980923 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-nb\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.983865 master-0 kubenswrapper[29612]: I0319 12:28:37.983625 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-nb\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.983865 master-0 kubenswrapper[29612]: I0319 12:28:37.983697 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-scripts\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:37.983865 master-0 kubenswrapper[29612]: I0319 12:28:37.983807 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-swift-storage-0\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.983865 master-0 kubenswrapper[29612]: I0319 12:28:37.983826 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data-custom\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:37.983865 master-0 kubenswrapper[29612]: I0319 12:28:37.983860 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-svc\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.984732 master-0 kubenswrapper[29612]: I0319 12:28:37.983915 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-config\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.984732 master-0 kubenswrapper[29612]: I0319 12:28:37.984072 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gc9m\" (UniqueName: \"kubernetes.io/projected/5681835f-5e92-40aa-a647-62b3914261ed-kube-api-access-4gc9m\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.984732 master-0 kubenswrapper[29612]: I0319 12:28:37.984434 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-sb\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.984732 master-0 kubenswrapper[29612]: I0319 12:28:37.984464 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vhlh\" (UniqueName: 
\"kubernetes.io/projected/26343517-cc71-4e1b-822b-550f6962d65f-kube-api-access-6vhlh\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:37.984732 master-0 kubenswrapper[29612]: I0319 12:28:37.984525 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26343517-cc71-4e1b-822b-550f6962d65f-logs\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:37.992029 master-0 kubenswrapper[29612]: I0319 12:28:37.986256 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-svc\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.992029 master-0 kubenswrapper[29612]: I0319 12:28:37.987202 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-config\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:37.992029 master-0 kubenswrapper[29612]: I0319 12:28:37.987920 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-sb\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:38.001997 master-0 kubenswrapper[29612]: I0319 12:28:38.001833 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-swift-storage-0\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:38.019055 master-0 kubenswrapper[29612]: I0319 12:28:38.018550 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gc9m\" (UniqueName: \"kubernetes.io/projected/5681835f-5e92-40aa-a647-62b3914261ed-kube-api-access-4gc9m\") pod \"dnsmasq-dns-78975f67c-cdl4g\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:38.096780 master-0 kubenswrapper[29612]: I0319 12:28:38.092388 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vhlh\" (UniqueName: \"kubernetes.io/projected/26343517-cc71-4e1b-822b-550f6962d65f-kube-api-access-6vhlh\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.096780 master-0 kubenswrapper[29612]: I0319 12:28:38.092452 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26343517-cc71-4e1b-822b-550f6962d65f-logs\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.096780 master-0 kubenswrapper[29612]: I0319 12:28:38.092491 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.096780 master-0 kubenswrapper[29612]: I0319 12:28:38.092539 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/26343517-cc71-4e1b-822b-550f6962d65f-etc-machine-id\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.096780 master-0 kubenswrapper[29612]: I0319 12:28:38.092558 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-combined-ca-bundle\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.096780 master-0 kubenswrapper[29612]: I0319 12:28:38.092618 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-scripts\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.096780 master-0 kubenswrapper[29612]: I0319 12:28:38.092679 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data-custom\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.107143 master-0 kubenswrapper[29612]: I0319 12:28:38.100962 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26343517-cc71-4e1b-822b-550f6962d65f-logs\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.107143 master-0 kubenswrapper[29612]: I0319 12:28:38.100980 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data-custom\") pod \"cinder-562cb-api-0\" (UID: 
\"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.107143 master-0 kubenswrapper[29612]: I0319 12:28:38.101044 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26343517-cc71-4e1b-822b-550f6962d65f-etc-machine-id\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.107315 master-0 kubenswrapper[29612]: I0319 12:28:38.107234 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-combined-ca-bundle\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.116781 master-0 kubenswrapper[29612]: I0319 12:28:38.108508 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.116781 master-0 kubenswrapper[29612]: I0319 12:28:38.109020 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-scripts\") pod \"cinder-562cb-api-0\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.123760 master-0 kubenswrapper[29612]: I0319 12:28:38.118261 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:38.153958 master-0 kubenswrapper[29612]: E0319 12:28:38.153641 29612 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 19 12:28:38.153958 master-0 kubenswrapper[29612]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/4a48162e-0a2e-4b9d-b470-0346024a7c7e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 12:28:38.153958 master-0 kubenswrapper[29612]: > podSandboxID="6ae78e2a2c77c6151af5e9175f0e388198ed302e9bdb4678364be87462882941" Mar 19 12:28:38.153958 master-0 kubenswrapper[29612]: E0319 12:28:38.153794 29612 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 12:28:38.153958 master-0 kubenswrapper[29612]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n695h65h5f6h679h7bh56dh64bh65ch68h76h95h55fh675h6dh5c5h57dh5cbhf4hf9hb6h5d6h696h5f4hf8h5c8h78hb7h689hf5h5ddh56fhd8q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58q8t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000800000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-77f9c5fb97-m9qdj_openstack(4a48162e-0a2e-4b9d-b470-0346024a7c7e): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/4a48162e-0a2e-4b9d-b470-0346024a7c7e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 19 12:28:38.153958 master-0 kubenswrapper[29612]: > logger="UnhandledError" Mar 19 12:28:38.156670 master-0 kubenswrapper[29612]: E0319 12:28:38.155215 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/4a48162e-0a2e-4b9d-b470-0346024a7c7e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" podUID="4a48162e-0a2e-4b9d-b470-0346024a7c7e" Mar 19 12:28:38.276985 master-0 kubenswrapper[29612]: I0319 12:28:38.274908 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vhlh\" (UniqueName: \"kubernetes.io/projected/26343517-cc71-4e1b-822b-550f6962d65f-kube-api-access-6vhlh\") pod \"cinder-562cb-api-0\" (UID: 
\"26343517-cc71-4e1b-822b-550f6962d65f\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.505292 master-0 kubenswrapper[29612]: E0319 12:28:38.504970 29612 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a48162e_0a2e_4b9d_b470_0346024a7c7e.slice/crio-conmon-d24bb6bdb449c6985faf2130fc1ef9fbcf17c5cdcbbd3c6d25074755d1ef81c9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a48162e_0a2e_4b9d_b470_0346024a7c7e.slice/crio-d24bb6bdb449c6985faf2130fc1ef9fbcf17c5cdcbbd3c6d25074755d1ef81c9.scope\": RecentStats: unable to find data in memory cache]" Mar 19 12:28:38.525688 master-0 kubenswrapper[29612]: I0319 12:28:38.508358 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-api-0" Mar 19 12:28:38.535476 master-0 kubenswrapper[29612]: I0319 12:28:38.535347 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b4b84764-w6p2r" event={"ID":"37c49776-7c55-42c9-9765-18af1ce57ce4","Type":"ContainerStarted","Data":"2231b252621b7676769dc4e9ccb64339a152dc5298136911320816ff7ed5d573"} Mar 19 12:28:38.547701 master-0 kubenswrapper[29612]: I0319 12:28:38.542800 29612 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 12:28:38.547701 master-0 kubenswrapper[29612]: I0319 12:28:38.542844 29612 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 12:28:38.547701 master-0 kubenswrapper[29612]: I0319 12:28:38.543984 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bcfcb874-5q7wl" event={"ID":"966779b3-45a4-4c95-8dfb-5e2b42f42cc1","Type":"ContainerStarted","Data":"8589e552d744763a24469e1f2016f838bf656fe359ba14b852b568a5e90b6304"} Mar 19 12:28:38.547701 master-0 kubenswrapper[29612]: I0319 12:28:38.544031 29612 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:38.547701 master-0 kubenswrapper[29612]: I0319 12:28:38.544048 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:28:38.868272 master-0 kubenswrapper[29612]: I0319 12:28:38.867889 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-scheduler-0"] Mar 19 12:28:38.984029 master-0 kubenswrapper[29612]: I0319 12:28:38.983780 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-volume-lvm-iscsi-0"] Mar 19 12:28:39.320451 master-0 kubenswrapper[29612]: I0319 12:28:39.320235 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78975f67c-cdl4g"] Mar 19 12:28:39.516158 master-0 kubenswrapper[29612]: I0319 12:28:39.516114 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-backup-0"] Mar 19 12:28:39.589651 master-0 kubenswrapper[29612]: I0319 12:28:39.584325 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" event={"ID":"5681835f-5e92-40aa-a647-62b3914261ed","Type":"ContainerStarted","Data":"9624563e3e9508146f8a6613394581e0bac266bec8960f3ed41d3079b3ab073c"} Mar 19 12:28:39.613655 master-0 kubenswrapper[29612]: I0319 12:28:39.606869 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" event={"ID":"4a48162e-0a2e-4b9d-b470-0346024a7c7e","Type":"ContainerDied","Data":"6ae78e2a2c77c6151af5e9175f0e388198ed302e9bdb4678364be87462882941"} Mar 19 12:28:39.613655 master-0 kubenswrapper[29612]: I0319 12:28:39.606923 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ae78e2a2c77c6151af5e9175f0e388198ed302e9bdb4678364be87462882941" Mar 19 12:28:39.619716 master-0 kubenswrapper[29612]: I0319 12:28:39.619657 29612 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-scheduler-0" event={"ID":"925bacb7-bb6c-4920-9e82-e20e8fad09c6","Type":"ContainerStarted","Data":"c4f2c7e78cd0dab4eb2dd3aa1a2e115fdab0c94e0c9537f158f5e154e86404b7"} Mar 19 12:28:39.624828 master-0 kubenswrapper[29612]: I0319 12:28:39.624798 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-backup-0" event={"ID":"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56","Type":"ContainerStarted","Data":"1a5955a31cfdf08b068c5b3a1e36fd8273a90cb35c85732e1021520569965c18"} Mar 19 12:28:39.626343 master-0 kubenswrapper[29612]: I0319 12:28:39.626326 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b4b84764-w6p2r" event={"ID":"37c49776-7c55-42c9-9765-18af1ce57ce4","Type":"ContainerStarted","Data":"e928c6e701b6e0763a24d43a54cfeafbc0cc4b69443a3947128265ae31ca4d04"} Mar 19 12:28:39.627427 master-0 kubenswrapper[29612]: I0319 12:28:39.627410 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b4b84764-w6p2r" Mar 19 12:28:39.636737 master-0 kubenswrapper[29612]: I0319 12:28:39.631912 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-api-0"] Mar 19 12:28:39.636737 master-0 kubenswrapper[29612]: I0319 12:28:39.635901 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" event={"ID":"20ea2edf-d47b-48e9-b9de-8dac8f6b1599","Type":"ContainerStarted","Data":"af75810c18c1c62f1fd77ba1d08b8dc6af8703542da0051daa6ad4fd3685aecc"} Mar 19 12:28:39.636737 master-0 kubenswrapper[29612]: W0319 12:28:39.636584 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26343517_cc71_4e1b_822b_550f6962d65f.slice/crio-133d08a52898a24d7a8a07d50e6d66d8171fe8ccf29c7a6e5afaa2f38489805a WatchSource:0}: Error finding container 
133d08a52898a24d7a8a07d50e6d66d8171fe8ccf29c7a6e5afaa2f38489805a: Status 404 returned error can't find the container with id 133d08a52898a24d7a8a07d50e6d66d8171fe8ccf29c7a6e5afaa2f38489805a Mar 19 12:28:39.662965 master-0 kubenswrapper[29612]: I0319 12:28:39.652851 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bcfcb874-5q7wl" event={"ID":"966779b3-45a4-4c95-8dfb-5e2b42f42cc1","Type":"ContainerStarted","Data":"4902263a87f1037c2d5f01561807c0efbf819f76dac185c72d8320772c9ec2a0"} Mar 19 12:28:39.662965 master-0 kubenswrapper[29612]: I0319 12:28:39.652891 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bcfcb874-5q7wl" event={"ID":"966779b3-45a4-4c95-8dfb-5e2b42f42cc1","Type":"ContainerStarted","Data":"e49fb06ec118d446a356cd045b9abb7f736c5d0b5fb9674ec53fc16be70f30d4"} Mar 19 12:28:39.662965 master-0 kubenswrapper[29612]: I0319 12:28:39.652910 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:28:39.692330 master-0 kubenswrapper[29612]: I0319 12:28:39.686811 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b4b84764-w6p2r" podStartSLOduration=5.686788304 podStartE2EDuration="5.686788304s" podCreationTimestamp="2026-03-19 12:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:39.663991106 +0000 UTC m=+1126.978160127" watchObservedRunningTime="2026-03-19 12:28:39.686788304 +0000 UTC m=+1127.000957325" Mar 19 12:28:39.738426 master-0 kubenswrapper[29612]: I0319 12:28:39.738368 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj" Mar 19 12:28:39.749397 master-0 kubenswrapper[29612]: I0319 12:28:39.748083 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77bcfcb874-5q7wl" podStartSLOduration=5.748045434 podStartE2EDuration="5.748045434s" podCreationTimestamp="2026-03-19 12:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:39.724536967 +0000 UTC m=+1127.038706008" watchObservedRunningTime="2026-03-19 12:28:39.748045434 +0000 UTC m=+1127.062214455" Mar 19 12:28:39.835453 master-0 kubenswrapper[29612]: I0319 12:28:39.835399 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-swift-storage-0\") pod \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " Mar 19 12:28:39.835786 master-0 kubenswrapper[29612]: I0319 12:28:39.835580 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58q8t\" (UniqueName: \"kubernetes.io/projected/4a48162e-0a2e-4b9d-b470-0346024a7c7e-kube-api-access-58q8t\") pod \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " Mar 19 12:28:39.835786 master-0 kubenswrapper[29612]: I0319 12:28:39.835621 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-sb\") pod \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " Mar 19 12:28:39.835786 master-0 kubenswrapper[29612]: I0319 12:28:39.835691 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-svc\") pod \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " Mar 19 12:28:39.835786 master-0 kubenswrapper[29612]: I0319 12:28:39.835717 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-nb\") pod \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " Mar 19 12:28:39.835942 master-0 kubenswrapper[29612]: I0319 12:28:39.835908 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-config\") pod \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\" (UID: \"4a48162e-0a2e-4b9d-b470-0346024a7c7e\") " Mar 19 12:28:39.839534 master-0 kubenswrapper[29612]: I0319 12:28:39.839452 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a48162e-0a2e-4b9d-b470-0346024a7c7e-kube-api-access-58q8t" (OuterVolumeSpecName: "kube-api-access-58q8t") pod "4a48162e-0a2e-4b9d-b470-0346024a7c7e" (UID: "4a48162e-0a2e-4b9d-b470-0346024a7c7e"). InnerVolumeSpecName "kube-api-access-58q8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:28:39.942014 master-0 kubenswrapper[29612]: I0319 12:28:39.941845 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58q8t\" (UniqueName: \"kubernetes.io/projected/4a48162e-0a2e-4b9d-b470-0346024a7c7e-kube-api-access-58q8t\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:39.959563 master-0 kubenswrapper[29612]: I0319 12:28:39.959495 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a48162e-0a2e-4b9d-b470-0346024a7c7e" (UID: "4a48162e-0a2e-4b9d-b470-0346024a7c7e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:39.961102 master-0 kubenswrapper[29612]: I0319 12:28:39.961036 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a48162e-0a2e-4b9d-b470-0346024a7c7e" (UID: "4a48162e-0a2e-4b9d-b470-0346024a7c7e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:39.984011 master-0 kubenswrapper[29612]: I0319 12:28:39.983930 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-config" (OuterVolumeSpecName: "config") pod "4a48162e-0a2e-4b9d-b470-0346024a7c7e" (UID: "4a48162e-0a2e-4b9d-b470-0346024a7c7e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:28:39.984223 master-0 kubenswrapper[29612]: I0319 12:28:39.984024 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4a48162e-0a2e-4b9d-b470-0346024a7c7e" (UID: "4a48162e-0a2e-4b9d-b470-0346024a7c7e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:28:40.003279 master-0 kubenswrapper[29612]: I0319 12:28:40.003202 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a48162e-0a2e-4b9d-b470-0346024a7c7e" (UID: "4a48162e-0a2e-4b9d-b470-0346024a7c7e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:28:40.065996 master-0 kubenswrapper[29612]: I0319 12:28:40.065849 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:40.065996 master-0 kubenswrapper[29612]: I0319 12:28:40.065919 29612 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:40.065996 master-0 kubenswrapper[29612]: I0319 12:28:40.065961 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:40.065996 master-0 kubenswrapper[29612]: I0319 12:28:40.065974 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:40.065996 master-0 kubenswrapper[29612]: I0319 12:28:40.065986 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a48162e-0a2e-4b9d-b470-0346024a7c7e-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:40.553838 master-0 kubenswrapper[29612]: I0319 12:28:40.553746 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-562cb-api-0"]
Mar 19 12:28:40.702260 master-0 kubenswrapper[29612]: I0319 12:28:40.701287 29612 generic.go:334] "Generic (PLEG): container finished" podID="5681835f-5e92-40aa-a647-62b3914261ed" containerID="3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626" exitCode=0
Mar 19 12:28:40.702260 master-0 kubenswrapper[29612]: I0319 12:28:40.701417 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" event={"ID":"5681835f-5e92-40aa-a647-62b3914261ed","Type":"ContainerDied","Data":"3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626"}
Mar 19 12:28:40.714461 master-0 kubenswrapper[29612]: I0319 12:28:40.714413 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-api-0" event={"ID":"26343517-cc71-4e1b-822b-550f6962d65f","Type":"ContainerStarted","Data":"9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548"}
Mar 19 12:28:40.714461 master-0 kubenswrapper[29612]: I0319 12:28:40.714460 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-api-0" event={"ID":"26343517-cc71-4e1b-822b-550f6962d65f","Type":"ContainerStarted","Data":"133d08a52898a24d7a8a07d50e6d66d8171fe8ccf29c7a6e5afaa2f38489805a"}
Mar 19 12:28:40.714738 master-0 kubenswrapper[29612]: I0319 12:28:40.714505 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77f9c5fb97-m9qdj"
Mar 19 12:28:40.716869 master-0 kubenswrapper[29612]: I0319 12:28:40.716823 29612 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 12:28:40.716869 master-0 kubenswrapper[29612]: I0319 12:28:40.716856 29612 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 12:28:41.008193 master-0 kubenswrapper[29612]: I0319 12:28:41.007413 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77f9c5fb97-m9qdj"]
Mar 19 12:28:41.016478 master-0 kubenswrapper[29612]: I0319 12:28:41.015775 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77f9c5fb97-m9qdj"]
Mar 19 12:28:41.520192 master-0 kubenswrapper[29612]: I0319 12:28:41.512361 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:41.520192 master-0 kubenswrapper[29612]: I0319 12:28:41.512500 29612 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 12:28:41.520192 master-0 kubenswrapper[29612]: I0319 12:28:41.513981 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:28:41.793011 master-0 kubenswrapper[29612]: I0319 12:28:41.792957 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" event={"ID":"5681835f-5e92-40aa-a647-62b3914261ed","Type":"ContainerStarted","Data":"1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba"}
Mar 19 12:28:41.794293 master-0 kubenswrapper[29612]: I0319 12:28:41.794244 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78975f67c-cdl4g"
Mar 19 12:28:41.820409 master-0 kubenswrapper[29612]: I0319 12:28:41.820338 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-scheduler-0" event={"ID":"925bacb7-bb6c-4920-9e82-e20e8fad09c6","Type":"ContainerStarted","Data":"dce1b5eed86fe28ef448c5369a1ae7df25615dd2bda48a9ca8ef661f54146028"}
Mar 19 12:28:41.846526 master-0 kubenswrapper[29612]: I0319 12:28:41.846457 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-backup-0" event={"ID":"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56","Type":"ContainerStarted","Data":"7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53"}
Mar 19 12:28:41.909855 master-0 kubenswrapper[29612]: I0319 12:28:41.909785 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" podStartSLOduration=4.909768134 podStartE2EDuration="4.909768134s" podCreationTimestamp="2026-03-19 12:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:41.896206138 +0000 UTC m=+1129.210375159" watchObservedRunningTime="2026-03-19 12:28:41.909768134 +0000 UTC m=+1129.223937175"
Mar 19 12:28:42.035774 master-0 kubenswrapper[29612]: I0319 12:28:42.035675 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-562cb-backup-0" podStartSLOduration=4.938446742 podStartE2EDuration="6.03561581s" podCreationTimestamp="2026-03-19 12:28:36 +0000 UTC" firstStartedPulling="2026-03-19 12:28:39.524961255 +0000 UTC m=+1126.839130276" lastFinishedPulling="2026-03-19 12:28:40.622130323 +0000 UTC m=+1127.936299344" observedRunningTime="2026-03-19 12:28:42.018349169 +0000 UTC m=+1129.332518220" watchObservedRunningTime="2026-03-19 12:28:42.03561581 +0000 UTC m=+1129.349784831"
Mar 19 12:28:42.240997 master-0 kubenswrapper[29612]: I0319 12:28:42.240616 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:42.240997 master-0 kubenswrapper[29612]: I0319 12:28:42.240808 29612 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 12:28:42.251054 master-0 kubenswrapper[29612]: I0319 12:28:42.250958 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:28:42.864801 master-0 kubenswrapper[29612]: I0319 12:28:42.864726 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" event={"ID":"20ea2edf-d47b-48e9-b9de-8dac8f6b1599","Type":"ContainerStarted","Data":"68b0ac96d3681905a60fe830bce81a9cc181901125df19014603ca9e869985ca"}
Mar 19 12:28:42.864801 master-0 kubenswrapper[29612]: I0319 12:28:42.864794 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" event={"ID":"20ea2edf-d47b-48e9-b9de-8dac8f6b1599","Type":"ContainerStarted","Data":"dca28992236f4a2ff4ddf8a2b140853b13e1168d6ff1db8be0dc847b7cb49ea9"}
Mar 19 12:28:42.867650 master-0 kubenswrapper[29612]: I0319 12:28:42.867574 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-scheduler-0" event={"ID":"925bacb7-bb6c-4920-9e82-e20e8fad09c6","Type":"ContainerStarted","Data":"6d53d02fe19f678d8261595e2ac1ecd5b4f56d980764d11759be33561b566cbc"}
Mar 19 12:28:42.872622 master-0 kubenswrapper[29612]: I0319 12:28:42.872562 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-backup-0" event={"ID":"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56","Type":"ContainerStarted","Data":"dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776"}
Mar 19 12:28:42.875310 master-0 kubenswrapper[29612]: I0319 12:28:42.875242 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-562cb-api-0" podUID="26343517-cc71-4e1b-822b-550f6962d65f" containerName="cinder-562cb-api-log" containerID="cri-o://9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548" gracePeriod=30
Mar 19 12:28:42.875581 master-0 kubenswrapper[29612]: I0319 12:28:42.875548 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-api-0" event={"ID":"26343517-cc71-4e1b-822b-550f6962d65f","Type":"ContainerStarted","Data":"ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c"}
Mar 19 12:28:42.878174 master-0 kubenswrapper[29612]: I0319 12:28:42.878098 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-562cb-api-0"
Mar 19 12:28:42.878305 master-0 kubenswrapper[29612]: I0319 12:28:42.878191 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-562cb-api-0" podUID="26343517-cc71-4e1b-822b-550f6962d65f" containerName="cinder-api" containerID="cri-o://ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c" gracePeriod=30
Mar 19 12:28:42.923062 master-0 kubenswrapper[29612]: I0319 12:28:42.922988 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a48162e-0a2e-4b9d-b470-0346024a7c7e" path="/var/lib/kubelet/pods/4a48162e-0a2e-4b9d-b470-0346024a7c7e/volumes"
Mar 19 12:28:42.937727 master-0 kubenswrapper[29612]: I0319 12:28:42.937648 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:43.043742 master-0 kubenswrapper[29612]: I0319 12:28:43.043246 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" podStartSLOduration=3.854243019 podStartE2EDuration="6.043229272s" podCreationTimestamp="2026-03-19 12:28:37 +0000 UTC" firstStartedPulling="2026-03-19 12:28:39.011073893 +0000 UTC m=+1126.325242914" lastFinishedPulling="2026-03-19 12:28:41.200060136 +0000 UTC m=+1128.514229167" observedRunningTime="2026-03-19 12:28:43.029749829 +0000 UTC m=+1130.343918850" watchObservedRunningTime="2026-03-19 12:28:43.043229272 +0000 UTC m=+1130.357398293"
Mar 19 12:28:43.085194 master-0 kubenswrapper[29612]: I0319 12:28:43.084722 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-562cb-scheduler-0" podStartSLOduration=5.7817005649999995 podStartE2EDuration="7.084707651s" podCreationTimestamp="2026-03-19 12:28:36 +0000 UTC" firstStartedPulling="2026-03-19 12:28:38.846202058 +0000 UTC m=+1126.160371079" lastFinishedPulling="2026-03-19 12:28:40.149209144 +0000 UTC m=+1127.463378165" observedRunningTime="2026-03-19 12:28:43.072977687 +0000 UTC m=+1130.387146708" watchObservedRunningTime="2026-03-19 12:28:43.084707651 +0000 UTC m=+1130.398876672"
Mar 19 12:28:43.114195 master-0 kubenswrapper[29612]: I0319 12:28:43.114120 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-562cb-api-0" podStartSLOduration=6.114104076 podStartE2EDuration="6.114104076s" podCreationTimestamp="2026-03-19 12:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:43.110320009 +0000 UTC m=+1130.424489030" watchObservedRunningTime="2026-03-19 12:28:43.114104076 +0000 UTC m=+1130.428273097"
Mar 19 12:28:43.324086 master-0 kubenswrapper[29612]: I0319 12:28:43.324033 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b4b84764-w6p2r"]
Mar 19 12:28:43.324341 master-0 kubenswrapper[29612]: I0319 12:28:43.324312 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b4b84764-w6p2r" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerName="neutron-api" containerID="cri-o://2231b252621b7676769dc4e9ccb64339a152dc5298136911320816ff7ed5d573" gracePeriod=30
Mar 19 12:28:43.332798 master-0 kubenswrapper[29612]: I0319 12:28:43.327099 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b4b84764-w6p2r" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerName="neutron-httpd" containerID="cri-o://e928c6e701b6e0763a24d43a54cfeafbc0cc4b69443a3947128265ae31ca4d04" gracePeriod=30
Mar 19 12:28:43.360106 master-0 kubenswrapper[29612]: I0319 12:28:43.360043 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5b4b84764-w6p2r" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Mar 19 12:28:43.368335 master-0 kubenswrapper[29612]: I0319 12:28:43.365604 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5796698f9c-ltjks"]
Mar 19 12:28:43.368335 master-0 kubenswrapper[29612]: E0319 12:28:43.366338 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a48162e-0a2e-4b9d-b470-0346024a7c7e" containerName="init"
Mar 19 12:28:43.368335 master-0 kubenswrapper[29612]: I0319 12:28:43.366352 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a48162e-0a2e-4b9d-b470-0346024a7c7e" containerName="init"
Mar 19 12:28:43.368335 master-0 kubenswrapper[29612]: I0319 12:28:43.366627 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a48162e-0a2e-4b9d-b470-0346024a7c7e" containerName="init"
Mar 19 12:28:43.368335 master-0 kubenswrapper[29612]: I0319 12:28:43.368170 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.381546 master-0 kubenswrapper[29612]: I0319 12:28:43.374204 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 19 12:28:43.381546 master-0 kubenswrapper[29612]: I0319 12:28:43.374809 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 19 12:28:43.429838 master-0 kubenswrapper[29612]: I0319 12:28:43.429778 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5796698f9c-ltjks"]
Mar 19 12:28:43.701487 master-0 kubenswrapper[29612]: I0319 12:28:43.701407 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-config\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.701918 master-0 kubenswrapper[29612]: I0319 12:28:43.701495 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzcbc\" (UniqueName: \"kubernetes.io/projected/d324f127-bb59-4845-b8ef-dbe97fdb263b-kube-api-access-zzcbc\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.701918 master-0 kubenswrapper[29612]: I0319 12:28:43.701552 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-internal-tls-certs\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.701918 master-0 kubenswrapper[29612]: I0319 12:28:43.701614 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-httpd-config\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.708212 master-0 kubenswrapper[29612]: I0319 12:28:43.708153 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-public-tls-certs\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.710616 master-0 kubenswrapper[29612]: I0319 12:28:43.708234 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-combined-ca-bundle\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.710616 master-0 kubenswrapper[29612]: I0319 12:28:43.708271 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-ovndb-tls-certs\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.812498 master-0 kubenswrapper[29612]: I0319 12:28:43.812245 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-config\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.812498 master-0 kubenswrapper[29612]: I0319 12:28:43.812387 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzcbc\" (UniqueName: \"kubernetes.io/projected/d324f127-bb59-4845-b8ef-dbe97fdb263b-kube-api-access-zzcbc\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.812498 master-0 kubenswrapper[29612]: I0319 12:28:43.812452 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-internal-tls-certs\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.814972 master-0 kubenswrapper[29612]: I0319 12:28:43.812519 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-httpd-config\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.814972 master-0 kubenswrapper[29612]: I0319 12:28:43.812624 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-public-tls-certs\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.814972 master-0 kubenswrapper[29612]: I0319 12:28:43.812676 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-combined-ca-bundle\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.814972 master-0 kubenswrapper[29612]: I0319 12:28:43.812699 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-ovndb-tls-certs\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.814972 master-0 kubenswrapper[29612]: I0319 12:28:43.813060 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-api-0"
Mar 19 12:28:43.817243 master-0 kubenswrapper[29612]: I0319 12:28:43.817190 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-ovndb-tls-certs\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.820491 master-0 kubenswrapper[29612]: I0319 12:28:43.820461 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-public-tls-certs\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.822748 master-0 kubenswrapper[29612]: I0319 12:28:43.822686 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-httpd-config\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.825038 master-0 kubenswrapper[29612]: I0319 12:28:43.824448 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-combined-ca-bundle\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.825142 master-0 kubenswrapper[29612]: I0319 12:28:43.825091 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-internal-tls-certs\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.836676 master-0 kubenswrapper[29612]: I0319 12:28:43.836614 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzcbc\" (UniqueName: \"kubernetes.io/projected/d324f127-bb59-4845-b8ef-dbe97fdb263b-kube-api-access-zzcbc\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.844161 master-0 kubenswrapper[29612]: I0319 12:28:43.844113 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d324f127-bb59-4845-b8ef-dbe97fdb263b-config\") pod \"neutron-5796698f9c-ltjks\" (UID: \"d324f127-bb59-4845-b8ef-dbe97fdb263b\") " pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:43.916738 master-0 kubenswrapper[29612]: I0319 12:28:43.914845 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vhlh\" (UniqueName: \"kubernetes.io/projected/26343517-cc71-4e1b-822b-550f6962d65f-kube-api-access-6vhlh\") pod \"26343517-cc71-4e1b-822b-550f6962d65f\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") "
Mar 19 12:28:43.916738 master-0 kubenswrapper[29612]: I0319 12:28:43.914920 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-scripts\") pod \"26343517-cc71-4e1b-822b-550f6962d65f\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") "
Mar 19 12:28:43.916738 master-0 kubenswrapper[29612]: I0319 12:28:43.915065 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26343517-cc71-4e1b-822b-550f6962d65f-logs\") pod \"26343517-cc71-4e1b-822b-550f6962d65f\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") "
Mar 19 12:28:43.916738 master-0 kubenswrapper[29612]: I0319 12:28:43.915090 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data\") pod \"26343517-cc71-4e1b-822b-550f6962d65f\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") "
Mar 19 12:28:43.916738 master-0 kubenswrapper[29612]: I0319 12:28:43.915126 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data-custom\") pod \"26343517-cc71-4e1b-822b-550f6962d65f\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") "
Mar 19 12:28:43.916738 master-0 kubenswrapper[29612]: I0319 12:28:43.915162 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26343517-cc71-4e1b-822b-550f6962d65f-etc-machine-id\") pod \"26343517-cc71-4e1b-822b-550f6962d65f\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") "
Mar 19 12:28:43.916738 master-0 kubenswrapper[29612]: I0319 12:28:43.915196 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-combined-ca-bundle\") pod \"26343517-cc71-4e1b-822b-550f6962d65f\" (UID: \"26343517-cc71-4e1b-822b-550f6962d65f\") "
Mar 19 12:28:43.918132 master-0 kubenswrapper[29612]: I0319 12:28:43.917711 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26343517-cc71-4e1b-822b-550f6962d65f-logs" (OuterVolumeSpecName: "logs") pod "26343517-cc71-4e1b-822b-550f6962d65f" (UID: "26343517-cc71-4e1b-822b-550f6962d65f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:28:43.919496 master-0 kubenswrapper[29612]: I0319 12:28:43.919462 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26343517-cc71-4e1b-822b-550f6962d65f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "26343517-cc71-4e1b-822b-550f6962d65f" (UID: "26343517-cc71-4e1b-822b-550f6962d65f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:28:43.923069 master-0 kubenswrapper[29612]: I0319 12:28:43.923014 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26343517-cc71-4e1b-822b-550f6962d65f-kube-api-access-6vhlh" (OuterVolumeSpecName: "kube-api-access-6vhlh") pod "26343517-cc71-4e1b-822b-550f6962d65f" (UID: "26343517-cc71-4e1b-822b-550f6962d65f"). InnerVolumeSpecName "kube-api-access-6vhlh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:28:43.930093 master-0 kubenswrapper[29612]: I0319 12:28:43.930047 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-scripts" (OuterVolumeSpecName: "scripts") pod "26343517-cc71-4e1b-822b-550f6962d65f" (UID: "26343517-cc71-4e1b-822b-550f6962d65f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:43.931932 master-0 kubenswrapper[29612]: I0319 12:28:43.931774 29612 generic.go:334] "Generic (PLEG): container finished" podID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerID="e928c6e701b6e0763a24d43a54cfeafbc0cc4b69443a3947128265ae31ca4d04" exitCode=0
Mar 19 12:28:43.931932 master-0 kubenswrapper[29612]: I0319 12:28:43.931895 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b4b84764-w6p2r" event={"ID":"37c49776-7c55-42c9-9765-18af1ce57ce4","Type":"ContainerDied","Data":"e928c6e701b6e0763a24d43a54cfeafbc0cc4b69443a3947128265ae31ca4d04"}
Mar 19 12:28:43.935229 master-0 kubenswrapper[29612]: I0319 12:28:43.935152 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "26343517-cc71-4e1b-822b-550f6962d65f" (UID: "26343517-cc71-4e1b-822b-550f6962d65f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:43.959659 master-0 kubenswrapper[29612]: I0319 12:28:43.956449 29612 generic.go:334] "Generic (PLEG): container finished" podID="26343517-cc71-4e1b-822b-550f6962d65f" containerID="ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c" exitCode=0
Mar 19 12:28:43.959659 master-0 kubenswrapper[29612]: I0319 12:28:43.956487 29612 generic.go:334] "Generic (PLEG): container finished" podID="26343517-cc71-4e1b-822b-550f6962d65f" containerID="9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548" exitCode=143
Mar 19 12:28:43.959659 master-0 kubenswrapper[29612]: I0319 12:28:43.957540 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-api-0"
Mar 19 12:28:43.959659 master-0 kubenswrapper[29612]: I0319 12:28:43.958141 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-api-0" event={"ID":"26343517-cc71-4e1b-822b-550f6962d65f","Type":"ContainerDied","Data":"ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c"}
Mar 19 12:28:43.959659 master-0 kubenswrapper[29612]: I0319 12:28:43.958172 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-api-0" event={"ID":"26343517-cc71-4e1b-822b-550f6962d65f","Type":"ContainerDied","Data":"9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548"}
Mar 19 12:28:43.959659 master-0 kubenswrapper[29612]: I0319 12:28:43.958183 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-api-0" event={"ID":"26343517-cc71-4e1b-822b-550f6962d65f","Type":"ContainerDied","Data":"133d08a52898a24d7a8a07d50e6d66d8171fe8ccf29c7a6e5afaa2f38489805a"}
Mar 19 12:28:43.959659 master-0 kubenswrapper[29612]: I0319 12:28:43.958200 29612 scope.go:117] "RemoveContainer" containerID="ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c"
Mar 19 12:28:43.977395 master-0 kubenswrapper[29612]: I0319 12:28:43.975758 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26343517-cc71-4e1b-822b-550f6962d65f" (UID: "26343517-cc71-4e1b-822b-550f6962d65f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:44.019610 master-0 kubenswrapper[29612]: I0319 12:28:44.019571 29612 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:44.034310 master-0 kubenswrapper[29612]: I0319 12:28:44.034275 29612 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26343517-cc71-4e1b-822b-550f6962d65f-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:44.034624 master-0 kubenswrapper[29612]: I0319 12:28:44.034611 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:44.034722 master-0 kubenswrapper[29612]: I0319 12:28:44.034711 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vhlh\" (UniqueName: \"kubernetes.io/projected/26343517-cc71-4e1b-822b-550f6962d65f-kube-api-access-6vhlh\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:44.034793 master-0 kubenswrapper[29612]: I0319 12:28:44.034783 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:44.034877 master-0 kubenswrapper[29612]: I0319 12:28:44.034865 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26343517-cc71-4e1b-822b-550f6962d65f-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:44.065811 master-0 kubenswrapper[29612]: I0319 12:28:44.065740 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data" (OuterVolumeSpecName: "config-data") pod "26343517-cc71-4e1b-822b-550f6962d65f" (UID: "26343517-cc71-4e1b-822b-550f6962d65f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:44.068678 master-0 kubenswrapper[29612]: I0319 12:28:44.068200 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5796698f9c-ltjks"
Mar 19 12:28:44.140414 master-0 kubenswrapper[29612]: I0319 12:28:44.136998 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26343517-cc71-4e1b-822b-550f6962d65f-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:44.183711 master-0 kubenswrapper[29612]: I0319 12:28:44.182660 29612 scope.go:117] "RemoveContainer" containerID="9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548"
Mar 19 12:28:44.778240 master-0 kubenswrapper[29612]: I0319 12:28:44.776980 29612 scope.go:117] "RemoveContainer" containerID="ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c"
Mar 19 12:28:44.780694 master-0 kubenswrapper[29612]: E0319 12:28:44.780487 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c\": container with ID starting with ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c not found: ID does not exist" containerID="ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c"
Mar 19 12:28:44.780694 master-0 kubenswrapper[29612]: I0319 12:28:44.780542 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c"} err="failed to get container status \"ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c\": rpc error: code = NotFound desc = could not find container \"ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c\": container with ID starting with ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c not found: ID does not exist"
Mar 19 12:28:44.780694 master-0 kubenswrapper[29612]: I0319 12:28:44.780572 29612 scope.go:117] "RemoveContainer" containerID="9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548"
Mar 19 12:28:44.784210 master-0 kubenswrapper[29612]: E0319 12:28:44.784054 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548\": container with ID starting with 9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548 not found: ID does not exist" containerID="9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548"
Mar 19 12:28:44.784210 master-0 kubenswrapper[29612]: I0319 12:28:44.784126 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548"} err="failed to get container status \"9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548\": rpc error: code = NotFound desc = could not find container \"9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548\": container with ID starting with 9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548 not found: ID does not exist"
Mar 19 12:28:44.784210 master-0 kubenswrapper[29612]: I0319 12:28:44.784162 29612 scope.go:117] "RemoveContainer" containerID="ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c"
Mar 19 12:28:44.789660 master-0 kubenswrapper[29612]: I0319 12:28:44.786797 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c"} err="failed to get container status
\"ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c\": rpc error: code = NotFound desc = could not find container \"ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c\": container with ID starting with ce64251a23d71a89bdd425fbf5df38b630472c9216e7ececb62d8f4989e53f7c not found: ID does not exist" Mar 19 12:28:44.789660 master-0 kubenswrapper[29612]: I0319 12:28:44.786833 29612 scope.go:117] "RemoveContainer" containerID="9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548" Mar 19 12:28:44.789846 master-0 kubenswrapper[29612]: I0319 12:28:44.789656 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548"} err="failed to get container status \"9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548\": rpc error: code = NotFound desc = could not find container \"9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548\": container with ID starting with 9883feeeae56d250a957cce77075be8a784e6a671dd425b86d8f5f190c2d6548 not found: ID does not exist" Mar 19 12:28:44.798657 master-0 kubenswrapper[29612]: I0319 12:28:44.791097 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-562cb-api-0"] Mar 19 12:28:44.816657 master-0 kubenswrapper[29612]: I0319 12:28:44.811143 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-562cb-api-0"] Mar 19 12:28:44.893756 master-0 kubenswrapper[29612]: I0319 12:28:44.878681 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-562cb-api-0"] Mar 19 12:28:44.893756 master-0 kubenswrapper[29612]: E0319 12:28:44.879194 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26343517-cc71-4e1b-822b-550f6962d65f" containerName="cinder-api" Mar 19 12:28:44.893756 master-0 kubenswrapper[29612]: I0319 12:28:44.879209 29612 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="26343517-cc71-4e1b-822b-550f6962d65f" containerName="cinder-api" Mar 19 12:28:44.893756 master-0 kubenswrapper[29612]: E0319 12:28:44.879266 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26343517-cc71-4e1b-822b-550f6962d65f" containerName="cinder-562cb-api-log" Mar 19 12:28:44.893756 master-0 kubenswrapper[29612]: I0319 12:28:44.879277 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="26343517-cc71-4e1b-822b-550f6962d65f" containerName="cinder-562cb-api-log" Mar 19 12:28:44.893756 master-0 kubenswrapper[29612]: I0319 12:28:44.879696 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="26343517-cc71-4e1b-822b-550f6962d65f" containerName="cinder-api" Mar 19 12:28:44.893756 master-0 kubenswrapper[29612]: I0319 12:28:44.879726 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="26343517-cc71-4e1b-822b-550f6962d65f" containerName="cinder-562cb-api-log" Mar 19 12:28:44.893756 master-0 kubenswrapper[29612]: I0319 12:28:44.880946 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-562cb-api-0" Mar 19 12:28:44.900422 master-0 kubenswrapper[29612]: I0319 12:28:44.900382 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-api-config-data" Mar 19 12:28:44.900829 master-0 kubenswrapper[29612]: I0319 12:28:44.900810 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 19 12:28:44.901072 master-0 kubenswrapper[29612]: I0319 12:28:44.901058 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 19 12:28:44.963752 master-0 kubenswrapper[29612]: I0319 12:28:44.961927 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-config-data-custom\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:44.963752 master-0 kubenswrapper[29612]: I0319 12:28:44.961993 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-config-data\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:44.963752 master-0 kubenswrapper[29612]: I0319 12:28:44.962245 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-public-tls-certs\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:44.963752 master-0 kubenswrapper[29612]: I0319 12:28:44.962383 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-etc-machine-id\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:44.963752 master-0 kubenswrapper[29612]: I0319 12:28:44.962482 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-logs\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:44.963752 master-0 kubenswrapper[29612]: I0319 12:28:44.962582 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-scripts\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:44.963752 master-0 kubenswrapper[29612]: I0319 12:28:44.962718 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-combined-ca-bundle\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:44.963752 master-0 kubenswrapper[29612]: I0319 12:28:44.962814 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-internal-tls-certs\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:44.963752 master-0 kubenswrapper[29612]: I0319 12:28:44.962878 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lmxq5\" (UniqueName: \"kubernetes.io/projected/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-kube-api-access-lmxq5\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.013173 master-0 kubenswrapper[29612]: I0319 12:28:45.008604 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26343517-cc71-4e1b-822b-550f6962d65f" path="/var/lib/kubelet/pods/26343517-cc71-4e1b-822b-550f6962d65f/volumes" Mar 19 12:28:45.013173 master-0 kubenswrapper[29612]: I0319 12:28:45.009284 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-api-0"] Mar 19 12:28:45.013173 master-0 kubenswrapper[29612]: I0319 12:28:45.009307 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5796698f9c-ltjks"] Mar 19 12:28:45.066604 master-0 kubenswrapper[29612]: I0319 12:28:45.064739 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmxq5\" (UniqueName: \"kubernetes.io/projected/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-kube-api-access-lmxq5\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.066604 master-0 kubenswrapper[29612]: I0319 12:28:45.064837 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-config-data-custom\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.066604 master-0 kubenswrapper[29612]: I0319 12:28:45.064872 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-config-data\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " 
pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.066604 master-0 kubenswrapper[29612]: I0319 12:28:45.064981 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-public-tls-certs\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.066604 master-0 kubenswrapper[29612]: I0319 12:28:45.065046 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-etc-machine-id\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.066604 master-0 kubenswrapper[29612]: I0319 12:28:45.065097 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-logs\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.066604 master-0 kubenswrapper[29612]: I0319 12:28:45.065158 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-scripts\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.066604 master-0 kubenswrapper[29612]: I0319 12:28:45.065216 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-combined-ca-bundle\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.066604 master-0 kubenswrapper[29612]: I0319 12:28:45.065268 
29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-internal-tls-certs\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.068239 master-0 kubenswrapper[29612]: I0319 12:28:45.067182 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-etc-machine-id\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.068239 master-0 kubenswrapper[29612]: I0319 12:28:45.067564 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-logs\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.070929 master-0 kubenswrapper[29612]: I0319 12:28:45.069187 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-public-tls-certs\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.081675 master-0 kubenswrapper[29612]: I0319 12:28:45.077409 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-internal-tls-certs\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.081675 master-0 kubenswrapper[29612]: I0319 12:28:45.078317 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-combined-ca-bundle\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.081920 master-0 kubenswrapper[29612]: I0319 12:28:45.081706 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-config-data\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.089662 master-0 kubenswrapper[29612]: I0319 12:28:45.088409 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-config-data-custom\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.095672 master-0 kubenswrapper[29612]: I0319 12:28:45.093219 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-scripts\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.111662 master-0 kubenswrapper[29612]: I0319 12:28:45.107436 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmxq5\" (UniqueName: \"kubernetes.io/projected/1f2cae1d-a800-4f98-b302-2f315d4c5dc3-kube-api-access-lmxq5\") pod \"cinder-562cb-api-0\" (UID: \"1f2cae1d-a800-4f98-b302-2f315d4c5dc3\") " pod="openstack/cinder-562cb-api-0" Mar 19 12:28:45.309659 master-0 kubenswrapper[29612]: I0319 12:28:45.309494 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-562cb-api-0" Mar 19 12:28:46.018022 master-0 kubenswrapper[29612]: I0319 12:28:46.017952 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5796698f9c-ltjks" event={"ID":"d324f127-bb59-4845-b8ef-dbe97fdb263b","Type":"ContainerStarted","Data":"f7056e4d9274f50216c3753893ff3ecca26d01127ab9a51ea8e2448137770081"} Mar 19 12:28:46.018022 master-0 kubenswrapper[29612]: I0319 12:28:46.018022 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5796698f9c-ltjks" event={"ID":"d324f127-bb59-4845-b8ef-dbe97fdb263b","Type":"ContainerStarted","Data":"e885485392a20f33282a48f078d5be045d778c69b68b81af9826671e54794e9c"} Mar 19 12:28:46.018022 master-0 kubenswrapper[29612]: I0319 12:28:46.018039 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5796698f9c-ltjks" event={"ID":"d324f127-bb59-4845-b8ef-dbe97fdb263b","Type":"ContainerStarted","Data":"a99b3462941a9ca875396c5a365f55c8a9c271c94926e20228d3073af3a53c0b"} Mar 19 12:28:46.019719 master-0 kubenswrapper[29612]: I0319 12:28:46.019665 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5796698f9c-ltjks" Mar 19 12:28:46.060656 master-0 kubenswrapper[29612]: I0319 12:28:46.047321 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5796698f9c-ltjks" podStartSLOduration=3.047304779 podStartE2EDuration="3.047304779s" podCreationTimestamp="2026-03-19 12:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:46.046762673 +0000 UTC m=+1133.360931704" watchObservedRunningTime="2026-03-19 12:28:46.047304779 +0000 UTC m=+1133.361473790" Mar 19 12:28:46.140005 master-0 kubenswrapper[29612]: I0319 12:28:46.134324 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-api-0"] Mar 19 12:28:47.044383 
master-0 kubenswrapper[29612]: I0319 12:28:47.043106 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-api-0" event={"ID":"1f2cae1d-a800-4f98-b302-2f315d4c5dc3","Type":"ContainerStarted","Data":"0efe905a7d69d432532d540a96ed3509941ab564b8936636cac8a1372412e023"} Mar 19 12:28:47.044383 master-0 kubenswrapper[29612]: I0319 12:28:47.043164 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-api-0" event={"ID":"1f2cae1d-a800-4f98-b302-2f315d4c5dc3","Type":"ContainerStarted","Data":"56ce87cc6f1e4975eefd18c23e5e8d1fe913a1d17c86632b7f8a51b1a87918aa"} Mar 19 12:28:47.709916 master-0 kubenswrapper[29612]: I0319 12:28:47.707355 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:47.856072 master-0 kubenswrapper[29612]: I0319 12:28:47.854815 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:48.070243 master-0 kubenswrapper[29612]: I0319 12:28:48.070174 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-api-0" event={"ID":"1f2cae1d-a800-4f98-b302-2f315d4c5dc3","Type":"ContainerStarted","Data":"39317fe4ac9348f2f76e9aaaa40ab8d47064f9202d33eeb6cb19ca81715b55d9"} Mar 19 12:28:48.072053 master-0 kubenswrapper[29612]: I0319 12:28:48.071418 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-562cb-api-0" Mar 19 12:28:48.115181 master-0 kubenswrapper[29612]: I0319 12:28:48.114938 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:48.117549 master-0 kubenswrapper[29612]: I0319 12:28:48.117423 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-562cb-api-0" podStartSLOduration=4.117396143 podStartE2EDuration="4.117396143s" podCreationTimestamp="2026-03-19 
12:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:48.114937153 +0000 UTC m=+1135.429106174" watchObservedRunningTime="2026-03-19 12:28:48.117396143 +0000 UTC m=+1135.431565184" Mar 19 12:28:48.123964 master-0 kubenswrapper[29612]: I0319 12:28:48.123908 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:28:48.313851 master-0 kubenswrapper[29612]: I0319 12:28:48.313701 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-562cb-volume-lvm-iscsi-0"] Mar 19 12:28:48.559302 master-0 kubenswrapper[29612]: I0319 12:28:48.555045 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:48.662660 master-0 kubenswrapper[29612]: I0319 12:28:48.657734 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-648f9ff569-fh2n6"] Mar 19 12:28:48.662660 master-0 kubenswrapper[29612]: I0319 12:28:48.658149 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" podUID="d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" containerName="dnsmasq-dns" containerID="cri-o://4740809d17c3266968f5aeea0d8e57ca84ea0b54a6e25301d1bab684a7c607e5" gracePeriod=10 Mar 19 12:28:48.733253 master-0 kubenswrapper[29612]: I0319 12:28:48.723297 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-562cb-scheduler-0"] Mar 19 12:28:48.829666 master-0 kubenswrapper[29612]: I0319 12:28:48.828686 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:48.988226 master-0 kubenswrapper[29612]: I0319 12:28:48.987624 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-562cb-backup-0"] Mar 19 12:28:49.095454 master-0 kubenswrapper[29612]: 
I0319 12:28:49.095411 29612 generic.go:334] "Generic (PLEG): container finished" podID="d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" containerID="4740809d17c3266968f5aeea0d8e57ca84ea0b54a6e25301d1bab684a7c607e5" exitCode=0 Mar 19 12:28:49.096354 master-0 kubenswrapper[29612]: I0319 12:28:49.095489 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" event={"ID":"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a","Type":"ContainerDied","Data":"4740809d17c3266968f5aeea0d8e57ca84ea0b54a6e25301d1bab684a7c607e5"} Mar 19 12:28:49.096815 master-0 kubenswrapper[29612]: I0319 12:28:49.096688 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-562cb-scheduler-0" podUID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" containerName="probe" containerID="cri-o://6d53d02fe19f678d8261595e2ac1ecd5b4f56d980764d11759be33561b566cbc" gracePeriod=30 Mar 19 12:28:49.096970 master-0 kubenswrapper[29612]: I0319 12:28:49.096511 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-562cb-scheduler-0" podUID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" containerName="cinder-scheduler" containerID="cri-o://dce1b5eed86fe28ef448c5369a1ae7df25615dd2bda48a9ca8ef661f54146028" gracePeriod=30 Mar 19 12:28:49.100257 master-0 kubenswrapper[29612]: I0319 12:28:49.097381 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-562cb-backup-0" podUID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" containerName="cinder-backup" containerID="cri-o://7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53" gracePeriod=30 Mar 19 12:28:49.100257 master-0 kubenswrapper[29612]: I0319 12:28:49.097756 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" podUID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" containerName="cinder-volume" 
containerID="cri-o://dca28992236f4a2ff4ddf8a2b140853b13e1168d6ff1db8be0dc847b7cb49ea9" gracePeriod=30 Mar 19 12:28:49.100257 master-0 kubenswrapper[29612]: I0319 12:28:49.097865 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-562cb-backup-0" podUID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" containerName="probe" containerID="cri-o://dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776" gracePeriod=30 Mar 19 12:28:49.100257 master-0 kubenswrapper[29612]: I0319 12:28:49.097952 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" podUID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" containerName="probe" containerID="cri-o://68b0ac96d3681905a60fe830bce81a9cc181901125df19014603ca9e869985ca" gracePeriod=30 Mar 19 12:28:49.461363 master-0 kubenswrapper[29612]: I0319 12:28:49.461311 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" Mar 19 12:28:49.625686 master-0 kubenswrapper[29612]: I0319 12:28:49.625327 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-config\") pod \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " Mar 19 12:28:49.625686 master-0 kubenswrapper[29612]: I0319 12:28:49.625396 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-svc\") pod \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " Mar 19 12:28:49.625686 master-0 kubenswrapper[29612]: I0319 12:28:49.625416 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-nb\") pod \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " Mar 19 12:28:49.625686 master-0 kubenswrapper[29612]: I0319 12:28:49.625532 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-swift-storage-0\") pod \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " Mar 19 12:28:49.625686 master-0 kubenswrapper[29612]: I0319 12:28:49.625666 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-sb\") pod \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " Mar 19 12:28:49.626058 master-0 kubenswrapper[29612]: I0319 12:28:49.625749 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zt4x\" (UniqueName: \"kubernetes.io/projected/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-kube-api-access-5zt4x\") pod \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\" (UID: \"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a\") " Mar 19 12:28:49.658872 master-0 kubenswrapper[29612]: I0319 12:28:49.656915 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-kube-api-access-5zt4x" (OuterVolumeSpecName: "kube-api-access-5zt4x") pod "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" (UID: "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a"). InnerVolumeSpecName "kube-api-access-5zt4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:28:49.688169 master-0 kubenswrapper[29612]: I0319 12:28:49.688119 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" (UID: "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:49.699143 master-0 kubenswrapper[29612]: I0319 12:28:49.699091 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" (UID: "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:49.723179 master-0 kubenswrapper[29612]: I0319 12:28:49.723129 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-config" (OuterVolumeSpecName: "config") pod "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" (UID: "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:49.728619 master-0 kubenswrapper[29612]: I0319 12:28:49.728564 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:49.728619 master-0 kubenswrapper[29612]: I0319 12:28:49.728614 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:49.728619 master-0 kubenswrapper[29612]: I0319 12:28:49.728645 29612 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:49.728619 master-0 kubenswrapper[29612]: I0319 12:28:49.728679 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zt4x\" (UniqueName: \"kubernetes.io/projected/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-kube-api-access-5zt4x\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:49.750377 master-0 kubenswrapper[29612]: I0319 12:28:49.750294 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" (UID: "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:49.764200 master-0 kubenswrapper[29612]: I0319 12:28:49.763109 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" (UID: "d9264c8b-fb67-4594-b8d4-4f66d1c85d0a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:28:49.830792 master-0 kubenswrapper[29612]: I0319 12:28:49.830648 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:49.830792 master-0 kubenswrapper[29612]: I0319 12:28:49.830695 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.115525 master-0 kubenswrapper[29612]: I0319 12:28:50.115454 29612 generic.go:334] "Generic (PLEG): container finished" podID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" containerID="dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776" exitCode=0 Mar 19 12:28:50.116106 master-0 kubenswrapper[29612]: I0319 12:28:50.115590 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-backup-0" event={"ID":"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56","Type":"ContainerDied","Data":"dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776"} Mar 19 12:28:50.126258 master-0 kubenswrapper[29612]: I0319 12:28:50.126056 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" 
event={"ID":"d9264c8b-fb67-4594-b8d4-4f66d1c85d0a","Type":"ContainerDied","Data":"9852272d8fe63e88d3ded453ef2199f16f18365d2b1cb29e3121468b1efbdb7a"} Mar 19 12:28:50.126258 master-0 kubenswrapper[29612]: I0319 12:28:50.126123 29612 scope.go:117] "RemoveContainer" containerID="4740809d17c3266968f5aeea0d8e57ca84ea0b54a6e25301d1bab684a7c607e5" Mar 19 12:28:50.126355 master-0 kubenswrapper[29612]: I0319 12:28:50.126294 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648f9ff569-fh2n6" Mar 19 12:28:50.158759 master-0 kubenswrapper[29612]: I0319 12:28:50.157029 29612 generic.go:334] "Generic (PLEG): container finished" podID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" containerID="68b0ac96d3681905a60fe830bce81a9cc181901125df19014603ca9e869985ca" exitCode=0 Mar 19 12:28:50.158759 master-0 kubenswrapper[29612]: I0319 12:28:50.157090 29612 generic.go:334] "Generic (PLEG): container finished" podID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" containerID="dca28992236f4a2ff4ddf8a2b140853b13e1168d6ff1db8be0dc847b7cb49ea9" exitCode=0 Mar 19 12:28:50.158759 master-0 kubenswrapper[29612]: I0319 12:28:50.157118 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" event={"ID":"20ea2edf-d47b-48e9-b9de-8dac8f6b1599","Type":"ContainerDied","Data":"68b0ac96d3681905a60fe830bce81a9cc181901125df19014603ca9e869985ca"} Mar 19 12:28:50.158759 master-0 kubenswrapper[29612]: I0319 12:28:50.157154 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" event={"ID":"20ea2edf-d47b-48e9-b9de-8dac8f6b1599","Type":"ContainerDied","Data":"dca28992236f4a2ff4ddf8a2b140853b13e1168d6ff1db8be0dc847b7cb49ea9"} Mar 19 12:28:50.251479 master-0 kubenswrapper[29612]: I0319 12:28:50.251300 29612 scope.go:117] "RemoveContainer" containerID="8e3834bd61a4bf9c10c929f982dcf0025334f699de6ca2c52ffe419a33fe9a02" Mar 19 12:28:50.255197 master-0 kubenswrapper[29612]: I0319 
12:28:50.255164 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:50.310659 master-0 kubenswrapper[29612]: I0319 12:28:50.307048 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-648f9ff569-fh2n6"] Mar 19 12:28:50.317358 master-0 kubenswrapper[29612]: I0319 12:28:50.317306 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-648f9ff569-fh2n6"] Mar 19 12:28:50.345748 master-0 kubenswrapper[29612]: I0319 12:28:50.345176 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2w9x\" (UniqueName: \"kubernetes.io/projected/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-kube-api-access-p2w9x\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.345748 master-0 kubenswrapper[29612]: I0319 12:28:50.345263 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-machine-id\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.345748 master-0 kubenswrapper[29612]: I0319 12:28:50.345293 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.345748 master-0 kubenswrapper[29612]: I0319 12:28:50.345369 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-dev\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.345748 master-0 
kubenswrapper[29612]: I0319 12:28:50.345407 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-cinder\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.345748 master-0 kubenswrapper[29612]: I0319 12:28:50.345444 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-nvme\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.345748 master-0 kubenswrapper[29612]: I0319 12:28:50.345479 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-iscsi\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.345748 master-0 kubenswrapper[29612]: I0319 12:28:50.345526 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-sys\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.345748 master-0 kubenswrapper[29612]: I0319 12:28:50.345571 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-brick\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.345748 master-0 kubenswrapper[29612]: I0319 12:28:50.345687 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-lib-cinder\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.345748 master-0 kubenswrapper[29612]: I0319 12:28:50.345745 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-run\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.346240 master-0 kubenswrapper[29612]: I0319 12:28:50.345801 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-lib-modules\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.346240 master-0 kubenswrapper[29612]: I0319 12:28:50.345828 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data-custom\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.346240 master-0 kubenswrapper[29612]: I0319 12:28:50.345870 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-scripts\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 12:28:50.346240 master-0 kubenswrapper[29612]: I0319 12:28:50.345917 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-combined-ca-bundle\") pod \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\" (UID: \"20ea2edf-d47b-48e9-b9de-8dac8f6b1599\") " Mar 19 
12:28:50.346514 master-0 kubenswrapper[29612]: I0319 12:28:50.346470 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.346784 master-0 kubenswrapper[29612]: I0319 12:28:50.346750 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-sys" (OuterVolumeSpecName: "sys") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.346848 master-0 kubenswrapper[29612]: I0319 12:28:50.346805 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.346848 master-0 kubenswrapper[29612]: I0319 12:28:50.346837 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.346918 master-0 kubenswrapper[29612]: I0319 12:28:50.346864 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-run" (OuterVolumeSpecName: "run") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.346918 master-0 kubenswrapper[29612]: I0319 12:28:50.346894 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.348672 master-0 kubenswrapper[29612]: I0319 12:28:50.347608 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-dev" (OuterVolumeSpecName: "dev") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.348672 master-0 kubenswrapper[29612]: I0319 12:28:50.347689 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.351667 master-0 kubenswrapper[29612]: I0319 12:28:50.351489 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:50.351792 master-0 kubenswrapper[29612]: I0319 12:28:50.351713 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.351792 master-0 kubenswrapper[29612]: I0319 12:28:50.351762 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.355659 master-0 kubenswrapper[29612]: I0319 12:28:50.352529 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-scripts" (OuterVolumeSpecName: "scripts") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:50.361789 master-0 kubenswrapper[29612]: I0319 12:28:50.358958 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-kube-api-access-p2w9x" (OuterVolumeSpecName: "kube-api-access-p2w9x") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "kube-api-access-p2w9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:28:50.428744 master-0 kubenswrapper[29612]: I0319 12:28:50.428678 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472598 29612 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472662 29612 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-run\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472677 29612 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472690 29612 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472700 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472709 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472719 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2w9x\" (UniqueName: \"kubernetes.io/projected/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-kube-api-access-p2w9x\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472729 29612 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472740 29612 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-dev\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472750 29612 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472758 29612 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472766 29612 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472774 29612 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-sys\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.475667 master-0 kubenswrapper[29612]: I0319 12:28:50.472782 29612 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.481155 master-0 kubenswrapper[29612]: I0319 12:28:50.481075 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data" (OuterVolumeSpecName: "config-data") pod "20ea2edf-d47b-48e9-b9de-8dac8f6b1599" (UID: "20ea2edf-d47b-48e9-b9de-8dac8f6b1599"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:50.574675 master-0 kubenswrapper[29612]: I0319 12:28:50.574577 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ea2edf-d47b-48e9-b9de-8dac8f6b1599-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.672560 master-0 kubenswrapper[29612]: I0319 12:28:50.672493 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.778825 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-brick\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.778910 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-dev\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779071 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-lib-cinder\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779129 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-run\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779188 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779231 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-scripts\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779272 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data-custom\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779325 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-iscsi\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779512 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-nvme\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779658 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-sys\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779726 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-lib-modules\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: 
\"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779791 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-cinder\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779901 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-machine-id\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.779960 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpxjg\" (UniqueName: \"kubernetes.io/projected/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-kube-api-access-gpxjg\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.780566 master-0 kubenswrapper[29612]: I0319 12:28:50.780035 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-combined-ca-bundle\") pod \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\" (UID: \"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56\") " Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.785496 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.785580 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-sys" (OuterVolumeSpecName: "sys") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.785592 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.785623 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.785665 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.785689 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.785693 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-dev" (OuterVolumeSpecName: "dev") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.785717 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-run" (OuterVolumeSpecName: "run") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.785739 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.785793 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.789160 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-kube-api-access-gpxjg" (OuterVolumeSpecName: "kube-api-access-gpxjg") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "kube-api-access-gpxjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.789514 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-scripts" (OuterVolumeSpecName: "scripts") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:50.799478 master-0 kubenswrapper[29612]: I0319 12:28:50.789825 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:50.838424 master-0 kubenswrapper[29612]: I0319 12:28:50.838355 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:50.882776 master-0 kubenswrapper[29612]: I0319 12:28:50.882656 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.882776 master-0 kubenswrapper[29612]: I0319 12:28:50.882707 29612 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.882776 master-0 kubenswrapper[29612]: I0319 12:28:50.882722 29612 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-dev\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.882776 master-0 kubenswrapper[29612]: I0319 12:28:50.882733 29612 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.882776 master-0 kubenswrapper[29612]: I0319 12:28:50.882745 29612 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-run\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.882776 master-0 kubenswrapper[29612]: I0319 12:28:50.882755 29612 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.882776 master-0 kubenswrapper[29612]: I0319 12:28:50.882765 29612 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.882776 master-0 kubenswrapper[29612]: I0319 12:28:50.882776 29612 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.882776 master-0 kubenswrapper[29612]: I0319 12:28:50.882787 29612 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.882776 master-0 kubenswrapper[29612]: I0319 12:28:50.882797 29612 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-sys\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.883325 master-0 kubenswrapper[29612]: I0319 12:28:50.882806 29612 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.883325 master-0 kubenswrapper[29612]: I0319 12:28:50.882815 29612 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.883325 master-0 kubenswrapper[29612]: I0319 12:28:50.882823 29612 reconciler_common.go:293] "Volume detached for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.883325 master-0 kubenswrapper[29612]: I0319 12:28:50.882834 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpxjg\" (UniqueName: \"kubernetes.io/projected/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-kube-api-access-gpxjg\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:50.913780 master-0 kubenswrapper[29612]: I0319 12:28:50.913612 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data" (OuterVolumeSpecName: "config-data") pod "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" (UID: "dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:50.929109 master-0 kubenswrapper[29612]: I0319 12:28:50.929061 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" path="/var/lib/kubelet/pods/d9264c8b-fb67-4594-b8d4-4f66d1c85d0a/volumes" Mar 19 12:28:50.984361 master-0 kubenswrapper[29612]: I0319 12:28:50.984299 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:51.169818 master-0 kubenswrapper[29612]: I0319 12:28:51.169647 29612 generic.go:334] "Generic (PLEG): container finished" podID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" containerID="6d53d02fe19f678d8261595e2ac1ecd5b4f56d980764d11759be33561b566cbc" exitCode=0 Mar 19 12:28:51.169818 master-0 kubenswrapper[29612]: I0319 12:28:51.169728 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-scheduler-0" 
event={"ID":"925bacb7-bb6c-4920-9e82-e20e8fad09c6","Type":"ContainerDied","Data":"6d53d02fe19f678d8261595e2ac1ecd5b4f56d980764d11759be33561b566cbc"} Mar 19 12:28:51.172529 master-0 kubenswrapper[29612]: I0319 12:28:51.172482 29612 generic.go:334] "Generic (PLEG): container finished" podID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" containerID="7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53" exitCode=0 Mar 19 12:28:51.172603 master-0 kubenswrapper[29612]: I0319 12:28:51.172544 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-backup-0" event={"ID":"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56","Type":"ContainerDied","Data":"7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53"} Mar 19 12:28:51.172603 master-0 kubenswrapper[29612]: I0319 12:28:51.172572 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-backup-0" event={"ID":"dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56","Type":"ContainerDied","Data":"1a5955a31cfdf08b068c5b3a1e36fd8273a90cb35c85732e1021520569965c18"} Mar 19 12:28:51.172603 master-0 kubenswrapper[29612]: I0319 12:28:51.172588 29612 scope.go:117] "RemoveContainer" containerID="dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776" Mar 19 12:28:51.172755 master-0 kubenswrapper[29612]: I0319 12:28:51.172730 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.198204 master-0 kubenswrapper[29612]: I0319 12:28:51.198136 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" event={"ID":"20ea2edf-d47b-48e9-b9de-8dac8f6b1599","Type":"ContainerDied","Data":"af75810c18c1c62f1fd77ba1d08b8dc6af8703542da0051daa6ad4fd3685aecc"} Mar 19 12:28:51.198467 master-0 kubenswrapper[29612]: I0319 12:28:51.198208 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.200780 master-0 kubenswrapper[29612]: I0319 12:28:51.200743 29612 scope.go:117] "RemoveContainer" containerID="7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53" Mar 19 12:28:51.245796 master-0 kubenswrapper[29612]: I0319 12:28:51.245193 29612 scope.go:117] "RemoveContainer" containerID="dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776" Mar 19 12:28:51.245897 master-0 kubenswrapper[29612]: E0319 12:28:51.245851 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776\": container with ID starting with dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776 not found: ID does not exist" containerID="dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776" Mar 19 12:28:51.245941 master-0 kubenswrapper[29612]: I0319 12:28:51.245906 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776"} err="failed to get container status \"dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776\": rpc error: code = NotFound desc = could not find container \"dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776\": container with ID starting with dbd5a1aa15a0e1fa7e1250ef4b09a1c6701718c7b34946ffe35e2e7bec9a1776 not found: ID does not exist" Mar 19 12:28:51.245941 master-0 kubenswrapper[29612]: I0319 12:28:51.245934 29612 scope.go:117] "RemoveContainer" containerID="7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53" Mar 19 12:28:51.247083 master-0 kubenswrapper[29612]: I0319 12:28:51.246043 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-562cb-backup-0"] Mar 19 12:28:51.247083 master-0 kubenswrapper[29612]: E0319 12:28:51.246556 29612 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53\": container with ID starting with 7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53 not found: ID does not exist" containerID="7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53" Mar 19 12:28:51.247083 master-0 kubenswrapper[29612]: I0319 12:28:51.246578 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53"} err="failed to get container status \"7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53\": rpc error: code = NotFound desc = could not find container \"7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53\": container with ID starting with 7a80baea0c83194fb86aa99b6fd283efb4e2ec78dbc2dd099b0b234f1513ea53 not found: ID does not exist" Mar 19 12:28:51.247083 master-0 kubenswrapper[29612]: I0319 12:28:51.246593 29612 scope.go:117] "RemoveContainer" containerID="68b0ac96d3681905a60fe830bce81a9cc181901125df19014603ca9e869985ca" Mar 19 12:28:51.282269 master-0 kubenswrapper[29612]: I0319 12:28:51.282157 29612 scope.go:117] "RemoveContainer" containerID="dca28992236f4a2ff4ddf8a2b140853b13e1168d6ff1db8be0dc847b7cb49ea9" Mar 19 12:28:51.328803 master-0 kubenswrapper[29612]: I0319 12:28:51.328729 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-562cb-backup-0"] Mar 19 12:28:51.350516 master-0 kubenswrapper[29612]: I0319 12:28:51.350433 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-562cb-volume-lvm-iscsi-0"] Mar 19 12:28:51.395825 master-0 kubenswrapper[29612]: I0319 12:28:51.395768 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-562cb-volume-lvm-iscsi-0"] Mar 19 12:28:51.405175 master-0 kubenswrapper[29612]: I0319 12:28:51.405104 
29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-562cb-backup-0"] Mar 19 12:28:51.405680 master-0 kubenswrapper[29612]: E0319 12:28:51.405653 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" containerName="dnsmasq-dns" Mar 19 12:28:51.405680 master-0 kubenswrapper[29612]: I0319 12:28:51.405674 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" containerName="dnsmasq-dns" Mar 19 12:28:51.405771 master-0 kubenswrapper[29612]: E0319 12:28:51.405686 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" containerName="cinder-backup" Mar 19 12:28:51.405771 master-0 kubenswrapper[29612]: I0319 12:28:51.405693 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" containerName="cinder-backup" Mar 19 12:28:51.405771 master-0 kubenswrapper[29612]: E0319 12:28:51.405705 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" containerName="probe" Mar 19 12:28:51.405771 master-0 kubenswrapper[29612]: I0319 12:28:51.405711 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" containerName="probe" Mar 19 12:28:51.405771 master-0 kubenswrapper[29612]: E0319 12:28:51.405727 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" containerName="cinder-volume" Mar 19 12:28:51.405771 master-0 kubenswrapper[29612]: I0319 12:28:51.405733 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" containerName="cinder-volume" Mar 19 12:28:51.405771 master-0 kubenswrapper[29612]: E0319 12:28:51.405766 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" containerName="probe" Mar 19 12:28:51.405771 master-0 
kubenswrapper[29612]: I0319 12:28:51.405772 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" containerName="probe" Mar 19 12:28:51.406010 master-0 kubenswrapper[29612]: E0319 12:28:51.405786 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" containerName="init" Mar 19 12:28:51.406010 master-0 kubenswrapper[29612]: I0319 12:28:51.405793 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" containerName="init" Mar 19 12:28:51.406074 master-0 kubenswrapper[29612]: I0319 12:28:51.406031 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" containerName="cinder-volume" Mar 19 12:28:51.406074 master-0 kubenswrapper[29612]: I0319 12:28:51.406055 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9264c8b-fb67-4594-b8d4-4f66d1c85d0a" containerName="dnsmasq-dns" Mar 19 12:28:51.406130 master-0 kubenswrapper[29612]: I0319 12:28:51.406109 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" containerName="probe" Mar 19 12:28:51.406130 master-0 kubenswrapper[29612]: I0319 12:28:51.406128 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" containerName="probe" Mar 19 12:28:51.406187 master-0 kubenswrapper[29612]: I0319 12:28:51.406157 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" containerName="cinder-backup" Mar 19 12:28:51.407427 master-0 kubenswrapper[29612]: I0319 12:28:51.407392 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.413582 master-0 kubenswrapper[29612]: I0319 12:28:51.413238 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-backup-config-data" Mar 19 12:28:51.414902 master-0 kubenswrapper[29612]: I0319 12:28:51.414285 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-562cb-volume-lvm-iscsi-0"] Mar 19 12:28:51.419119 master-0 kubenswrapper[29612]: I0319 12:28:51.418939 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.424553 master-0 kubenswrapper[29612]: I0319 12:28:51.423328 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-volume-lvm-iscsi-config-data" Mar 19 12:28:51.424553 master-0 kubenswrapper[29612]: I0319 12:28:51.423436 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-backup-0"] Mar 19 12:28:51.432451 master-0 kubenswrapper[29612]: I0319 12:28:51.432367 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-volume-lvm-iscsi-0"] Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.503842 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-var-locks-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.503905 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-sys\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " 
pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.503941 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-etc-iscsi\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.503977 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-run\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504005 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-sys\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504021 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-combined-ca-bundle\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504053 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-config-data\") pod \"cinder-562cb-backup-0\" (UID: 
\"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504076 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-etc-iscsi\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504107 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-var-lib-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504131 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-var-lib-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504149 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-config-data-custom\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504168 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2w4r\" (UniqueName: 
\"kubernetes.io/projected/ba50f14f-7592-416f-857b-a32b0149351e-kube-api-access-h2w4r\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504188 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-var-locks-brick\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504205 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-scripts\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504218 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-lib-modules\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504237 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-config-data\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504261 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-etc-nvme\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504282 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-etc-nvme\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504301 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-dev\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504332 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-dev\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504349 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-config-data-custom\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504362 29612 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-run\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504379 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-etc-machine-id\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504398 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-var-locks-brick\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504420 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-scripts\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504437 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89fn5\" (UniqueName: \"kubernetes.io/projected/f89781aa-829f-40c6-8990-671e33cb036e-kube-api-access-89fn5\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 
master-0 kubenswrapper[29612]: I0319 12:28:51.504458 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-etc-machine-id\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504475 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-var-locks-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504493 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-lib-modules\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.504717 master-0 kubenswrapper[29612]: I0319 12:28:51.504507 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-combined-ca-bundle\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:51.606954 master-0 kubenswrapper[29612]: I0319 12:28:51.606842 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-dev\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0" 
Mar 19 12:28:51.607213 master-0 kubenswrapper[29612]: I0319 12:28:51.606990 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-config-data-custom\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.607213 master-0 kubenswrapper[29612]: I0319 12:28:51.607023 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-dev\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.607213 master-0 kubenswrapper[29612]: I0319 12:28:51.607051 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-run\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.607213 master-0 kubenswrapper[29612]: I0319 12:28:51.607196 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-etc-machine-id\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.607563 master-0 kubenswrapper[29612]: I0319 12:28:51.607277 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-var-locks-brick\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.607563 master-0 kubenswrapper[29612]: I0319 12:28:51.607327 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-run\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.607563 master-0 kubenswrapper[29612]: I0319 12:28:51.607382 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-scripts\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.607563 master-0 kubenswrapper[29612]: I0319 12:28:51.607493 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89fn5\" (UniqueName: \"kubernetes.io/projected/f89781aa-829f-40c6-8990-671e33cb036e-kube-api-access-89fn5\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.607870 master-0 kubenswrapper[29612]: I0319 12:28:51.607752 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-etc-machine-id\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.607870 master-0 kubenswrapper[29612]: I0319 12:28:51.607849 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-etc-machine-id\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.607990 master-0 kubenswrapper[29612]: I0319 12:28:51.607891 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-var-locks-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.607990 master-0 kubenswrapper[29612]: I0319 12:28:51.607929 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-lib-modules\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.607990 master-0 kubenswrapper[29612]: I0319 12:28:51.607952 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-combined-ca-bundle\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.608134 master-0 kubenswrapper[29612]: I0319 12:28:51.608072 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-var-locks-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608134 master-0 kubenswrapper[29612]: I0319 12:28:51.608101 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-sys\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.608236 master-0 kubenswrapper[29612]: I0319 12:28:51.608182 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-etc-iscsi\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608284 master-0 kubenswrapper[29612]: I0319 12:28:51.608267 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-run\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.608336 master-0 kubenswrapper[29612]: I0319 12:28:51.608324 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-sys\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608396 master-0 kubenswrapper[29612]: I0319 12:28:51.608385 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-run\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.608445 master-0 kubenswrapper[29612]: I0319 12:28:51.608406 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-combined-ca-bundle\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608497 master-0 kubenswrapper[29612]: I0319 12:28:51.608465 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-config-data\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.608497 master-0 kubenswrapper[29612]: I0319 12:28:51.608485 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-etc-iscsi\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.608591 master-0 kubenswrapper[29612]: I0319 12:28:51.608531 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-var-lib-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.608591 master-0 kubenswrapper[29612]: I0319 12:28:51.608571 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-var-lib-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608710 master-0 kubenswrapper[29612]: I0319 12:28:51.608595 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-config-data-custom\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608710 master-0 kubenswrapper[29612]: I0319 12:28:51.608626 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2w4r\" (UniqueName: \"kubernetes.io/projected/ba50f14f-7592-416f-857b-a32b0149351e-kube-api-access-h2w4r\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.608710 master-0 kubenswrapper[29612]: I0319 12:28:51.608669 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-var-locks-brick\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608710 master-0 kubenswrapper[29612]: I0319 12:28:51.608699 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-scripts\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608872 master-0 kubenswrapper[29612]: I0319 12:28:51.608721 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-lib-modules\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608872 master-0 kubenswrapper[29612]: I0319 12:28:51.608761 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-config-data\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608872 master-0 kubenswrapper[29612]: I0319 12:28:51.608800 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-etc-nvme\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.608872 master-0 kubenswrapper[29612]: I0319 12:28:51.608829 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-etc-nvme\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.608872 master-0 kubenswrapper[29612]: I0319 12:28:51.608856 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-dev\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.609094 master-0 kubenswrapper[29612]: I0319 12:28:51.608979 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-dev\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.609094 master-0 kubenswrapper[29612]: I0319 12:28:51.609042 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-var-locks-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.609094 master-0 kubenswrapper[29612]: I0319 12:28:51.609072 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-sys\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.609234 master-0 kubenswrapper[29612]: I0319 12:28:51.609100 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-etc-iscsi\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.610967 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-scripts\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.611085 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-var-locks-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.611146 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-var-locks-brick\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.611188 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-etc-machine-id\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.611692 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-lib-modules\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.611747 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-lib-modules\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.611782 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-var-lib-cinder\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.611809 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-etc-nvme\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.611827 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-etc-iscsi\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.611818 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/ba50f14f-7592-416f-857b-a32b0149351e-var-lib-cinder\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.611869 master-0 kubenswrapper[29612]: I0319 12:28:51.611874 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-sys\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.612649 master-0 kubenswrapper[29612]: I0319 12:28:51.611921 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-var-locks-brick\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.612649 master-0 kubenswrapper[29612]: I0319 12:28:51.611980 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f89781aa-829f-40c6-8990-671e33cb036e-etc-nvme\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.612764 master-0 kubenswrapper[29612]: I0319 12:28:51.612668 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-combined-ca-bundle\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.614585 master-0 kubenswrapper[29612]: I0319 12:28:51.614249 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-combined-ca-bundle\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.614939 master-0 kubenswrapper[29612]: I0319 12:28:51.614902 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-config-data-custom\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.616113 master-0 kubenswrapper[29612]: I0319 12:28:51.616086 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-scripts\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.616542 master-0 kubenswrapper[29612]: I0319 12:28:51.616500 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba50f14f-7592-416f-857b-a32b0149351e-config-data\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.624795 master-0 kubenswrapper[29612]: I0319 12:28:51.623836 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-config-data-custom\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.624795 master-0 kubenswrapper[29612]: I0319 12:28:51.624359 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f89781aa-829f-40c6-8990-671e33cb036e-config-data\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.628517 master-0 kubenswrapper[29612]: I0319 12:28:51.628457 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89fn5\" (UniqueName: \"kubernetes.io/projected/f89781aa-829f-40c6-8990-671e33cb036e-kube-api-access-89fn5\") pod \"cinder-562cb-volume-lvm-iscsi-0\" (UID: \"f89781aa-829f-40c6-8990-671e33cb036e\") " pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:51.634835 master-0 kubenswrapper[29612]: I0319 12:28:51.634786 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2w4r\" (UniqueName: \"kubernetes.io/projected/ba50f14f-7592-416f-857b-a32b0149351e-kube-api-access-h2w4r\") pod \"cinder-562cb-backup-0\" (UID: \"ba50f14f-7592-416f-857b-a32b0149351e\") " pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.746662 master-0 kubenswrapper[29612]: I0319 12:28:51.741581 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-backup-0"
Mar 19 12:28:51.765658 master-0 kubenswrapper[29612]: I0319 12:28:51.763459 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-volume-lvm-iscsi-0"
Mar 19 12:28:52.319123 master-0 kubenswrapper[29612]: I0319 12:28:52.318506 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:52.326623 master-0 kubenswrapper[29612]: I0319 12:28:52.326579 29612 generic.go:334] "Generic (PLEG): container finished" podID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" containerID="dce1b5eed86fe28ef448c5369a1ae7df25615dd2bda48a9ca8ef661f54146028" exitCode=0
Mar 19 12:28:52.326909 master-0 kubenswrapper[29612]: I0319 12:28:52.326890 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-scheduler-0" event={"ID":"925bacb7-bb6c-4920-9e82-e20e8fad09c6","Type":"ContainerDied","Data":"dce1b5eed86fe28ef448c5369a1ae7df25615dd2bda48a9ca8ef661f54146028"}
Mar 19 12:28:52.327005 master-0 kubenswrapper[29612]: I0319 12:28:52.326991 29612 scope.go:117] "RemoveContainer" containerID="6d53d02fe19f678d8261595e2ac1ecd5b4f56d980764d11759be33561b566cbc"
Mar 19 12:28:52.420329 master-0 kubenswrapper[29612]: I0319 12:28:52.420281 29612 scope.go:117] "RemoveContainer" containerID="dce1b5eed86fe28ef448c5369a1ae7df25615dd2bda48a9ca8ef661f54146028"
Mar 19 12:28:52.447171 master-0 kubenswrapper[29612]: I0319 12:28:52.446514 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-combined-ca-bundle\") pod \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") "
Mar 19 12:28:52.447171 master-0 kubenswrapper[29612]: I0319 12:28:52.446597 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv66k\" (UniqueName: \"kubernetes.io/projected/925bacb7-bb6c-4920-9e82-e20e8fad09c6-kube-api-access-xv66k\") pod \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") "
Mar 19 12:28:52.447171 master-0 kubenswrapper[29612]: I0319 12:28:52.446631 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data-custom\") pod \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") "
Mar 19 12:28:52.447171 master-0 kubenswrapper[29612]: I0319 12:28:52.446681 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-scripts\") pod \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") "
Mar 19 12:28:52.447171 master-0 kubenswrapper[29612]: I0319 12:28:52.446798 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/925bacb7-bb6c-4920-9e82-e20e8fad09c6-etc-machine-id\") pod \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") "
Mar 19 12:28:52.447171 master-0 kubenswrapper[29612]: I0319 12:28:52.446929 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data\") pod \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\" (UID: \"925bacb7-bb6c-4920-9e82-e20e8fad09c6\") "
Mar 19 12:28:52.452784 master-0 kubenswrapper[29612]: I0319 12:28:52.452649 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "925bacb7-bb6c-4920-9e82-e20e8fad09c6" (UID: "925bacb7-bb6c-4920-9e82-e20e8fad09c6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:52.454719 master-0 kubenswrapper[29612]: I0319 12:28:52.454645 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/925bacb7-bb6c-4920-9e82-e20e8fad09c6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "925bacb7-bb6c-4920-9e82-e20e8fad09c6" (UID: "925bacb7-bb6c-4920-9e82-e20e8fad09c6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 12:28:52.473509 master-0 kubenswrapper[29612]: I0319 12:28:52.472586 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925bacb7-bb6c-4920-9e82-e20e8fad09c6-kube-api-access-xv66k" (OuterVolumeSpecName: "kube-api-access-xv66k") pod "925bacb7-bb6c-4920-9e82-e20e8fad09c6" (UID: "925bacb7-bb6c-4920-9e82-e20e8fad09c6"). InnerVolumeSpecName "kube-api-access-xv66k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:28:52.474751 master-0 kubenswrapper[29612]: I0319 12:28:52.474729 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-scripts" (OuterVolumeSpecName: "scripts") pod "925bacb7-bb6c-4920-9e82-e20e8fad09c6" (UID: "925bacb7-bb6c-4920-9e82-e20e8fad09c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:52.560980 master-0 kubenswrapper[29612]: I0319 12:28:52.560911 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "925bacb7-bb6c-4920-9e82-e20e8fad09c6" (UID: "925bacb7-bb6c-4920-9e82-e20e8fad09c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:52.566473 master-0 kubenswrapper[29612]: I0319 12:28:52.566422 29612 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/925bacb7-bb6c-4920-9e82-e20e8fad09c6-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:52.566473 master-0 kubenswrapper[29612]: I0319 12:28:52.566471 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:52.566759 master-0 kubenswrapper[29612]: I0319 12:28:52.566484 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv66k\" (UniqueName: \"kubernetes.io/projected/925bacb7-bb6c-4920-9e82-e20e8fad09c6-kube-api-access-xv66k\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:52.566759 master-0 kubenswrapper[29612]: I0319 12:28:52.566496 29612 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:52.566759 master-0 kubenswrapper[29612]: I0319 12:28:52.566505 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:52.571124 master-0 kubenswrapper[29612]: I0319 12:28:52.571022 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-backup-0"]
Mar 19 12:28:52.592961 master-0 kubenswrapper[29612]: W0319 12:28:52.592896 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba50f14f_7592_416f_857b_a32b0149351e.slice/crio-ebe00f6d4f13bd0101153595d87c4026d90b046f293d959e0f8eb29fb76b9de3 WatchSource:0}: Error finding container ebe00f6d4f13bd0101153595d87c4026d90b046f293d959e0f8eb29fb76b9de3: Status 404 returned error can't find the container with id ebe00f6d4f13bd0101153595d87c4026d90b046f293d959e0f8eb29fb76b9de3
Mar 19 12:28:52.609993 master-0 kubenswrapper[29612]: I0319 12:28:52.609915 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data" (OuterVolumeSpecName: "config-data") pod "925bacb7-bb6c-4920-9e82-e20e8fad09c6" (UID: "925bacb7-bb6c-4920-9e82-e20e8fad09c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:28:52.654790 master-0 kubenswrapper[29612]: I0319 12:28:52.654698 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-volume-lvm-iscsi-0"]
Mar 19 12:28:52.661842 master-0 kubenswrapper[29612]: W0319 12:28:52.661782 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf89781aa_829f_40c6_8990_671e33cb036e.slice/crio-4f0ca47071c26d4d3b0aa3085a3e304f30bde0a0ea18c34752a6b9eaa845e278 WatchSource:0}: Error finding container 4f0ca47071c26d4d3b0aa3085a3e304f30bde0a0ea18c34752a6b9eaa845e278: Status 404 returned error can't find the container with id 4f0ca47071c26d4d3b0aa3085a3e304f30bde0a0ea18c34752a6b9eaa845e278
Mar 19 12:28:52.669102 master-0 kubenswrapper[29612]: I0319 12:28:52.669039 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/925bacb7-bb6c-4920-9e82-e20e8fad09c6-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:28:52.923032 master-0 kubenswrapper[29612]: I0319 12:28:52.922976 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ea2edf-d47b-48e9-b9de-8dac8f6b1599" path="/var/lib/kubelet/pods/20ea2edf-d47b-48e9-b9de-8dac8f6b1599/volumes"
Mar 19 12:28:52.923820 master-0 kubenswrapper[29612]: I0319 12:28:52.923795 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56" path="/var/lib/kubelet/pods/dc613b5c-2eaa-45fa-a8fa-ae2a4840ad56/volumes"
Mar 19 12:28:53.369220 master-0 kubenswrapper[29612]: I0319 12:28:53.369108 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-scheduler-0" event={"ID":"925bacb7-bb6c-4920-9e82-e20e8fad09c6","Type":"ContainerDied","Data":"c4f2c7e78cd0dab4eb2dd3aa1a2e115fdab0c94e0c9537f158f5e154e86404b7"}
Mar 19 12:28:53.369220 master-0 kubenswrapper[29612]: I0319 12:28:53.369204 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:53.383723 master-0 kubenswrapper[29612]: I0319 12:28:53.383294 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-backup-0" event={"ID":"ba50f14f-7592-416f-857b-a32b0149351e","Type":"ContainerStarted","Data":"835e8c7dae56a265d217d4223b0f478a7ce4285255a9d8a072abee39991f4cac"}
Mar 19 12:28:53.383723 master-0 kubenswrapper[29612]: I0319 12:28:53.383345 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-backup-0" event={"ID":"ba50f14f-7592-416f-857b-a32b0149351e","Type":"ContainerStarted","Data":"797f412d4211d13cc620fef81849f5cee66dd559e7c279418cd2d2af853860d5"}
Mar 19 12:28:53.383723 master-0 kubenswrapper[29612]: I0319 12:28:53.383357 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-backup-0" event={"ID":"ba50f14f-7592-416f-857b-a32b0149351e","Type":"ContainerStarted","Data":"ebe00f6d4f13bd0101153595d87c4026d90b046f293d959e0f8eb29fb76b9de3"}
Mar 19 12:28:53.391220 master-0 kubenswrapper[29612]: I0319 12:28:53.389593 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" event={"ID":"f89781aa-829f-40c6-8990-671e33cb036e","Type":"ContainerStarted","Data":"4675a6ca4378e01850dbe470b472a698d5a6479431cfd6b9d050f5dbf1a1ff1c"}
Mar 19 12:28:53.391220 master-0 kubenswrapper[29612]: I0319 12:28:53.389678 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" event={"ID":"f89781aa-829f-40c6-8990-671e33cb036e","Type":"ContainerStarted","Data":"5419ed300de56968d00d9278764cfe0fe2852daa2ea12cbf95c91ede26c5d235"}
Mar 19 12:28:53.391220 master-0 kubenswrapper[29612]: I0319 12:28:53.389689 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" event={"ID":"f89781aa-829f-40c6-8990-671e33cb036e","Type":"ContainerStarted","Data":"4f0ca47071c26d4d3b0aa3085a3e304f30bde0a0ea18c34752a6b9eaa845e278"}
Mar 19 12:28:53.456586 master-0 kubenswrapper[29612]: I0319 12:28:53.456518 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-562cb-scheduler-0"]
Mar 19 12:28:53.518663 master-0 kubenswrapper[29612]: I0319 12:28:53.517829 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-562cb-scheduler-0"]
Mar 19 12:28:53.538723 master-0 kubenswrapper[29612]: I0319 12:28:53.529117 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" podStartSLOduration=2.529091885 podStartE2EDuration="2.529091885s" podCreationTimestamp="2026-03-19 12:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:53.525665247 +0000 UTC m=+1140.839834268" watchObservedRunningTime="2026-03-19 12:28:53.529091885 +0000 UTC m=+1140.843260916"
Mar 19 12:28:53.565660 master-0 kubenswrapper[29612]: I0319 12:28:53.565547 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-562cb-scheduler-0"]
Mar 19 12:28:53.593801 master-0 kubenswrapper[29612]: E0319 12:28:53.566194 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" containerName="cinder-scheduler"
Mar 19 12:28:53.593801 master-0 kubenswrapper[29612]: I0319 12:28:53.566223 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" containerName="cinder-scheduler"
Mar 19 12:28:53.593801 master-0 kubenswrapper[29612]: E0319 12:28:53.566266 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" containerName="probe"
Mar 19 12:28:53.593801 master-0 kubenswrapper[29612]: I0319 12:28:53.566276 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" containerName="probe"
Mar 19 12:28:53.593801 master-0 kubenswrapper[29612]: I0319 12:28:53.566545 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" containerName="probe"
Mar 19 12:28:53.593801 master-0 kubenswrapper[29612]: I0319 12:28:53.566613 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" containerName="cinder-scheduler"
Mar 19 12:28:53.593801 master-0 kubenswrapper[29612]: I0319 12:28:53.568254 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:53.593801 master-0 kubenswrapper[29612]: I0319 12:28:53.573165 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-562cb-scheduler-config-data"
Mar 19 12:28:53.593801 master-0 kubenswrapper[29612]: I0319 12:28:53.593373 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-scheduler-0"]
Mar 19 12:28:53.725658 master-0 kubenswrapper[29612]: I0319 12:28:53.723887 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-config-data-custom\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:53.725658 master-0 kubenswrapper[29612]: I0319 12:28:53.723996 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j78rz\" (UniqueName: \"kubernetes.io/projected/61a20cf7-7849-4f23-80b4-1189ca420061-kube-api-access-j78rz\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:53.725658 master-0 kubenswrapper[29612]: I0319 12:28:53.724030 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-combined-ca-bundle\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:53.725658 master-0 kubenswrapper[29612]: I0319 12:28:53.724069 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61a20cf7-7849-4f23-80b4-1189ca420061-etc-machine-id\") pod 
\"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.725658 master-0 kubenswrapper[29612]: I0319 12:28:53.724089 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-config-data\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.725658 master-0 kubenswrapper[29612]: I0319 12:28:53.724136 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-scripts\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.827164 master-0 kubenswrapper[29612]: I0319 12:28:53.827081 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-config-data-custom\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.828455 master-0 kubenswrapper[29612]: I0319 12:28:53.827800 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j78rz\" (UniqueName: \"kubernetes.io/projected/61a20cf7-7849-4f23-80b4-1189ca420061-kube-api-access-j78rz\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.828455 master-0 kubenswrapper[29612]: I0319 12:28:53.827843 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-combined-ca-bundle\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.828455 master-0 kubenswrapper[29612]: I0319 12:28:53.827907 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61a20cf7-7849-4f23-80b4-1189ca420061-etc-machine-id\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.828455 master-0 kubenswrapper[29612]: I0319 12:28:53.827934 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-config-data\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.828455 master-0 kubenswrapper[29612]: I0319 12:28:53.828005 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-scripts\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.829087 master-0 kubenswrapper[29612]: I0319 12:28:53.828900 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61a20cf7-7849-4f23-80b4-1189ca420061-etc-machine-id\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.832362 master-0 kubenswrapper[29612]: I0319 12:28:53.830677 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-config-data-custom\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.832362 master-0 kubenswrapper[29612]: I0319 12:28:53.831978 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-combined-ca-bundle\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.832362 master-0 kubenswrapper[29612]: I0319 12:28:53.832321 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-scripts\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.835314 master-0 kubenswrapper[29612]: I0319 12:28:53.835199 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a20cf7-7849-4f23-80b4-1189ca420061-config-data\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.851282 master-0 kubenswrapper[29612]: I0319 12:28:53.851217 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j78rz\" (UniqueName: \"kubernetes.io/projected/61a20cf7-7849-4f23-80b4-1189ca420061-kube-api-access-j78rz\") pod \"cinder-562cb-scheduler-0\" (UID: \"61a20cf7-7849-4f23-80b4-1189ca420061\") " pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:53.960510 master-0 kubenswrapper[29612]: I0319 12:28:53.960416 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:28:54.494201 master-0 kubenswrapper[29612]: I0319 12:28:54.494080 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-562cb-backup-0" podStartSLOduration=3.494052176 podStartE2EDuration="3.494052176s" podCreationTimestamp="2026-03-19 12:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:54.433041392 +0000 UTC m=+1141.747210423" watchObservedRunningTime="2026-03-19 12:28:54.494052176 +0000 UTC m=+1141.808221197" Mar 19 12:28:54.556534 master-0 kubenswrapper[29612]: I0319 12:28:54.556209 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-562cb-scheduler-0"] Mar 19 12:28:54.946610 master-0 kubenswrapper[29612]: I0319 12:28:54.946540 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925bacb7-bb6c-4920-9e82-e20e8fad09c6" path="/var/lib/kubelet/pods/925bacb7-bb6c-4920-9e82-e20e8fad09c6/volumes" Mar 19 12:28:55.452146 master-0 kubenswrapper[29612]: I0319 12:28:55.449394 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-scheduler-0" event={"ID":"61a20cf7-7849-4f23-80b4-1189ca420061","Type":"ContainerStarted","Data":"2c454938a437a0b4c7dd1b660067d572baedd5b10e9efa24f1d93449120d1f68"} Mar 19 12:28:55.452146 master-0 kubenswrapper[29612]: I0319 12:28:55.449450 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-scheduler-0" event={"ID":"61a20cf7-7849-4f23-80b4-1189ca420061","Type":"ContainerStarted","Data":"b03d594b752bb72267664f5ab682312bcb0880d78d4f02f891a9496bc857fd07"} Mar 19 12:28:55.458507 master-0 kubenswrapper[29612]: I0319 12:28:55.458374 29612 generic.go:334] "Generic (PLEG): container finished" podID="6dd57bb3-eafc-41af-a7a0-794cd736adaa" containerID="d2fa3995bcd0e93d8313db25ff149ead5cd7e66149e130835cc71f793264989b" 
exitCode=0 Mar 19 12:28:55.458813 master-0 kubenswrapper[29612]: I0319 12:28:55.458752 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-w5dht" event={"ID":"6dd57bb3-eafc-41af-a7a0-794cd736adaa","Type":"ContainerDied","Data":"d2fa3995bcd0e93d8313db25ff149ead5cd7e66149e130835cc71f793264989b"} Mar 19 12:28:56.479493 master-0 kubenswrapper[29612]: I0319 12:28:56.478340 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-562cb-scheduler-0" event={"ID":"61a20cf7-7849-4f23-80b4-1189ca420061","Type":"ContainerStarted","Data":"f55b870a9eb7ddc035cc204232b367eac464a779f95a3345fe4af5933765e596"} Mar 19 12:28:56.508681 master-0 kubenswrapper[29612]: I0319 12:28:56.505285 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-562cb-scheduler-0" podStartSLOduration=3.505263368 podStartE2EDuration="3.505263368s" podCreationTimestamp="2026-03-19 12:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:28:56.502252282 +0000 UTC m=+1143.816421313" watchObservedRunningTime="2026-03-19 12:28:56.505263368 +0000 UTC m=+1143.819432389" Mar 19 12:28:56.747824 master-0 kubenswrapper[29612]: I0319 12:28:56.747068 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-562cb-backup-0" Mar 19 12:28:56.764846 master-0 kubenswrapper[29612]: I0319 12:28:56.764520 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:28:57.016367 master-0 kubenswrapper[29612]: I0319 12:28:57.016188 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:57.178451 master-0 kubenswrapper[29612]: I0319 12:28:57.178379 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data\") pod \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " Mar 19 12:28:57.178963 master-0 kubenswrapper[29612]: I0319 12:28:57.178519 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-combined-ca-bundle\") pod \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " Mar 19 12:28:57.178963 master-0 kubenswrapper[29612]: I0319 12:28:57.178579 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data-merged\") pod \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " Mar 19 12:28:57.178963 master-0 kubenswrapper[29612]: I0319 12:28:57.178615 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4g76\" (UniqueName: \"kubernetes.io/projected/6dd57bb3-eafc-41af-a7a0-794cd736adaa-kube-api-access-g4g76\") pod \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " Mar 19 12:28:57.178963 master-0 kubenswrapper[29612]: I0319 12:28:57.178670 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-scripts\") pod \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " Mar 19 12:28:57.178963 master-0 kubenswrapper[29612]: I0319 12:28:57.178693 29612 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6dd57bb3-eafc-41af-a7a0-794cd736adaa-etc-podinfo\") pod \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\" (UID: \"6dd57bb3-eafc-41af-a7a0-794cd736adaa\") " Mar 19 12:28:57.180549 master-0 kubenswrapper[29612]: I0319 12:28:57.180485 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "6dd57bb3-eafc-41af-a7a0-794cd736adaa" (UID: "6dd57bb3-eafc-41af-a7a0-794cd736adaa"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:28:57.183835 master-0 kubenswrapper[29612]: I0319 12:28:57.183695 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd57bb3-eafc-41af-a7a0-794cd736adaa-kube-api-access-g4g76" (OuterVolumeSpecName: "kube-api-access-g4g76") pod "6dd57bb3-eafc-41af-a7a0-794cd736adaa" (UID: "6dd57bb3-eafc-41af-a7a0-794cd736adaa"). InnerVolumeSpecName "kube-api-access-g4g76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:28:57.183835 master-0 kubenswrapper[29612]: I0319 12:28:57.183778 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6dd57bb3-eafc-41af-a7a0-794cd736adaa-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "6dd57bb3-eafc-41af-a7a0-794cd736adaa" (UID: "6dd57bb3-eafc-41af-a7a0-794cd736adaa"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 12:28:57.196929 master-0 kubenswrapper[29612]: I0319 12:28:57.196833 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-scripts" (OuterVolumeSpecName: "scripts") pod "6dd57bb3-eafc-41af-a7a0-794cd736adaa" (UID: "6dd57bb3-eafc-41af-a7a0-794cd736adaa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:57.211740 master-0 kubenswrapper[29612]: I0319 12:28:57.211600 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data" (OuterVolumeSpecName: "config-data") pod "6dd57bb3-eafc-41af-a7a0-794cd736adaa" (UID: "6dd57bb3-eafc-41af-a7a0-794cd736adaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:57.264928 master-0 kubenswrapper[29612]: I0319 12:28:57.264862 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6dd57bb3-eafc-41af-a7a0-794cd736adaa" (UID: "6dd57bb3-eafc-41af-a7a0-794cd736adaa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:28:57.285772 master-0 kubenswrapper[29612]: I0319 12:28:57.284773 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:57.285772 master-0 kubenswrapper[29612]: I0319 12:28:57.284834 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:57.285772 master-0 kubenswrapper[29612]: I0319 12:28:57.284851 29612 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6dd57bb3-eafc-41af-a7a0-794cd736adaa-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:57.285772 master-0 kubenswrapper[29612]: I0319 12:28:57.284865 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4g76\" (UniqueName: \"kubernetes.io/projected/6dd57bb3-eafc-41af-a7a0-794cd736adaa-kube-api-access-g4g76\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:57.285772 master-0 kubenswrapper[29612]: I0319 12:28:57.284878 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dd57bb3-eafc-41af-a7a0-794cd736adaa-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:57.285772 master-0 kubenswrapper[29612]: I0319 12:28:57.284889 29612 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6dd57bb3-eafc-41af-a7a0-794cd736adaa-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 19 12:28:57.501341 master-0 kubenswrapper[29612]: I0319 12:28:57.501290 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-w5dht" Mar 19 12:28:57.502218 master-0 kubenswrapper[29612]: I0319 12:28:57.502170 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-w5dht" event={"ID":"6dd57bb3-eafc-41af-a7a0-794cd736adaa","Type":"ContainerDied","Data":"34300180a75ac4b39061bad716c65295bada1c35603bd353b1c466b6a01953e0"} Mar 19 12:28:57.502218 master-0 kubenswrapper[29612]: I0319 12:28:57.502214 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34300180a75ac4b39061bad716c65295bada1c35603bd353b1c466b6a01953e0" Mar 19 12:28:58.206472 master-0 kubenswrapper[29612]: I0319 12:28:58.203981 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-td9zp"] Mar 19 12:28:58.206472 master-0 kubenswrapper[29612]: E0319 12:28:58.204544 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd57bb3-eafc-41af-a7a0-794cd736adaa" containerName="ironic-db-sync" Mar 19 12:28:58.206472 master-0 kubenswrapper[29612]: I0319 12:28:58.204582 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd57bb3-eafc-41af-a7a0-794cd736adaa" containerName="ironic-db-sync" Mar 19 12:28:58.206472 master-0 kubenswrapper[29612]: E0319 12:28:58.204613 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd57bb3-eafc-41af-a7a0-794cd736adaa" containerName="init" Mar 19 12:28:58.206472 master-0 kubenswrapper[29612]: I0319 12:28:58.204620 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd57bb3-eafc-41af-a7a0-794cd736adaa" containerName="init" Mar 19 12:28:58.206472 master-0 kubenswrapper[29612]: I0319 12:28:58.204889 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd57bb3-eafc-41af-a7a0-794cd736adaa" containerName="ironic-db-sync" Mar 19 12:28:58.206472 master-0 kubenswrapper[29612]: I0319 12:28:58.205535 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-td9zp" Mar 19 12:28:58.288302 master-0 kubenswrapper[29612]: I0319 12:28:58.278586 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-td9zp"] Mar 19 12:28:58.384060 master-0 kubenswrapper[29612]: I0319 12:28:58.383969 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-operator-scripts\") pod \"ironic-inspector-db-create-td9zp\" (UID: \"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b\") " pod="openstack/ironic-inspector-db-create-td9zp" Mar 19 12:28:58.384060 master-0 kubenswrapper[29612]: I0319 12:28:58.384033 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97rjd\" (UniqueName: \"kubernetes.io/projected/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-kube-api-access-97rjd\") pod \"ironic-inspector-db-create-td9zp\" (UID: \"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b\") " pod="openstack/ironic-inspector-db-create-td9zp" Mar 19 12:28:58.522470 master-0 kubenswrapper[29612]: I0319 12:28:58.521285 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-operator-scripts\") pod \"ironic-inspector-db-create-td9zp\" (UID: \"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b\") " pod="openstack/ironic-inspector-db-create-td9zp" Mar 19 12:28:58.522470 master-0 kubenswrapper[29612]: I0319 12:28:58.521350 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97rjd\" (UniqueName: \"kubernetes.io/projected/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-kube-api-access-97rjd\") pod \"ironic-inspector-db-create-td9zp\" (UID: \"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b\") " pod="openstack/ironic-inspector-db-create-td9zp" Mar 19 12:28:58.523081 
master-0 kubenswrapper[29612]: I0319 12:28:58.522660 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-operator-scripts\") pod \"ironic-inspector-db-create-td9zp\" (UID: \"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b\") " pod="openstack/ironic-inspector-db-create-td9zp" Mar 19 12:28:58.542824 master-0 kubenswrapper[29612]: I0319 12:28:58.542760 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-233a-account-create-update-8k2hj"] Mar 19 12:28:58.546147 master-0 kubenswrapper[29612]: I0319 12:28:58.544508 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" Mar 19 12:28:58.560591 master-0 kubenswrapper[29612]: I0319 12:28:58.554858 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret" Mar 19 12:28:58.620022 master-0 kubenswrapper[29612]: I0319 12:28:58.616033 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-c6b86788c-dtf24"] Mar 19 12:28:58.623787 master-0 kubenswrapper[29612]: I0319 12:28:58.620894 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" Mar 19 12:28:58.623787 master-0 kubenswrapper[29612]: I0319 12:28:58.622953 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsvgx\" (UniqueName: \"kubernetes.io/projected/3846413f-63e8-45f7-90a0-75230ff95190-kube-api-access-gsvgx\") pod \"ironic-inspector-233a-account-create-update-8k2hj\" (UID: \"3846413f-63e8-45f7-90a0-75230ff95190\") " pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" Mar 19 12:28:58.623787 master-0 kubenswrapper[29612]: I0319 12:28:58.623135 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3846413f-63e8-45f7-90a0-75230ff95190-operator-scripts\") pod \"ironic-inspector-233a-account-create-update-8k2hj\" (UID: \"3846413f-63e8-45f7-90a0-75230ff95190\") " pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" Mar 19 12:28:58.637315 master-0 kubenswrapper[29612]: I0319 12:28:58.637224 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Mar 19 12:28:58.659376 master-0 kubenswrapper[29612]: I0319 12:28:58.655504 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97rjd\" (UniqueName: \"kubernetes.io/projected/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-kube-api-access-97rjd\") pod \"ironic-inspector-db-create-td9zp\" (UID: \"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b\") " pod="openstack/ironic-inspector-db-create-td9zp" Mar 19 12:28:58.709588 master-0 kubenswrapper[29612]: I0319 12:28:58.709519 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-c6b86788c-dtf24"] Mar 19 12:28:58.728820 master-0 kubenswrapper[29612]: I0319 12:28:58.728753 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7257edc5-e7bf-41f7-8c7d-97b4d4d83b64-combined-ca-bundle\") pod \"ironic-neutron-agent-c6b86788c-dtf24\" (UID: \"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64\") " pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" Mar 19 12:28:58.729036 master-0 kubenswrapper[29612]: I0319 12:28:58.728868 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7257edc5-e7bf-41f7-8c7d-97b4d4d83b64-config\") pod \"ironic-neutron-agent-c6b86788c-dtf24\" (UID: \"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64\") " pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" Mar 19 12:28:58.729036 master-0 kubenswrapper[29612]: I0319 12:28:58.728897 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht9hs\" (UniqueName: \"kubernetes.io/projected/7257edc5-e7bf-41f7-8c7d-97b4d4d83b64-kube-api-access-ht9hs\") pod \"ironic-neutron-agent-c6b86788c-dtf24\" (UID: \"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64\") " pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" Mar 19 12:28:58.729036 master-0 kubenswrapper[29612]: I0319 12:28:58.728955 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3846413f-63e8-45f7-90a0-75230ff95190-operator-scripts\") pod \"ironic-inspector-233a-account-create-update-8k2hj\" (UID: \"3846413f-63e8-45f7-90a0-75230ff95190\") " pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" Mar 19 12:28:58.729036 master-0 kubenswrapper[29612]: I0319 12:28:58.728994 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsvgx\" (UniqueName: \"kubernetes.io/projected/3846413f-63e8-45f7-90a0-75230ff95190-kube-api-access-gsvgx\") pod \"ironic-inspector-233a-account-create-update-8k2hj\" (UID: \"3846413f-63e8-45f7-90a0-75230ff95190\") " 
pod="openstack/ironic-inspector-233a-account-create-update-8k2hj"
Mar 19 12:28:58.730056 master-0 kubenswrapper[29612]: I0319 12:28:58.730005 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3846413f-63e8-45f7-90a0-75230ff95190-operator-scripts\") pod \"ironic-inspector-233a-account-create-update-8k2hj\" (UID: \"3846413f-63e8-45f7-90a0-75230ff95190\") " pod="openstack/ironic-inspector-233a-account-create-update-8k2hj"
Mar 19 12:28:58.744954 master-0 kubenswrapper[29612]: I0319 12:28:58.744887 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-233a-account-create-update-8k2hj"]
Mar 19 12:28:58.746834 master-0 kubenswrapper[29612]: I0319 12:28:58.746755 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:58.827043 master-0 kubenswrapper[29612]: I0319 12:28:58.825530 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsvgx\" (UniqueName: \"kubernetes.io/projected/3846413f-63e8-45f7-90a0-75230ff95190-kube-api-access-gsvgx\") pod \"ironic-inspector-233a-account-create-update-8k2hj\" (UID: \"3846413f-63e8-45f7-90a0-75230ff95190\") " pod="openstack/ironic-inspector-233a-account-create-update-8k2hj"
Mar 19 12:28:58.836852 master-0 kubenswrapper[29612]: I0319 12:28:58.836220 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7257edc5-e7bf-41f7-8c7d-97b4d4d83b64-combined-ca-bundle\") pod \"ironic-neutron-agent-c6b86788c-dtf24\" (UID: \"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64\") " pod="openstack/ironic-neutron-agent-c6b86788c-dtf24"
Mar 19 12:28:58.836852 master-0 kubenswrapper[29612]: I0319 12:28:58.836442 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7257edc5-e7bf-41f7-8c7d-97b4d4d83b64-config\") pod \"ironic-neutron-agent-c6b86788c-dtf24\" (UID: \"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64\") " pod="openstack/ironic-neutron-agent-c6b86788c-dtf24"
Mar 19 12:28:58.836852 master-0 kubenswrapper[29612]: I0319 12:28:58.836485 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht9hs\" (UniqueName: \"kubernetes.io/projected/7257edc5-e7bf-41f7-8c7d-97b4d4d83b64-kube-api-access-ht9hs\") pod \"ironic-neutron-agent-c6b86788c-dtf24\" (UID: \"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64\") " pod="openstack/ironic-neutron-agent-c6b86788c-dtf24"
Mar 19 12:28:58.863872 master-0 kubenswrapper[29612]: I0319 12:28:58.863824 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7257edc5-e7bf-41f7-8c7d-97b4d4d83b64-combined-ca-bundle\") pod \"ironic-neutron-agent-c6b86788c-dtf24\" (UID: \"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64\") " pod="openstack/ironic-neutron-agent-c6b86788c-dtf24"
Mar 19 12:28:58.867278 master-0 kubenswrapper[29612]: I0319 12:28:58.866175 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7257edc5-e7bf-41f7-8c7d-97b4d4d83b64-config\") pod \"ironic-neutron-agent-c6b86788c-dtf24\" (UID: \"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64\") " pod="openstack/ironic-neutron-agent-c6b86788c-dtf24"
Mar 19 12:28:58.887438 master-0 kubenswrapper[29612]: I0319 12:28:58.887386 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-td9zp"
Mar 19 12:28:59.020202 master-0 kubenswrapper[29612]: I0319 12:28:59.020068 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht9hs\" (UniqueName: \"kubernetes.io/projected/7257edc5-e7bf-41f7-8c7d-97b4d4d83b64-kube-api-access-ht9hs\") pod \"ironic-neutron-agent-c6b86788c-dtf24\" (UID: \"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64\") " pod="openstack/ironic-neutron-agent-c6b86788c-dtf24"
Mar 19 12:28:59.024423 master-0 kubenswrapper[29612]: I0319 12:28:59.024018 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-233a-account-create-update-8k2hj"
Mar 19 12:28:59.037005 master-0 kubenswrapper[29612]: I0319 12:28:59.036936 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24"
Mar 19 12:28:59.089870 master-0 kubenswrapper[29612]: I0319 12:28:59.089715 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-562cb-scheduler-0"
Mar 19 12:28:59.089870 master-0 kubenswrapper[29612]: I0319 12:28:59.089763 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b89bc666f-bgkf6"]
Mar 19 12:28:59.103950 master-0 kubenswrapper[29612]: I0319 12:28:59.103491 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-562cb-api-0"
Mar 19 12:28:59.104177 master-0 kubenswrapper[29612]: I0319 12:28:59.104082 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.156539 master-0 kubenswrapper[29612]: I0319 12:28:59.152519 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b89bc666f-bgkf6"]
Mar 19 12:28:59.268474 master-0 kubenswrapper[29612]: I0319 12:28:59.268190 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-swift-storage-0\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.268474 master-0 kubenswrapper[29612]: I0319 12:28:59.268304 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-config\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.268474 master-0 kubenswrapper[29612]: I0319 12:28:59.268409 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-sb\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.268474 master-0 kubenswrapper[29612]: I0319 12:28:59.268434 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-svc\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.268827 master-0 kubenswrapper[29612]: I0319 12:28:59.268541 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-nb\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.268827 master-0 kubenswrapper[29612]: I0319 12:28:59.268594 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9rwr\" (UniqueName: \"kubernetes.io/projected/738eae96-f628-426c-ab32-da270c608a71-kube-api-access-r9rwr\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.287163 master-0 kubenswrapper[29612]: I0319 12:28:59.268845 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:28:59.307355 master-0 kubenswrapper[29612]: I0319 12:28:59.307296 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-6d6786d9f7-6fz2d"]
Mar 19 12:28:59.311180 master-0 kubenswrapper[29612]: I0319 12:28:59.310114 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.332653 master-0 kubenswrapper[29612]: I0319 12:28:59.325259 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts"
Mar 19 12:28:59.332653 master-0 kubenswrapper[29612]: I0319 12:28:59.325474 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport"
Mar 19 12:28:59.332653 master-0 kubenswrapper[29612]: I0319 12:28:59.329893 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data"
Mar 19 12:28:59.332653 master-0 kubenswrapper[29612]: I0319 12:28:59.330265 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 19 12:28:59.359132 master-0 kubenswrapper[29612]: I0319 12:28:59.358828 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397348 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-swift-storage-0\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397428 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-config\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397518 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-sb\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397548 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-svc\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397621 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-custom\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397713 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-etc-podinfo\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397752 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mrjg\" (UniqueName: \"kubernetes.io/projected/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-kube-api-access-4mrjg\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397784 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-nb\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397818 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-combined-ca-bundle\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397847 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-scripts\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397873 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397915 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-logs\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397947 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rwr\" (UniqueName: \"kubernetes.io/projected/738eae96-f628-426c-ab32-da270c608a71-kube-api-access-r9rwr\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.397978 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-merged\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.398744 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-nb\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.399479 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-svc\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.399664 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-config\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.400271 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-swift-storage-0\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.401722 master-0 kubenswrapper[29612]: I0319 12:28:59.401687 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-sb\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.506345 master-0 kubenswrapper[29612]: I0319 12:28:59.503013 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-custom\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.506345 master-0 kubenswrapper[29612]: I0319 12:28:59.503132 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-etc-podinfo\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.506345 master-0 kubenswrapper[29612]: I0319 12:28:59.503182 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mrjg\" (UniqueName: \"kubernetes.io/projected/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-kube-api-access-4mrjg\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.506345 master-0 kubenswrapper[29612]: I0319 12:28:59.503235 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-combined-ca-bundle\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.506345 master-0 kubenswrapper[29612]: I0319 12:28:59.503257 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-scripts\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.506345 master-0 kubenswrapper[29612]: I0319 12:28:59.503283 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.506345 master-0 kubenswrapper[29612]: I0319 12:28:59.503662 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-logs\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.506345 master-0 kubenswrapper[29612]: I0319 12:28:59.503711 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-merged\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.506345 master-0 kubenswrapper[29612]: I0319 12:28:59.504554 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-merged\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.506345 master-0 kubenswrapper[29612]: I0319 12:28:59.504813 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-logs\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.510548 master-0 kubenswrapper[29612]: I0319 12:28:59.506816 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-etc-podinfo\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.515735 master-0 kubenswrapper[29612]: I0319 12:28:59.515482 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-custom\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.525462 master-0 kubenswrapper[29612]: I0319 12:28:59.523762 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-scripts\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.525462 master-0 kubenswrapper[29612]: I0319 12:28:59.523772 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-combined-ca-bundle\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.526180 master-0 kubenswrapper[29612]: I0319 12:28:59.525787 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.528392 master-0 kubenswrapper[29612]: I0319 12:28:59.527504 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9rwr\" (UniqueName: \"kubernetes.io/projected/738eae96-f628-426c-ab32-da270c608a71-kube-api-access-r9rwr\") pod \"dnsmasq-dns-5b89bc666f-bgkf6\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") " pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.679385 master-0 kubenswrapper[29612]: I0319 12:28:59.677302 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mrjg\" (UniqueName: \"kubernetes.io/projected/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-kube-api-access-4mrjg\") pod \"ironic-6d6786d9f7-6fz2d\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.689737 master-0 kubenswrapper[29612]: I0319 12:28:59.689671 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-6d6786d9f7-6fz2d"]
Mar 19 12:28:59.697876 master-0 kubenswrapper[29612]: I0319 12:28:59.697700 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-6d6786d9f7-6fz2d"
Mar 19 12:28:59.861599 master-0 kubenswrapper[29612]: I0319 12:28:59.860602 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:28:59.897834 master-0 kubenswrapper[29612]: I0319 12:28:59.895295 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-77b78c7bbd-2n7mk"]
Mar 19 12:28:59.902069 master-0 kubenswrapper[29612]: I0319 12:28:59.901998 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:28:59.931018 master-0 kubenswrapper[29612]: I0319 12:28:59.930946 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-77b78c7bbd-2n7mk"]
Mar 19 12:29:00.003844 master-0 kubenswrapper[29612]: I0319 12:28:59.995735 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-td9zp"]
Mar 19 12:29:00.050714 master-0 kubenswrapper[29612]: I0319 12:29:00.031713 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-combined-ca-bundle\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.050714 master-0 kubenswrapper[29612]: I0319 12:29:00.031993 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-internal-tls-certs\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.050714 master-0 kubenswrapper[29612]: I0319 12:29:00.032033 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-config-data\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.050714 master-0 kubenswrapper[29612]: I0319 12:29:00.032335 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59jm7\" (UniqueName: \"kubernetes.io/projected/9e989b5a-6822-4b61-a043-4cafc2a80381-kube-api-access-59jm7\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.050714 master-0 kubenswrapper[29612]: I0319 12:29:00.032416 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-scripts\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.050714 master-0 kubenswrapper[29612]: I0319 12:29:00.032434 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e989b5a-6822-4b61-a043-4cafc2a80381-logs\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.050714 master-0 kubenswrapper[29612]: I0319 12:29:00.032461 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-public-tls-certs\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.136372 master-0 kubenswrapper[29612]: I0319 12:29:00.134626 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-internal-tls-certs\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.136372 master-0 kubenswrapper[29612]: I0319 12:29:00.134709 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-config-data\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.136372 master-0 kubenswrapper[29612]: I0319 12:29:00.134790 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59jm7\" (UniqueName: \"kubernetes.io/projected/9e989b5a-6822-4b61-a043-4cafc2a80381-kube-api-access-59jm7\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.136372 master-0 kubenswrapper[29612]: I0319 12:29:00.134860 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-scripts\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.136372 master-0 kubenswrapper[29612]: I0319 12:29:00.134883 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e989b5a-6822-4b61-a043-4cafc2a80381-logs\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.136372 master-0 kubenswrapper[29612]: I0319 12:29:00.134908 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-public-tls-certs\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.136372 master-0 kubenswrapper[29612]: I0319 12:29:00.135007 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-combined-ca-bundle\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.142836 master-0 kubenswrapper[29612]: I0319 12:29:00.142683 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e989b5a-6822-4b61-a043-4cafc2a80381-logs\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.145995 master-0 kubenswrapper[29612]: I0319 12:29:00.144502 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-combined-ca-bundle\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.167437 master-0 kubenswrapper[29612]: I0319 12:29:00.167158 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-internal-tls-certs\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.171798 master-0 kubenswrapper[29612]: I0319 12:29:00.170495 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-public-tls-certs\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.186667 master-0 kubenswrapper[29612]: I0319 12:29:00.178343 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-config-data\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.186667 master-0 kubenswrapper[29612]: I0319 12:29:00.182920 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59jm7\" (UniqueName: \"kubernetes.io/projected/9e989b5a-6822-4b61-a043-4cafc2a80381-kube-api-access-59jm7\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.228668 master-0 kubenswrapper[29612]: I0319 12:29:00.217008 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-c6b86788c-dtf24"]
Mar 19 12:29:00.228668 master-0 kubenswrapper[29612]: I0319 12:29:00.221434 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e989b5a-6822-4b61-a043-4cafc2a80381-scripts\") pod \"placement-77b78c7bbd-2n7mk\" (UID: \"9e989b5a-6822-4b61-a043-4cafc2a80381\") " pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.447458 master-0 kubenswrapper[29612]: I0319 12:29:00.447155 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-77b78c7bbd-2n7mk"
Mar 19 12:29:00.487664 master-0 kubenswrapper[29612]: I0319 12:29:00.481829 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-233a-account-create-update-8k2hj"]
Mar 19 12:29:00.554840 master-0 kubenswrapper[29612]: W0319 12:29:00.554768 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3846413f_63e8_45f7_90a0_75230ff95190.slice/crio-9939783d25e790643644b9fbb63a8c79fbbd1473739e32ad62513ce33d07b685 WatchSource:0}: Error finding container 9939783d25e790643644b9fbb63a8c79fbbd1473739e32ad62513ce33d07b685: Status 404 returned error can't find the container with id 9939783d25e790643644b9fbb63a8c79fbbd1473739e32ad62513ce33d07b685
Mar 19 12:29:00.735321 master-0 kubenswrapper[29612]: I0319 12:29:00.735263 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"]
Mar 19 12:29:00.790697 master-0 kubenswrapper[29612]: I0319 12:29:00.783883 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" event={"ID":"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64","Type":"ContainerStarted","Data":"e8c5ed93d75d53de325a5c55d8393a4371334281818edea997432369a36607da"}
Mar 19 12:29:00.790697 master-0 kubenswrapper[29612]: I0319 12:29:00.783962 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"]
Mar 19 12:29:00.790697 master-0 kubenswrapper[29612]: I0319 12:29:00.783990 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-td9zp" event={"ID":"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b","Type":"ContainerStarted","Data":"3a4e2082df69e8600c9958d85017791df9bc5dc3484c73fca240969641b5ab78"}
Mar 19 12:29:00.790697 master-0 kubenswrapper[29612]: I0319 12:29:00.784118 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0"
Mar 19 12:29:00.799783 master-0 kubenswrapper[29612]: I0319 12:29:00.794408 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b89bc666f-bgkf6"]
Mar 19 12:29:00.799783 master-0 kubenswrapper[29612]: I0319 12:29:00.794516 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data"
Mar 19 12:29:00.799783 master-0 kubenswrapper[29612]: I0319 12:29:00.794871 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts"
Mar 19 12:29:00.799783 master-0 kubenswrapper[29612]: I0319 12:29:00.796612 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" event={"ID":"3846413f-63e8-45f7-90a0-75230ff95190","Type":"ContainerStarted","Data":"9939783d25e790643644b9fbb63a8c79fbbd1473739e32ad62513ce33d07b685"}
Mar 19 12:29:00.869810 master-0 kubenswrapper[29612]: I0319 12:29:00.869720 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-6d6786d9f7-6fz2d"]
Mar 19 12:29:01.071181 master-0 kubenswrapper[29612]: I0319 12:29:00.958203 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f7628b2-370c-427e-8c9e-2a2687577986\" (UniqueName: \"kubernetes.io/csi/topolvm.io^266494a9-acd1-4b73-b2a9-2ed0dd7b3472\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0"
Mar 19 12:29:01.071181 master-0 kubenswrapper[29612]: I0319 12:29:00.958296 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0"
Mar 19 12:29:01.071181 master-0 kubenswrapper[29612]: I0319 12:29:00.958362 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msx5v\" (UniqueName: \"kubernetes.io/projected/8ed03592-eec9-4173-8239-217452bc4579-kube-api-access-msx5v\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0"
Mar 19 12:29:01.071181 master-0 kubenswrapper[29612]: I0319 12:29:00.958443 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8ed03592-eec9-4173-8239-217452bc4579-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0"
Mar 19 12:29:01.071181 master-0 kubenswrapper[29612]: I0319 12:29:00.958489 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0"
Mar 19 12:29:01.071181 master-0 kubenswrapper[29612]: I0319 12:29:00.958540 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-config-data\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0"
Mar 19 12:29:01.071181 master-0 kubenswrapper[29612]: I0319 12:29:00.958566 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-scripts\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0"
Mar 19 12:29:01.071181 master-0
kubenswrapper[29612]: I0319 12:29:00.958674 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8ed03592-eec9-4173-8239-217452bc4579-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.072406 master-0 kubenswrapper[29612]: I0319 12:29:01.071911 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8ed03592-eec9-4173-8239-217452bc4579-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.072406 master-0 kubenswrapper[29612]: I0319 12:29:01.072044 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f7628b2-370c-427e-8c9e-2a2687577986\" (UniqueName: \"kubernetes.io/csi/topolvm.io^266494a9-acd1-4b73-b2a9-2ed0dd7b3472\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.072406 master-0 kubenswrapper[29612]: I0319 12:29:01.072107 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.072406 master-0 kubenswrapper[29612]: I0319 12:29:01.072145 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msx5v\" (UniqueName: \"kubernetes.io/projected/8ed03592-eec9-4173-8239-217452bc4579-kube-api-access-msx5v\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.072406 master-0 
kubenswrapper[29612]: I0319 12:29:01.072183 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8ed03592-eec9-4173-8239-217452bc4579-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.072406 master-0 kubenswrapper[29612]: I0319 12:29:01.072235 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.072406 master-0 kubenswrapper[29612]: I0319 12:29:01.072279 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-config-data\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.072406 master-0 kubenswrapper[29612]: I0319 12:29:01.072314 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-scripts\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.078147 master-0 kubenswrapper[29612]: I0319 12:29:01.074464 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8ed03592-eec9-4173-8239-217452bc4579-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.093934 master-0 kubenswrapper[29612]: I0319 12:29:01.093881 29612 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-config-data\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.105332 master-0 kubenswrapper[29612]: I0319 12:29:01.105271 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-scripts\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.112051 master-0 kubenswrapper[29612]: I0319 12:29:01.111983 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8ed03592-eec9-4173-8239-217452bc4579-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.112333 master-0 kubenswrapper[29612]: I0319 12:29:01.112264 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.140255 master-0 kubenswrapper[29612]: I0319 12:29:01.139470 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ed03592-eec9-4173-8239-217452bc4579-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.140494 master-0 kubenswrapper[29612]: I0319 12:29:01.140293 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 12:29:01.140494 master-0 kubenswrapper[29612]: I0319 12:29:01.140374 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f7628b2-370c-427e-8c9e-2a2687577986\" (UniqueName: \"kubernetes.io/csi/topolvm.io^266494a9-acd1-4b73-b2a9-2ed0dd7b3472\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/8bd64b56dcbae17cc200c73fd9be8b77b9a451d687f66832e5e499de288741c3/globalmount\"" pod="openstack/ironic-conductor-0" Mar 19 12:29:01.163539 master-0 kubenswrapper[29612]: I0319 12:29:01.163501 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msx5v\" (UniqueName: \"kubernetes.io/projected/8ed03592-eec9-4173-8239-217452bc4579-kube-api-access-msx5v\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:01.457917 master-0 kubenswrapper[29612]: I0319 12:29:01.453414 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-77b78c7bbd-2n7mk"] Mar 19 12:29:01.836902 master-0 kubenswrapper[29612]: I0319 12:29:01.834853 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" event={"ID":"3846413f-63e8-45f7-90a0-75230ff95190","Type":"ContainerStarted","Data":"328bcc5d8256d53424c53e4145ff47b65a56a85e6b84d529374edb9b101388b5"} Mar 19 12:29:01.840839 master-0 kubenswrapper[29612]: I0319 12:29:01.838686 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6d6786d9f7-6fz2d" event={"ID":"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2","Type":"ContainerStarted","Data":"3c152436dd8bdc133c4952dffd6d22a6532bea2c8564b73f1b33a7d104acb73b"} Mar 19 12:29:01.840895 master-0 kubenswrapper[29612]: I0319 12:29:01.840858 29612 generic.go:334] "Generic (PLEG): container finished" podID="738eae96-f628-426c-ab32-da270c608a71" 
containerID="0f561c467183903d1b594215f035dfdaceacf55d11f050e3c21a2a1619a75155" exitCode=0 Mar 19 12:29:01.840936 master-0 kubenswrapper[29612]: I0319 12:29:01.840921 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6" event={"ID":"738eae96-f628-426c-ab32-da270c608a71","Type":"ContainerDied","Data":"0f561c467183903d1b594215f035dfdaceacf55d11f050e3c21a2a1619a75155"} Mar 19 12:29:01.840971 master-0 kubenswrapper[29612]: I0319 12:29:01.840949 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6" event={"ID":"738eae96-f628-426c-ab32-da270c608a71","Type":"ContainerStarted","Data":"0fb4ab032112f87207d930d0261c9b661727cb7db8b3d5e4bae62e728ebe07aa"} Mar 19 12:29:01.845661 master-0 kubenswrapper[29612]: I0319 12:29:01.843200 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77b78c7bbd-2n7mk" event={"ID":"9e989b5a-6822-4b61-a043-4cafc2a80381","Type":"ContainerStarted","Data":"9cdb9577d23f9745fa47e41469c3b6c0d3a703ffe449e2ec1f3393e37b7fa81f"} Mar 19 12:29:01.849657 master-0 kubenswrapper[29612]: I0319 12:29:01.846271 29612 generic.go:334] "Generic (PLEG): container finished" podID="4c1ba2c7-2ca4-48e2-a467-0f795b64be2b" containerID="fb6f5124a98351dccb9c053494c0da9c03ad23fbb448d2c8170f4bd821777c57" exitCode=0 Mar 19 12:29:01.849657 master-0 kubenswrapper[29612]: I0319 12:29:01.846297 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-td9zp" event={"ID":"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b","Type":"ContainerDied","Data":"fb6f5124a98351dccb9c053494c0da9c03ad23fbb448d2c8170f4bd821777c57"} Mar 19 12:29:01.914659 master-0 kubenswrapper[29612]: I0319 12:29:01.914056 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" podStartSLOduration=3.914039966 podStartE2EDuration="3.914039966s" podCreationTimestamp="2026-03-19 12:28:58 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:01.903486797 +0000 UTC m=+1149.217655818" watchObservedRunningTime="2026-03-19 12:29:01.914039966 +0000 UTC m=+1149.228208987" Mar 19 12:29:02.132882 master-0 kubenswrapper[29612]: I0319 12:29:02.129802 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-562cb-volume-lvm-iscsi-0" Mar 19 12:29:02.163663 master-0 kubenswrapper[29612]: I0319 12:29:02.157391 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-562cb-backup-0" Mar 19 12:29:02.777302 master-0 kubenswrapper[29612]: I0319 12:29:02.773873 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f7628b2-370c-427e-8c9e-2a2687577986\" (UniqueName: \"kubernetes.io/csi/topolvm.io^266494a9-acd1-4b73-b2a9-2ed0dd7b3472\") pod \"ironic-conductor-0\" (UID: \"8ed03592-eec9-4173-8239-217452bc4579\") " pod="openstack/ironic-conductor-0" Mar 19 12:29:02.868156 master-0 kubenswrapper[29612]: I0319 12:29:02.867968 29612 generic.go:334] "Generic (PLEG): container finished" podID="3846413f-63e8-45f7-90a0-75230ff95190" containerID="328bcc5d8256d53424c53e4145ff47b65a56a85e6b84d529374edb9b101388b5" exitCode=0 Mar 19 12:29:02.868156 master-0 kubenswrapper[29612]: I0319 12:29:02.868072 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" event={"ID":"3846413f-63e8-45f7-90a0-75230ff95190","Type":"ContainerDied","Data":"328bcc5d8256d53424c53e4145ff47b65a56a85e6b84d529374edb9b101388b5"} Mar 19 12:29:02.875214 master-0 kubenswrapper[29612]: I0319 12:29:02.875171 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6" 
event={"ID":"738eae96-f628-426c-ab32-da270c608a71","Type":"ContainerStarted","Data":"d48ec6f1c78748f3ef00cd927a36d4f723e0f677a5c92ce684dccad1dc7b5539"} Mar 19 12:29:02.876079 master-0 kubenswrapper[29612]: I0319 12:29:02.876034 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6" Mar 19 12:29:02.880381 master-0 kubenswrapper[29612]: I0319 12:29:02.880334 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77b78c7bbd-2n7mk" event={"ID":"9e989b5a-6822-4b61-a043-4cafc2a80381","Type":"ContainerStarted","Data":"972bc17097f838bde7a396dbae7eccc7d35c4ab8a75f829367fdc422883a941b"} Mar 19 12:29:02.933932 master-0 kubenswrapper[29612]: I0319 12:29:02.933771 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6" podStartSLOduration=4.933755304 podStartE2EDuration="4.933755304s" podCreationTimestamp="2026-03-19 12:28:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:02.930025238 +0000 UTC m=+1150.244194259" watchObservedRunningTime="2026-03-19 12:29:02.933755304 +0000 UTC m=+1150.247924325" Mar 19 12:29:02.938950 master-0 kubenswrapper[29612]: I0319 12:29:02.938883 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Mar 19 12:29:03.529397 master-0 kubenswrapper[29612]: I0319 12:29:03.529357 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-td9zp" Mar 19 12:29:03.687346 master-0 kubenswrapper[29612]: I0319 12:29:03.684682 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-operator-scripts\") pod \"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b\" (UID: \"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b\") " Mar 19 12:29:03.703571 master-0 kubenswrapper[29612]: I0319 12:29:03.703508 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c1ba2c7-2ca4-48e2-a467-0f795b64be2b" (UID: "4c1ba2c7-2ca4-48e2-a467-0f795b64be2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:03.719022 master-0 kubenswrapper[29612]: I0319 12:29:03.712219 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97rjd\" (UniqueName: \"kubernetes.io/projected/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-kube-api-access-97rjd\") pod \"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b\" (UID: \"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b\") " Mar 19 12:29:03.719022 master-0 kubenswrapper[29612]: I0319 12:29:03.714156 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:03.719022 master-0 kubenswrapper[29612]: I0319 12:29:03.717428 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-kube-api-access-97rjd" (OuterVolumeSpecName: "kube-api-access-97rjd") pod "4c1ba2c7-2ca4-48e2-a467-0f795b64be2b" (UID: "4c1ba2c7-2ca4-48e2-a467-0f795b64be2b"). 
InnerVolumeSpecName "kube-api-access-97rjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:03.820741 master-0 kubenswrapper[29612]: I0319 12:29:03.818678 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97rjd\" (UniqueName: \"kubernetes.io/projected/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b-kube-api-access-97rjd\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:03.905940 master-0 kubenswrapper[29612]: I0319 12:29:03.905366 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77b78c7bbd-2n7mk" event={"ID":"9e989b5a-6822-4b61-a043-4cafc2a80381","Type":"ContainerStarted","Data":"6c6703dec5e30d1957b26766d21a2b01e6b28c42839bfafbae6fee870d40b53b"} Mar 19 12:29:03.906912 master-0 kubenswrapper[29612]: I0319 12:29:03.906862 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-77b78c7bbd-2n7mk" Mar 19 12:29:03.906912 master-0 kubenswrapper[29612]: I0319 12:29:03.906894 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-77b78c7bbd-2n7mk" Mar 19 12:29:03.909070 master-0 kubenswrapper[29612]: I0319 12:29:03.909033 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-td9zp" event={"ID":"4c1ba2c7-2ca4-48e2-a467-0f795b64be2b","Type":"ContainerDied","Data":"3a4e2082df69e8600c9958d85017791df9bc5dc3484c73fca240969641b5ab78"} Mar 19 12:29:03.909070 master-0 kubenswrapper[29612]: I0319 12:29:03.909061 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a4e2082df69e8600c9958d85017791df9bc5dc3484c73fca240969641b5ab78" Mar 19 12:29:03.909190 master-0 kubenswrapper[29612]: I0319 12:29:03.909102 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-td9zp" Mar 19 12:29:03.915178 master-0 kubenswrapper[29612]: I0319 12:29:03.915136 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" event={"ID":"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64","Type":"ContainerStarted","Data":"d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26"} Mar 19 12:29:03.915471 master-0 kubenswrapper[29612]: I0319 12:29:03.915439 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" Mar 19 12:29:04.052465 master-0 kubenswrapper[29612]: I0319 12:29:04.052179 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-74f4876497-jcnql"] Mar 19 12:29:04.058585 master-0 kubenswrapper[29612]: E0319 12:29:04.058394 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1ba2c7-2ca4-48e2-a467-0f795b64be2b" containerName="mariadb-database-create" Mar 19 12:29:04.058585 master-0 kubenswrapper[29612]: I0319 12:29:04.058455 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1ba2c7-2ca4-48e2-a467-0f795b64be2b" containerName="mariadb-database-create" Mar 19 12:29:04.059253 master-0 kubenswrapper[29612]: I0319 12:29:04.059001 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1ba2c7-2ca4-48e2-a467-0f795b64be2b" containerName="mariadb-database-create" Mar 19 12:29:04.068122 master-0 kubenswrapper[29612]: I0319 12:29:04.062185 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.075840 master-0 kubenswrapper[29612]: I0319 12:29:04.075770 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Mar 19 12:29:04.076095 master-0 kubenswrapper[29612]: I0319 12:29:04.076058 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Mar 19 12:29:04.132975 master-0 kubenswrapper[29612]: I0319 12:29:04.132884 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-config-data-custom\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.133252 master-0 kubenswrapper[29612]: I0319 12:29:04.132986 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-config-data\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.133252 master-0 kubenswrapper[29612]: I0319 12:29:04.133023 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-internal-tls-certs\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.133252 master-0 kubenswrapper[29612]: I0319 12:29:04.133064 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e5dadace-e3b9-4819-b064-e4ea49c98605-etc-podinfo\") pod \"ironic-74f4876497-jcnql\" (UID: 
\"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.133252 master-0 kubenswrapper[29612]: I0319 12:29:04.133099 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f5p2\" (UniqueName: \"kubernetes.io/projected/e5dadace-e3b9-4819-b064-e4ea49c98605-kube-api-access-4f5p2\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.156658 master-0 kubenswrapper[29612]: I0319 12:29:04.147098 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-74f4876497-jcnql"] Mar 19 12:29:04.189149 master-0 kubenswrapper[29612]: I0319 12:29:04.184929 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-scripts\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.189149 master-0 kubenswrapper[29612]: I0319 12:29:04.184992 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e5dadace-e3b9-4819-b064-e4ea49c98605-config-data-merged\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.189149 master-0 kubenswrapper[29612]: I0319 12:29:04.185049 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5dadace-e3b9-4819-b064-e4ea49c98605-logs\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.189149 master-0 kubenswrapper[29612]: I0319 12:29:04.185089 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-combined-ca-bundle\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.189149 master-0 kubenswrapper[29612]: I0319 12:29:04.185128 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-public-tls-certs\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.189149 master-0 kubenswrapper[29612]: I0319 12:29:04.187050 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-77b78c7bbd-2n7mk" podStartSLOduration=5.187021246 podStartE2EDuration="5.187021246s" podCreationTimestamp="2026-03-19 12:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:04.11639115 +0000 UTC m=+1151.430560171" watchObservedRunningTime="2026-03-19 12:29:04.187021246 +0000 UTC m=+1151.501190277" Mar 19 12:29:04.311434 master-0 kubenswrapper[29612]: I0319 12:29:04.311366 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-config-data-custom\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.311594 master-0 kubenswrapper[29612]: I0319 12:29:04.311441 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-config-data\") pod \"ironic-74f4876497-jcnql\" 
(UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.311594 master-0 kubenswrapper[29612]: I0319 12:29:04.311469 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-internal-tls-certs\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.311594 master-0 kubenswrapper[29612]: I0319 12:29:04.311501 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e5dadace-e3b9-4819-b064-e4ea49c98605-etc-podinfo\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.311594 master-0 kubenswrapper[29612]: I0319 12:29:04.311529 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f5p2\" (UniqueName: \"kubernetes.io/projected/e5dadace-e3b9-4819-b064-e4ea49c98605-kube-api-access-4f5p2\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.311594 master-0 kubenswrapper[29612]: I0319 12:29:04.311590 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-scripts\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.311820 master-0 kubenswrapper[29612]: I0319 12:29:04.311613 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e5dadace-e3b9-4819-b064-e4ea49c98605-config-data-merged\") pod \"ironic-74f4876497-jcnql\" (UID: 
\"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.311820 master-0 kubenswrapper[29612]: I0319 12:29:04.311650 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5dadace-e3b9-4819-b064-e4ea49c98605-logs\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.311820 master-0 kubenswrapper[29612]: I0319 12:29:04.311673 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-combined-ca-bundle\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.312079 master-0 kubenswrapper[29612]: I0319 12:29:04.312034 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5dadace-e3b9-4819-b064-e4ea49c98605-logs\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.312079 master-0 kubenswrapper[29612]: I0319 12:29:04.312070 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-public-tls-certs\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.334201 master-0 kubenswrapper[29612]: I0319 12:29:04.323590 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e5dadace-e3b9-4819-b064-e4ea49c98605-config-data-merged\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " 
pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.334201 master-0 kubenswrapper[29612]: I0319 12:29:04.332316 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-public-tls-certs\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.349914 master-0 kubenswrapper[29612]: I0319 12:29:04.346388 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e5dadace-e3b9-4819-b064-e4ea49c98605-etc-podinfo\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.360382 master-0 kubenswrapper[29612]: I0319 12:29:04.359578 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-internal-tls-certs\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.375686 master-0 kubenswrapper[29612]: I0319 12:29:04.373060 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-config-data\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.387926 master-0 kubenswrapper[29612]: I0319 12:29:04.387868 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-combined-ca-bundle\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.393510 master-0 
kubenswrapper[29612]: I0319 12:29:04.389490 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-scripts\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.403031 master-0 kubenswrapper[29612]: I0319 12:29:04.401655 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5dadace-e3b9-4819-b064-e4ea49c98605-config-data-custom\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.445390 master-0 kubenswrapper[29612]: I0319 12:29:04.442821 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" podStartSLOduration=3.5749623809999997 podStartE2EDuration="6.442796605s" podCreationTimestamp="2026-03-19 12:28:58 +0000 UTC" firstStartedPulling="2026-03-19 12:29:00.278556552 +0000 UTC m=+1147.592725573" lastFinishedPulling="2026-03-19 12:29:03.146390776 +0000 UTC m=+1150.460559797" observedRunningTime="2026-03-19 12:29:04.373383072 +0000 UTC m=+1151.687552093" watchObservedRunningTime="2026-03-19 12:29:04.442796605 +0000 UTC m=+1151.756965636" Mar 19 12:29:04.486770 master-0 kubenswrapper[29612]: I0319 12:29:04.484393 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f5p2\" (UniqueName: \"kubernetes.io/projected/e5dadace-e3b9-4819-b064-e4ea49c98605-kube-api-access-4f5p2\") pod \"ironic-74f4876497-jcnql\" (UID: \"e5dadace-e3b9-4819-b064-e4ea49c98605\") " pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.565974 master-0 kubenswrapper[29612]: I0319 12:29:04.564220 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 19 12:29:04.729661 master-0 
kubenswrapper[29612]: I0319 12:29:04.726229 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:04.841925 master-0 kubenswrapper[29612]: I0319 12:29:04.841796 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:29:04.974673 master-0 kubenswrapper[29612]: I0319 12:29:04.969731 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5b4b84764-w6p2r" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.128.0.222:9696/\": dial tcp 10.128.0.222:9696: connect: connection refused" Mar 19 12:29:04.974673 master-0 kubenswrapper[29612]: I0319 12:29:04.970583 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" event={"ID":"3846413f-63e8-45f7-90a0-75230ff95190","Type":"ContainerDied","Data":"9939783d25e790643644b9fbb63a8c79fbbd1473739e32ad62513ce33d07b685"} Mar 19 12:29:04.974673 master-0 kubenswrapper[29612]: I0319 12:29:04.970613 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9939783d25e790643644b9fbb63a8c79fbbd1473739e32ad62513ce33d07b685" Mar 19 12:29:04.979663 master-0 kubenswrapper[29612]: I0319 12:29:04.977526 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"8ed03592-eec9-4173-8239-217452bc4579","Type":"ContainerStarted","Data":"5a4bccd82c55e66782db327bbb830494b497ee3519acec907a21a6a979a06d9a"} Mar 19 12:29:04.989349 master-0 kubenswrapper[29612]: I0319 12:29:04.986005 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-562cb-scheduler-0" Mar 19 12:29:05.045738 master-0 kubenswrapper[29612]: I0319 12:29:05.042377 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" Mar 19 12:29:05.237726 master-0 kubenswrapper[29612]: I0319 12:29:05.237606 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3846413f-63e8-45f7-90a0-75230ff95190-operator-scripts\") pod \"3846413f-63e8-45f7-90a0-75230ff95190\" (UID: \"3846413f-63e8-45f7-90a0-75230ff95190\") " Mar 19 12:29:05.237726 master-0 kubenswrapper[29612]: I0319 12:29:05.237686 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsvgx\" (UniqueName: \"kubernetes.io/projected/3846413f-63e8-45f7-90a0-75230ff95190-kube-api-access-gsvgx\") pod \"3846413f-63e8-45f7-90a0-75230ff95190\" (UID: \"3846413f-63e8-45f7-90a0-75230ff95190\") " Mar 19 12:29:05.244902 master-0 kubenswrapper[29612]: I0319 12:29:05.239365 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3846413f-63e8-45f7-90a0-75230ff95190-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3846413f-63e8-45f7-90a0-75230ff95190" (UID: "3846413f-63e8-45f7-90a0-75230ff95190"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:05.244902 master-0 kubenswrapper[29612]: I0319 12:29:05.242228 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3846413f-63e8-45f7-90a0-75230ff95190-kube-api-access-gsvgx" (OuterVolumeSpecName: "kube-api-access-gsvgx") pod "3846413f-63e8-45f7-90a0-75230ff95190" (UID: "3846413f-63e8-45f7-90a0-75230ff95190"). InnerVolumeSpecName "kube-api-access-gsvgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:05.363721 master-0 kubenswrapper[29612]: I0319 12:29:05.363580 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3846413f-63e8-45f7-90a0-75230ff95190-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:05.363721 master-0 kubenswrapper[29612]: I0319 12:29:05.363653 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsvgx\" (UniqueName: \"kubernetes.io/projected/3846413f-63e8-45f7-90a0-75230ff95190-kube-api-access-gsvgx\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:05.611019 master-0 kubenswrapper[29612]: I0319 12:29:05.610967 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-74f4876497-jcnql"] Mar 19 12:29:06.022384 master-0 kubenswrapper[29612]: I0319 12:29:06.022260 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"8ed03592-eec9-4173-8239-217452bc4579","Type":"ContainerStarted","Data":"d9a6dfd4111deae6b87568d4cbf36695c27de1fe9e6a67dd0a1fa54873062812"} Mar 19 12:29:06.027789 master-0 kubenswrapper[29612]: I0319 12:29:06.027750 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74f4876497-jcnql" event={"ID":"e5dadace-e3b9-4819-b064-e4ea49c98605","Type":"ContainerStarted","Data":"3b81cef6090bc37d5728d09bd669498bcc2d048e26518f9e84d71db2e5dcef0f"} Mar 19 12:29:06.028080 master-0 kubenswrapper[29612]: I0319 12:29:06.028067 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-233a-account-create-update-8k2hj" Mar 19 12:29:07.710659 master-0 kubenswrapper[29612]: I0319 12:29:07.710217 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5b858fd547-mflsp" Mar 19 12:29:08.067649 master-0 kubenswrapper[29612]: I0319 12:29:08.067572 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6d6786d9f7-6fz2d" event={"ID":"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2","Type":"ContainerStarted","Data":"6d9a0090ce988131f4855c5b4248cb5067f742ec7925c02c435a1445857223af"} Mar 19 12:29:08.082242 master-0 kubenswrapper[29612]: I0319 12:29:08.082192 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74f4876497-jcnql" event={"ID":"e5dadace-e3b9-4819-b064-e4ea49c98605","Type":"ContainerStarted","Data":"e78b75cc49209e2ae7b3c9256f5dc7be8812d746c4415b616f7420a9fadf718c"} Mar 19 12:29:08.281793 master-0 kubenswrapper[29612]: I0319 12:29:08.280872 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 19 12:29:08.281793 master-0 kubenswrapper[29612]: E0319 12:29:08.281532 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3846413f-63e8-45f7-90a0-75230ff95190" containerName="mariadb-account-create-update" Mar 19 12:29:08.281793 master-0 kubenswrapper[29612]: I0319 12:29:08.281552 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="3846413f-63e8-45f7-90a0-75230ff95190" containerName="mariadb-account-create-update" Mar 19 12:29:08.282122 master-0 kubenswrapper[29612]: I0319 12:29:08.281867 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="3846413f-63e8-45f7-90a0-75230ff95190" containerName="mariadb-account-create-update" Mar 19 12:29:08.282975 master-0 kubenswrapper[29612]: I0319 12:29:08.282943 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 19 12:29:08.285171 master-0 kubenswrapper[29612]: I0319 12:29:08.285131 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 19 12:29:08.285361 master-0 kubenswrapper[29612]: I0319 12:29:08.285345 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 19 12:29:08.315778 master-0 kubenswrapper[29612]: I0319 12:29:08.315718 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 12:29:08.413503 master-0 kubenswrapper[29612]: I0319 12:29:08.413456 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-openstack-config\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.413816 master-0 kubenswrapper[29612]: I0319 12:29:08.413624 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.413816 master-0 kubenswrapper[29612]: I0319 12:29:08.413774 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw4fd\" (UniqueName: \"kubernetes.io/projected/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-kube-api-access-nw4fd\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.413816 master-0 kubenswrapper[29612]: I0319 12:29:08.413806 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-openstack-config-secret\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.518141 master-0 kubenswrapper[29612]: I0319 12:29:08.517728 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw4fd\" (UniqueName: \"kubernetes.io/projected/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-kube-api-access-nw4fd\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.518141 master-0 kubenswrapper[29612]: I0319 12:29:08.517798 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-openstack-config-secret\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.518141 master-0 kubenswrapper[29612]: I0319 12:29:08.518042 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-openstack-config\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.518479 master-0 kubenswrapper[29612]: I0319 12:29:08.518266 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.519433 master-0 kubenswrapper[29612]: I0319 12:29:08.519393 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-openstack-config\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.523526 master-0 kubenswrapper[29612]: I0319 12:29:08.523474 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-openstack-config-secret\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.523817 master-0 kubenswrapper[29612]: I0319 12:29:08.523781 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.560701 master-0 kubenswrapper[29612]: I0319 12:29:08.558930 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw4fd\" (UniqueName: \"kubernetes.io/projected/0011ebb3-0c56-4a97-a5cd-ef8560ff56e7-kube-api-access-nw4fd\") pod \"openstackclient\" (UID: \"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7\") " pod="openstack/openstackclient" Mar 19 12:29:08.740347 master-0 kubenswrapper[29612]: I0319 12:29:08.739942 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 19 12:29:09.040360 master-0 kubenswrapper[29612]: E0319 12:29:09.040282 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26 is running failed: container process not found" containerID="d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26" cmd=["/bin/true"] Mar 19 12:29:09.040690 master-0 kubenswrapper[29612]: E0319 12:29:09.040441 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26 is running failed: container process not found" containerID="d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26" cmd=["/bin/true"] Mar 19 12:29:09.043147 master-0 kubenswrapper[29612]: E0319 12:29:09.043101 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26 is running failed: container process not found" containerID="d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26" cmd=["/bin/true"] Mar 19 12:29:09.043911 master-0 kubenswrapper[29612]: E0319 12:29:09.043806 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26 is running failed: container process not found" containerID="d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26" cmd=["/bin/true"] Mar 19 12:29:09.044101 master-0 kubenswrapper[29612]: E0319 12:29:09.043994 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26 is running failed: container process not found" containerID="d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26" cmd=["/bin/true"] Mar 19 12:29:09.044169 master-0 kubenswrapper[29612]: E0319 12:29:09.044091 29612 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26 is running failed: container process not found" probeType="Readiness" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" podUID="7257edc5-e7bf-41f7-8c7d-97b4d4d83b64" containerName="ironic-neutron-agent" Mar 19 12:29:09.044304 master-0 kubenswrapper[29612]: E0319 12:29:09.044256 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26 is running failed: container process not found" containerID="d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26" cmd=["/bin/true"] Mar 19 12:29:09.044375 master-0 kubenswrapper[29612]: E0319 12:29:09.044308 29612 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26 is running failed: container process not found" probeType="Liveness" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" podUID="7257edc5-e7bf-41f7-8c7d-97b4d4d83b64" containerName="ironic-neutron-agent" Mar 19 12:29:09.118939 master-0 kubenswrapper[29612]: I0319 12:29:09.118790 29612 generic.go:334] "Generic (PLEG): container finished" podID="7257edc5-e7bf-41f7-8c7d-97b4d4d83b64" containerID="d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26" exitCode=1 Mar 19 12:29:09.119660 master-0 kubenswrapper[29612]: I0319 
12:29:09.119549 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" event={"ID":"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64","Type":"ContainerDied","Data":"d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26"} Mar 19 12:29:09.121046 master-0 kubenswrapper[29612]: I0319 12:29:09.121009 29612 scope.go:117] "RemoveContainer" containerID="d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26" Mar 19 12:29:09.123340 master-0 kubenswrapper[29612]: I0319 12:29:09.122065 29612 generic.go:334] "Generic (PLEG): container finished" podID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerID="6d9a0090ce988131f4855c5b4248cb5067f742ec7925c02c435a1445857223af" exitCode=0 Mar 19 12:29:09.123340 master-0 kubenswrapper[29612]: I0319 12:29:09.122345 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6d6786d9f7-6fz2d" event={"ID":"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2","Type":"ContainerDied","Data":"6d9a0090ce988131f4855c5b4248cb5067f742ec7925c02c435a1445857223af"} Mar 19 12:29:09.128440 master-0 kubenswrapper[29612]: I0319 12:29:09.128383 29612 generic.go:334] "Generic (PLEG): container finished" podID="e5dadace-e3b9-4819-b064-e4ea49c98605" containerID="e78b75cc49209e2ae7b3c9256f5dc7be8812d746c4415b616f7420a9fadf718c" exitCode=0 Mar 19 12:29:09.128530 master-0 kubenswrapper[29612]: I0319 12:29:09.128464 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74f4876497-jcnql" event={"ID":"e5dadace-e3b9-4819-b064-e4ea49c98605","Type":"ContainerDied","Data":"e78b75cc49209e2ae7b3c9256f5dc7be8812d746c4415b616f7420a9fadf718c"} Mar 19 12:29:09.318289 master-0 kubenswrapper[29612]: I0319 12:29:09.318232 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 12:29:09.867185 master-0 kubenswrapper[29612]: I0319 12:29:09.866772 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6" Mar 19 12:29:10.029895 master-0 kubenswrapper[29612]: I0319 12:29:10.029827 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78975f67c-cdl4g"] Mar 19 12:29:10.030730 master-0 kubenswrapper[29612]: I0319 12:29:10.030179 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" podUID="5681835f-5e92-40aa-a647-62b3914261ed" containerName="dnsmasq-dns" containerID="cri-o://1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba" gracePeriod=10 Mar 19 12:29:10.194024 master-0 kubenswrapper[29612]: I0319 12:29:10.193955 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" event={"ID":"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64","Type":"ContainerStarted","Data":"b7555116d7d97d5e39ab90913282f9e6d09ee7bc33bd088a6b4197346688856f"} Mar 19 12:29:10.195503 master-0 kubenswrapper[29612]: I0319 12:29:10.195441 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" Mar 19 12:29:10.201593 master-0 kubenswrapper[29612]: I0319 12:29:10.199531 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7","Type":"ContainerStarted","Data":"f67c003ed0e61cf1362ac8c7c8e0baef40981cd651884f3cc6441a85bc8093b4"} Mar 19 12:29:10.214997 master-0 kubenswrapper[29612]: I0319 12:29:10.214240 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6d6786d9f7-6fz2d" event={"ID":"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2","Type":"ContainerStarted","Data":"cc1ff77522607b7a7436cf093b977681ff6ae854292bb9e512d4d424092b55df"} Mar 19 12:29:10.214997 master-0 kubenswrapper[29612]: I0319 12:29:10.214300 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6d6786d9f7-6fz2d" 
event={"ID":"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2","Type":"ContainerStarted","Data":"436b9e409e3c03f1804e33525028cdf07ca820feb1e1199bc639f5f9b01f3e38"} Mar 19 12:29:10.216739 master-0 kubenswrapper[29612]: I0319 12:29:10.215674 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-6d6786d9f7-6fz2d" Mar 19 12:29:10.225191 master-0 kubenswrapper[29612]: I0319 12:29:10.225109 29612 generic.go:334] "Generic (PLEG): container finished" podID="8ed03592-eec9-4173-8239-217452bc4579" containerID="d9a6dfd4111deae6b87568d4cbf36695c27de1fe9e6a67dd0a1fa54873062812" exitCode=0 Mar 19 12:29:10.225339 master-0 kubenswrapper[29612]: I0319 12:29:10.225228 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"8ed03592-eec9-4173-8239-217452bc4579","Type":"ContainerDied","Data":"d9a6dfd4111deae6b87568d4cbf36695c27de1fe9e6a67dd0a1fa54873062812"} Mar 19 12:29:10.242202 master-0 kubenswrapper[29612]: I0319 12:29:10.240170 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74f4876497-jcnql" event={"ID":"e5dadace-e3b9-4819-b064-e4ea49c98605","Type":"ContainerStarted","Data":"8e7c2d127a102601ea531584c6c3f2577e781f00d8743b9d6df18ad3e5f1c36b"} Mar 19 12:29:10.242202 master-0 kubenswrapper[29612]: I0319 12:29:10.241436 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:10.256834 master-0 kubenswrapper[29612]: I0319 12:29:10.256747 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-6d6786d9f7-6fz2d" podStartSLOduration=5.127618448 podStartE2EDuration="11.256723967s" podCreationTimestamp="2026-03-19 12:28:59 +0000 UTC" firstStartedPulling="2026-03-19 12:29:01.116887024 +0000 UTC m=+1148.431056045" lastFinishedPulling="2026-03-19 12:29:07.245992543 +0000 UTC m=+1154.560161564" observedRunningTime="2026-03-19 12:29:10.251031466 +0000 UTC m=+1157.565200507" 
watchObservedRunningTime="2026-03-19 12:29:10.256723967 +0000 UTC m=+1157.570892998" Mar 19 12:29:10.317899 master-0 kubenswrapper[29612]: I0319 12:29:10.315496 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-74f4876497-jcnql" podStartSLOduration=5.709625124 podStartE2EDuration="7.315475537s" podCreationTimestamp="2026-03-19 12:29:03 +0000 UTC" firstStartedPulling="2026-03-19 12:29:05.644117303 +0000 UTC m=+1152.958286324" lastFinishedPulling="2026-03-19 12:29:07.249967716 +0000 UTC m=+1154.564136737" observedRunningTime="2026-03-19 12:29:10.292050931 +0000 UTC m=+1157.606219952" watchObservedRunningTime="2026-03-19 12:29:10.315475537 +0000 UTC m=+1157.629644558" Mar 19 12:29:10.690623 master-0 kubenswrapper[29612]: E0319 12:29:10.689493 29612 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f4fdfe1_2708_47d8_a72f_1a9f4a03caa2.slice/crio-cc1ff77522607b7a7436cf093b977681ff6ae854292bb9e512d4d424092b55df.scope\": RecentStats: unable to find data in memory cache]" Mar 19 12:29:10.981793 master-0 kubenswrapper[29612]: I0319 12:29:10.981668 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:29:10.997720 master-0 kubenswrapper[29612]: I0319 12:29:10.997384 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-config\") pod \"5681835f-5e92-40aa-a647-62b3914261ed\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " Mar 19 12:29:10.997720 master-0 kubenswrapper[29612]: I0319 12:29:10.997588 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-sb\") pod \"5681835f-5e92-40aa-a647-62b3914261ed\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " Mar 19 12:29:10.997720 master-0 kubenswrapper[29612]: I0319 12:29:10.997620 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-swift-storage-0\") pod \"5681835f-5e92-40aa-a647-62b3914261ed\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " Mar 19 12:29:10.998039 master-0 kubenswrapper[29612]: I0319 12:29:10.997811 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-svc\") pod \"5681835f-5e92-40aa-a647-62b3914261ed\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " Mar 19 12:29:10.998039 master-0 kubenswrapper[29612]: I0319 12:29:10.997926 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gc9m\" (UniqueName: \"kubernetes.io/projected/5681835f-5e92-40aa-a647-62b3914261ed-kube-api-access-4gc9m\") pod \"5681835f-5e92-40aa-a647-62b3914261ed\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " Mar 19 12:29:10.998039 master-0 kubenswrapper[29612]: I0319 12:29:10.997964 29612 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-nb\") pod \"5681835f-5e92-40aa-a647-62b3914261ed\" (UID: \"5681835f-5e92-40aa-a647-62b3914261ed\") " Mar 19 12:29:11.027937 master-0 kubenswrapper[29612]: I0319 12:29:11.026980 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5681835f-5e92-40aa-a647-62b3914261ed-kube-api-access-4gc9m" (OuterVolumeSpecName: "kube-api-access-4gc9m") pod "5681835f-5e92-40aa-a647-62b3914261ed" (UID: "5681835f-5e92-40aa-a647-62b3914261ed"). InnerVolumeSpecName "kube-api-access-4gc9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:11.083472 master-0 kubenswrapper[29612]: I0319 12:29:11.083396 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5681835f-5e92-40aa-a647-62b3914261ed" (UID: "5681835f-5e92-40aa-a647-62b3914261ed"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:11.103958 master-0 kubenswrapper[29612]: I0319 12:29:11.103890 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gc9m\" (UniqueName: \"kubernetes.io/projected/5681835f-5e92-40aa-a647-62b3914261ed-kube-api-access-4gc9m\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:11.103958 master-0 kubenswrapper[29612]: I0319 12:29:11.103954 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:11.108420 master-0 kubenswrapper[29612]: I0319 12:29:11.108171 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5681835f-5e92-40aa-a647-62b3914261ed" (UID: "5681835f-5e92-40aa-a647-62b3914261ed"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:11.108420 master-0 kubenswrapper[29612]: I0319 12:29:11.108226 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-config" (OuterVolumeSpecName: "config") pod "5681835f-5e92-40aa-a647-62b3914261ed" (UID: "5681835f-5e92-40aa-a647-62b3914261ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:11.111577 master-0 kubenswrapper[29612]: I0319 12:29:11.111524 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5681835f-5e92-40aa-a647-62b3914261ed" (UID: "5681835f-5e92-40aa-a647-62b3914261ed"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:11.175657 master-0 kubenswrapper[29612]: I0319 12:29:11.168205 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5681835f-5e92-40aa-a647-62b3914261ed" (UID: "5681835f-5e92-40aa-a647-62b3914261ed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:11.217838 master-0 kubenswrapper[29612]: I0319 12:29:11.206329 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:11.217838 master-0 kubenswrapper[29612]: I0319 12:29:11.206372 29612 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:11.217838 master-0 kubenswrapper[29612]: I0319 12:29:11.206385 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:11.217838 master-0 kubenswrapper[29612]: I0319 12:29:11.206394 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5681835f-5e92-40aa-a647-62b3914261ed-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:11.269725 master-0 kubenswrapper[29612]: I0319 12:29:11.263237 29612 generic.go:334] "Generic (PLEG): container finished" podID="5681835f-5e92-40aa-a647-62b3914261ed" containerID="1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba" exitCode=0 Mar 19 12:29:11.269725 master-0 kubenswrapper[29612]: I0319 12:29:11.263318 29612 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" Mar 19 12:29:11.269725 master-0 kubenswrapper[29612]: I0319 12:29:11.263284 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" event={"ID":"5681835f-5e92-40aa-a647-62b3914261ed","Type":"ContainerDied","Data":"1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba"} Mar 19 12:29:11.269725 master-0 kubenswrapper[29612]: I0319 12:29:11.263470 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78975f67c-cdl4g" event={"ID":"5681835f-5e92-40aa-a647-62b3914261ed","Type":"ContainerDied","Data":"9624563e3e9508146f8a6613394581e0bac266bec8960f3ed41d3079b3ab073c"} Mar 19 12:29:11.269725 master-0 kubenswrapper[29612]: I0319 12:29:11.263524 29612 scope.go:117] "RemoveContainer" containerID="1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba" Mar 19 12:29:11.275693 master-0 kubenswrapper[29612]: I0319 12:29:11.275366 29612 generic.go:334] "Generic (PLEG): container finished" podID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerID="cc1ff77522607b7a7436cf093b977681ff6ae854292bb9e512d4d424092b55df" exitCode=1 Mar 19 12:29:11.275693 master-0 kubenswrapper[29612]: I0319 12:29:11.275433 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6d6786d9f7-6fz2d" event={"ID":"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2","Type":"ContainerDied","Data":"cc1ff77522607b7a7436cf093b977681ff6ae854292bb9e512d4d424092b55df"} Mar 19 12:29:11.276775 master-0 kubenswrapper[29612]: I0319 12:29:11.276651 29612 scope.go:117] "RemoveContainer" containerID="cc1ff77522607b7a7436cf093b977681ff6ae854292bb9e512d4d424092b55df" Mar 19 12:29:11.287334 master-0 kubenswrapper[29612]: I0319 12:29:11.286474 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-74f4876497-jcnql" 
event={"ID":"e5dadace-e3b9-4819-b064-e4ea49c98605","Type":"ContainerStarted","Data":"9a14951944dc40fae74094c0d14a112b2e63a63396604a94f82424cdf6eb97de"} Mar 19 12:29:11.412220 master-0 kubenswrapper[29612]: I0319 12:29:11.411576 29612 scope.go:117] "RemoveContainer" containerID="3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626" Mar 19 12:29:11.419967 master-0 kubenswrapper[29612]: I0319 12:29:11.419908 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78975f67c-cdl4g"] Mar 19 12:29:11.431900 master-0 kubenswrapper[29612]: I0319 12:29:11.431848 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78975f67c-cdl4g"] Mar 19 12:29:11.512629 master-0 kubenswrapper[29612]: I0319 12:29:11.512203 29612 scope.go:117] "RemoveContainer" containerID="1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba" Mar 19 12:29:11.514311 master-0 kubenswrapper[29612]: E0319 12:29:11.514279 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba\": container with ID starting with 1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba not found: ID does not exist" containerID="1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba" Mar 19 12:29:11.514377 master-0 kubenswrapper[29612]: I0319 12:29:11.514321 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba"} err="failed to get container status \"1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba\": rpc error: code = NotFound desc = could not find container \"1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba\": container with ID starting with 1525f8a23321116adbbdfede41be2bb4754ca1afedfa886f79c59c8f7acfa9ba not found: ID does not exist" Mar 19 
12:29:11.514377 master-0 kubenswrapper[29612]: I0319 12:29:11.514345 29612 scope.go:117] "RemoveContainer" containerID="3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626" Mar 19 12:29:11.518763 master-0 kubenswrapper[29612]: E0319 12:29:11.517817 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626\": container with ID starting with 3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626 not found: ID does not exist" containerID="3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626" Mar 19 12:29:11.518763 master-0 kubenswrapper[29612]: I0319 12:29:11.517850 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626"} err="failed to get container status \"3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626\": rpc error: code = NotFound desc = could not find container \"3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626\": container with ID starting with 3e67242e5a68c1795e7eaa7fc4f89f3855e8ca7659fdf9f419542f6d643d5626 not found: ID does not exist" Mar 19 12:29:12.307034 master-0 kubenswrapper[29612]: I0319 12:29:12.306893 29612 generic.go:334] "Generic (PLEG): container finished" podID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerID="f3871672910c47bbaee8a2f86c2df16fa38270d9b2a678ea64d4250375975575" exitCode=1 Mar 19 12:29:12.309435 master-0 kubenswrapper[29612]: I0319 12:29:12.309312 29612 scope.go:117] "RemoveContainer" containerID="f3871672910c47bbaee8a2f86c2df16fa38270d9b2a678ea64d4250375975575" Mar 19 12:29:12.309813 master-0 kubenswrapper[29612]: I0319 12:29:12.309721 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6d6786d9f7-6fz2d" 
event={"ID":"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2","Type":"ContainerDied","Data":"f3871672910c47bbaee8a2f86c2df16fa38270d9b2a678ea64d4250375975575"} Mar 19 12:29:12.309813 master-0 kubenswrapper[29612]: I0319 12:29:12.309786 29612 scope.go:117] "RemoveContainer" containerID="cc1ff77522607b7a7436cf093b977681ff6ae854292bb9e512d4d424092b55df" Mar 19 12:29:12.310423 master-0 kubenswrapper[29612]: E0319 12:29:12.310376 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-6d6786d9f7-6fz2d_openstack(2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2)\"" pod="openstack/ironic-6d6786d9f7-6fz2d" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" Mar 19 12:29:12.933458 master-0 kubenswrapper[29612]: I0319 12:29:12.929169 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5681835f-5e92-40aa-a647-62b3914261ed" path="/var/lib/kubelet/pods/5681835f-5e92-40aa-a647-62b3914261ed/volumes" Mar 19 12:29:13.344185 master-0 kubenswrapper[29612]: I0319 12:29:13.328883 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-rnrf5"] Mar 19 12:29:13.344185 master-0 kubenswrapper[29612]: E0319 12:29:13.329674 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5681835f-5e92-40aa-a647-62b3914261ed" containerName="init" Mar 19 12:29:13.344185 master-0 kubenswrapper[29612]: I0319 12:29:13.329772 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="5681835f-5e92-40aa-a647-62b3914261ed" containerName="init" Mar 19 12:29:13.344185 master-0 kubenswrapper[29612]: E0319 12:29:13.329832 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5681835f-5e92-40aa-a647-62b3914261ed" containerName="dnsmasq-dns" Mar 19 12:29:13.344185 master-0 kubenswrapper[29612]: I0319 12:29:13.329840 29612 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5681835f-5e92-40aa-a647-62b3914261ed" containerName="dnsmasq-dns" Mar 19 12:29:13.344185 master-0 kubenswrapper[29612]: I0319 12:29:13.330063 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="5681835f-5e92-40aa-a647-62b3914261ed" containerName="dnsmasq-dns" Mar 19 12:29:13.344185 master-0 kubenswrapper[29612]: I0319 12:29:13.331097 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.344185 master-0 kubenswrapper[29612]: I0319 12:29:13.336316 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 19 12:29:13.359419 master-0 kubenswrapper[29612]: I0319 12:29:13.355705 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-rnrf5"] Mar 19 12:29:13.363059 master-0 kubenswrapper[29612]: I0319 12:29:13.363002 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 19 12:29:13.387174 master-0 kubenswrapper[29612]: I0319 12:29:13.387120 29612 generic.go:334] "Generic (PLEG): container finished" podID="7257edc5-e7bf-41f7-8c7d-97b4d4d83b64" containerID="b7555116d7d97d5e39ab90913282f9e6d09ee7bc33bd088a6b4197346688856f" exitCode=1 Mar 19 12:29:13.387174 master-0 kubenswrapper[29612]: I0319 12:29:13.387174 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" event={"ID":"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64","Type":"ContainerDied","Data":"b7555116d7d97d5e39ab90913282f9e6d09ee7bc33bd088a6b4197346688856f"} Mar 19 12:29:13.387451 master-0 kubenswrapper[29612]: I0319 12:29:13.387218 29612 scope.go:117] "RemoveContainer" containerID="d114bc86e8619cbadcd965e512aa0cbf51dc99f061b0cb6bd9c7b76d70115f26" Mar 19 12:29:13.407169 master-0 kubenswrapper[29612]: I0319 12:29:13.407103 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-config\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.407368 master-0 kubenswrapper[29612]: I0319 12:29:13.407176 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-combined-ca-bundle\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.407368 master-0 kubenswrapper[29612]: I0319 12:29:13.407216 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.407368 master-0 kubenswrapper[29612]: I0319 12:29:13.407267 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.407583 master-0 kubenswrapper[29612]: I0319 12:29:13.407566 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-scripts\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 
12:29:13.407742 master-0 kubenswrapper[29612]: I0319 12:29:13.407721 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-etc-podinfo\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.407916 master-0 kubenswrapper[29612]: I0319 12:29:13.407897 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sflxl\" (UniqueName: \"kubernetes.io/projected/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-kube-api-access-sflxl\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.412735 master-0 kubenswrapper[29612]: I0319 12:29:13.412667 29612 scope.go:117] "RemoveContainer" containerID="f3871672910c47bbaee8a2f86c2df16fa38270d9b2a678ea64d4250375975575" Mar 19 12:29:13.421108 master-0 kubenswrapper[29612]: E0319 12:29:13.416680 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-6d6786d9f7-6fz2d_openstack(2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2)\"" pod="openstack/ironic-6d6786d9f7-6fz2d" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" Mar 19 12:29:13.426327 master-0 kubenswrapper[29612]: I0319 12:29:13.426256 29612 scope.go:117] "RemoveContainer" containerID="b7555116d7d97d5e39ab90913282f9e6d09ee7bc33bd088a6b4197346688856f" Mar 19 12:29:13.427223 master-0 kubenswrapper[29612]: E0319 12:29:13.427156 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent 
pod=ironic-neutron-agent-c6b86788c-dtf24_openstack(7257edc5-e7bf-41f7-8c7d-97b4d4d83b64)\"" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" podUID="7257edc5-e7bf-41f7-8c7d-97b4d4d83b64" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.509425 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-etc-podinfo\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.509563 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sflxl\" (UniqueName: \"kubernetes.io/projected/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-kube-api-access-sflxl\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.509933 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-config\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.509970 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-combined-ca-bundle\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.509999 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" 
(UniqueName: \"kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.510035 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.510137 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-scripts\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.511154 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.511413 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.515464 
29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-scripts\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.518236 master-0 kubenswrapper[29612]: I0319 12:29:13.516891 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-config\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.520093 master-0 kubenswrapper[29612]: I0319 12:29:13.520004 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-etc-podinfo\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.524529 master-0 kubenswrapper[29612]: I0319 12:29:13.524476 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-combined-ca-bundle\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.532167 master-0 kubenswrapper[29612]: I0319 12:29:13.532107 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sflxl\" (UniqueName: \"kubernetes.io/projected/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-kube-api-access-sflxl\") pod \"ironic-inspector-db-sync-rnrf5\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:13.686249 master-0 kubenswrapper[29612]: I0319 12:29:13.686195 29612 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:14.037917 master-0 kubenswrapper[29612]: I0319 12:29:14.037832 29612 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" Mar 19 12:29:14.106799 master-0 kubenswrapper[29612]: I0319 12:29:14.106082 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5796698f9c-ltjks" Mar 19 12:29:14.432234 master-0 kubenswrapper[29612]: I0319 12:29:14.432079 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-rnrf5"] Mar 19 12:29:14.446164 master-0 kubenswrapper[29612]: I0319 12:29:14.445517 29612 scope.go:117] "RemoveContainer" containerID="b7555116d7d97d5e39ab90913282f9e6d09ee7bc33bd088a6b4197346688856f" Mar 19 12:29:14.446164 master-0 kubenswrapper[29612]: E0319 12:29:14.445830 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-c6b86788c-dtf24_openstack(7257edc5-e7bf-41f7-8c7d-97b4d4d83b64)\"" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" podUID="7257edc5-e7bf-41f7-8c7d-97b4d4d83b64" Mar 19 12:29:14.455892 master-0 kubenswrapper[29612]: I0319 12:29:14.455473 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b4b84764-w6p2r_37c49776-7c55-42c9-9765-18af1ce57ce4/neutron-api/0.log" Mar 19 12:29:14.455892 master-0 kubenswrapper[29612]: I0319 12:29:14.455794 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b4b84764-w6p2r" event={"ID":"37c49776-7c55-42c9-9765-18af1ce57ce4","Type":"ContainerDied","Data":"2231b252621b7676769dc4e9ccb64339a152dc5298136911320816ff7ed5d573"} Mar 19 12:29:14.455892 master-0 kubenswrapper[29612]: I0319 12:29:14.455838 29612 generic.go:334] 
"Generic (PLEG): container finished" podID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerID="2231b252621b7676769dc4e9ccb64339a152dc5298136911320816ff7ed5d573" exitCode=137 Mar 19 12:29:14.471679 master-0 kubenswrapper[29612]: I0319 12:29:14.471134 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77bcfcb874-5q7wl"] Mar 19 12:29:14.471679 master-0 kubenswrapper[29612]: I0319 12:29:14.471437 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77bcfcb874-5q7wl" podUID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" containerName="neutron-api" containerID="cri-o://e49fb06ec118d446a356cd045b9abb7f736c5d0b5fb9674ec53fc16be70f30d4" gracePeriod=30 Mar 19 12:29:14.471679 master-0 kubenswrapper[29612]: I0319 12:29:14.471562 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77bcfcb874-5q7wl" podUID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" containerName="neutron-httpd" containerID="cri-o://4902263a87f1037c2d5f01561807c0efbf819f76dac185c72d8320772c9ec2a0" gracePeriod=30 Mar 19 12:29:14.699548 master-0 kubenswrapper[29612]: I0319 12:29:14.699499 29612 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-6d6786d9f7-6fz2d" Mar 19 12:29:14.700062 master-0 kubenswrapper[29612]: I0319 12:29:14.700025 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-6d6786d9f7-6fz2d" Mar 19 12:29:14.700716 master-0 kubenswrapper[29612]: I0319 12:29:14.700574 29612 scope.go:117] "RemoveContainer" containerID="f3871672910c47bbaee8a2f86c2df16fa38270d9b2a678ea64d4250375975575" Mar 19 12:29:14.700859 master-0 kubenswrapper[29612]: E0319 12:29:14.700825 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api 
pod=ironic-6d6786d9f7-6fz2d_openstack(2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2)\"" pod="openstack/ironic-6d6786d9f7-6fz2d" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" Mar 19 12:29:15.478819 master-0 kubenswrapper[29612]: I0319 12:29:15.478739 29612 scope.go:117] "RemoveContainer" containerID="f3871672910c47bbaee8a2f86c2df16fa38270d9b2a678ea64d4250375975575" Mar 19 12:29:15.479314 master-0 kubenswrapper[29612]: E0319 12:29:15.479000 29612 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-6d6786d9f7-6fz2d_openstack(2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2)\"" pod="openstack/ironic-6d6786d9f7-6fz2d" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" Mar 19 12:29:15.988846 master-0 kubenswrapper[29612]: I0319 12:29:15.988503 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-97d484c8-dtsbv"] Mar 19 12:29:15.991491 master-0 kubenswrapper[29612]: I0319 12:29:15.991432 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:15.994142 master-0 kubenswrapper[29612]: I0319 12:29:15.994075 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 19 12:29:15.994471 master-0 kubenswrapper[29612]: I0319 12:29:15.994397 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 19 12:29:15.994598 master-0 kubenswrapper[29612]: I0319 12:29:15.994578 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 12:29:16.034305 master-0 kubenswrapper[29612]: I0319 12:29:16.030233 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-etc-swift\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.034305 master-0 kubenswrapper[29612]: I0319 12:29:16.030359 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-combined-ca-bundle\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.034305 master-0 kubenswrapper[29612]: I0319 12:29:16.030432 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-public-tls-certs\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.034305 master-0 kubenswrapper[29612]: I0319 12:29:16.030454 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-run-httpd\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.047954 master-0 kubenswrapper[29612]: I0319 12:29:16.047902 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-internal-tls-certs\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.048164 master-0 kubenswrapper[29612]: I0319 12:29:16.047994 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-log-httpd\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.048896 master-0 kubenswrapper[29612]: I0319 12:29:16.048862 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-config-data\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.048982 master-0 kubenswrapper[29612]: I0319 12:29:16.048899 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwjtz\" (UniqueName: \"kubernetes.io/projected/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-kube-api-access-pwjtz\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.058199 
master-0 kubenswrapper[29612]: I0319 12:29:16.058136 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-97d484c8-dtsbv"] Mar 19 12:29:16.169665 master-0 kubenswrapper[29612]: I0319 12:29:16.168703 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-config-data\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.169665 master-0 kubenswrapper[29612]: I0319 12:29:16.168780 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwjtz\" (UniqueName: \"kubernetes.io/projected/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-kube-api-access-pwjtz\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.169665 master-0 kubenswrapper[29612]: I0319 12:29:16.168881 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-etc-swift\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.169665 master-0 kubenswrapper[29612]: I0319 12:29:16.168996 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-combined-ca-bundle\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.169665 master-0 kubenswrapper[29612]: I0319 12:29:16.169026 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-public-tls-certs\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.169665 master-0 kubenswrapper[29612]: I0319 12:29:16.169070 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-run-httpd\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.169665 master-0 kubenswrapper[29612]: I0319 12:29:16.169154 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-internal-tls-certs\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.169665 master-0 kubenswrapper[29612]: I0319 12:29:16.169179 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-log-httpd\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.170267 master-0 kubenswrapper[29612]: I0319 12:29:16.169956 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-log-httpd\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.170933 master-0 kubenswrapper[29612]: I0319 12:29:16.170895 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-run-httpd\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.179735 master-0 kubenswrapper[29612]: I0319 12:29:16.179678 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-public-tls-certs\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.192840 master-0 kubenswrapper[29612]: I0319 12:29:16.192769 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-etc-swift\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.194031 master-0 kubenswrapper[29612]: I0319 12:29:16.193258 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-config-data\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.194031 master-0 kubenswrapper[29612]: I0319 12:29:16.193761 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-internal-tls-certs\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.198994 master-0 kubenswrapper[29612]: I0319 12:29:16.194527 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-combined-ca-bundle\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:16.209873 master-0 kubenswrapper[29612]: I0319 12:29:16.204352 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mncvw"] Mar 19 12:29:16.210121 master-0 kubenswrapper[29612]: I0319 12:29:16.209899 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mncvw" Mar 19 12:29:16.343841 master-0 kubenswrapper[29612]: I0319 12:29:16.343789 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mncvw"] Mar 19 12:29:16.375824 master-0 kubenswrapper[29612]: I0319 12:29:16.375755 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b120a0-2c7e-43de-982d-0edc209ebafa-operator-scripts\") pod \"nova-api-db-create-mncvw\" (UID: \"82b120a0-2c7e-43de-982d-0edc209ebafa\") " pod="openstack/nova-api-db-create-mncvw" Mar 19 12:29:16.376002 master-0 kubenswrapper[29612]: I0319 12:29:16.375934 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls7cg\" (UniqueName: \"kubernetes.io/projected/82b120a0-2c7e-43de-982d-0edc209ebafa-kube-api-access-ls7cg\") pod \"nova-api-db-create-mncvw\" (UID: \"82b120a0-2c7e-43de-982d-0edc209ebafa\") " pod="openstack/nova-api-db-create-mncvw" Mar 19 12:29:16.424567 master-0 kubenswrapper[29612]: I0319 12:29:16.424510 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwjtz\" (UniqueName: \"kubernetes.io/projected/9836e2b7-486b-4c01-ad9a-f63c9a0f08e3-kube-api-access-pwjtz\") pod \"swift-proxy-97d484c8-dtsbv\" (UID: \"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3\") " pod="openstack/swift-proxy-97d484c8-dtsbv" 
Mar 19 12:29:16.479661 master-0 kubenswrapper[29612]: I0319 12:29:16.478689 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b120a0-2c7e-43de-982d-0edc209ebafa-operator-scripts\") pod \"nova-api-db-create-mncvw\" (UID: \"82b120a0-2c7e-43de-982d-0edc209ebafa\") " pod="openstack/nova-api-db-create-mncvw" Mar 19 12:29:16.479661 master-0 kubenswrapper[29612]: I0319 12:29:16.478940 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls7cg\" (UniqueName: \"kubernetes.io/projected/82b120a0-2c7e-43de-982d-0edc209ebafa-kube-api-access-ls7cg\") pod \"nova-api-db-create-mncvw\" (UID: \"82b120a0-2c7e-43de-982d-0edc209ebafa\") " pod="openstack/nova-api-db-create-mncvw" Mar 19 12:29:16.484163 master-0 kubenswrapper[29612]: I0319 12:29:16.481795 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b120a0-2c7e-43de-982d-0edc209ebafa-operator-scripts\") pod \"nova-api-db-create-mncvw\" (UID: \"82b120a0-2c7e-43de-982d-0edc209ebafa\") " pod="openstack/nova-api-db-create-mncvw" Mar 19 12:29:16.612741 master-0 kubenswrapper[29612]: I0319 12:29:16.605432 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls7cg\" (UniqueName: \"kubernetes.io/projected/82b120a0-2c7e-43de-982d-0edc209ebafa-kube-api-access-ls7cg\") pod \"nova-api-db-create-mncvw\" (UID: \"82b120a0-2c7e-43de-982d-0edc209ebafa\") " pod="openstack/nova-api-db-create-mncvw" Mar 19 12:29:16.629670 master-0 kubenswrapper[29612]: I0319 12:29:16.624582 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mncvw" Mar 19 12:29:16.656787 master-0 kubenswrapper[29612]: I0319 12:29:16.656716 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:17.004126 master-0 kubenswrapper[29612]: I0319 12:29:17.004053 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-c8pd5"] Mar 19 12:29:17.006658 master-0 kubenswrapper[29612]: I0319 12:29:17.005687 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c8pd5" Mar 19 12:29:17.098251 master-0 kubenswrapper[29612]: I0319 12:29:17.097260 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76f6f\" (UniqueName: \"kubernetes.io/projected/57ef2db4-4e57-40b2-9293-8691f71495bd-kube-api-access-76f6f\") pod \"nova-cell0-db-create-c8pd5\" (UID: \"57ef2db4-4e57-40b2-9293-8691f71495bd\") " pod="openstack/nova-cell0-db-create-c8pd5" Mar 19 12:29:17.098251 master-0 kubenswrapper[29612]: I0319 12:29:17.097442 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57ef2db4-4e57-40b2-9293-8691f71495bd-operator-scripts\") pod \"nova-cell0-db-create-c8pd5\" (UID: \"57ef2db4-4e57-40b2-9293-8691f71495bd\") " pod="openstack/nova-cell0-db-create-c8pd5" Mar 19 12:29:17.141936 master-0 kubenswrapper[29612]: I0319 12:29:17.141489 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-c8pd5"] Mar 19 12:29:17.146479 master-0 kubenswrapper[29612]: I0319 12:29:17.146425 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b4b84764-w6p2r_37c49776-7c55-42c9-9765-18af1ce57ce4/neutron-api/0.log" Mar 19 12:29:17.146590 master-0 kubenswrapper[29612]: I0319 12:29:17.146521 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b4b84764-w6p2r" Mar 19 12:29:17.216993 master-0 kubenswrapper[29612]: I0319 12:29:17.216619 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76f6f\" (UniqueName: \"kubernetes.io/projected/57ef2db4-4e57-40b2-9293-8691f71495bd-kube-api-access-76f6f\") pod \"nova-cell0-db-create-c8pd5\" (UID: \"57ef2db4-4e57-40b2-9293-8691f71495bd\") " pod="openstack/nova-cell0-db-create-c8pd5" Mar 19 12:29:17.216993 master-0 kubenswrapper[29612]: I0319 12:29:17.216957 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57ef2db4-4e57-40b2-9293-8691f71495bd-operator-scripts\") pod \"nova-cell0-db-create-c8pd5\" (UID: \"57ef2db4-4e57-40b2-9293-8691f71495bd\") " pod="openstack/nova-cell0-db-create-c8pd5" Mar 19 12:29:17.221764 master-0 kubenswrapper[29612]: I0319 12:29:17.217898 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57ef2db4-4e57-40b2-9293-8691f71495bd-operator-scripts\") pod \"nova-cell0-db-create-c8pd5\" (UID: \"57ef2db4-4e57-40b2-9293-8691f71495bd\") " pod="openstack/nova-cell0-db-create-c8pd5" Mar 19 12:29:17.327933 master-0 kubenswrapper[29612]: I0319 12:29:17.327506 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-httpd-config\") pod \"37c49776-7c55-42c9-9765-18af1ce57ce4\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " Mar 19 12:29:17.327933 master-0 kubenswrapper[29612]: I0319 12:29:17.327668 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88jkv\" (UniqueName: \"kubernetes.io/projected/37c49776-7c55-42c9-9765-18af1ce57ce4-kube-api-access-88jkv\") pod \"37c49776-7c55-42c9-9765-18af1ce57ce4\" (UID: 
\"37c49776-7c55-42c9-9765-18af1ce57ce4\") " Mar 19 12:29:17.327933 master-0 kubenswrapper[29612]: I0319 12:29:17.327723 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-combined-ca-bundle\") pod \"37c49776-7c55-42c9-9765-18af1ce57ce4\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " Mar 19 12:29:17.329364 master-0 kubenswrapper[29612]: I0319 12:29:17.328329 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-config\") pod \"37c49776-7c55-42c9-9765-18af1ce57ce4\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " Mar 19 12:29:17.329364 master-0 kubenswrapper[29612]: I0319 12:29:17.328489 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-ovndb-tls-certs\") pod \"37c49776-7c55-42c9-9765-18af1ce57ce4\" (UID: \"37c49776-7c55-42c9-9765-18af1ce57ce4\") " Mar 19 12:29:17.333865 master-0 kubenswrapper[29612]: I0319 12:29:17.333815 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c49776-7c55-42c9-9765-18af1ce57ce4-kube-api-access-88jkv" (OuterVolumeSpecName: "kube-api-access-88jkv") pod "37c49776-7c55-42c9-9765-18af1ce57ce4" (UID: "37c49776-7c55-42c9-9765-18af1ce57ce4"). InnerVolumeSpecName "kube-api-access-88jkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:17.334581 master-0 kubenswrapper[29612]: I0319 12:29:17.334434 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "37c49776-7c55-42c9-9765-18af1ce57ce4" (UID: "37c49776-7c55-42c9-9765-18af1ce57ce4"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:17.424695 master-0 kubenswrapper[29612]: I0319 12:29:17.424615 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37c49776-7c55-42c9-9765-18af1ce57ce4" (UID: "37c49776-7c55-42c9-9765-18af1ce57ce4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:17.432358 master-0 kubenswrapper[29612]: I0319 12:29:17.432283 29612 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-httpd-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:17.432358 master-0 kubenswrapper[29612]: I0319 12:29:17.432320 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88jkv\" (UniqueName: \"kubernetes.io/projected/37c49776-7c55-42c9-9765-18af1ce57ce4-kube-api-access-88jkv\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:17.432358 master-0 kubenswrapper[29612]: I0319 12:29:17.432332 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:17.457549 master-0 kubenswrapper[29612]: I0319 12:29:17.457492 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-config" (OuterVolumeSpecName: "config") pod "37c49776-7c55-42c9-9765-18af1ce57ce4" (UID: "37c49776-7c55-42c9-9765-18af1ce57ce4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:17.479763 master-0 kubenswrapper[29612]: I0319 12:29:17.479713 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "37c49776-7c55-42c9-9765-18af1ce57ce4" (UID: "37c49776-7c55-42c9-9765-18af1ce57ce4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:17.545138 master-0 kubenswrapper[29612]: I0319 12:29:17.535142 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:17.545138 master-0 kubenswrapper[29612]: I0319 12:29:17.535488 29612 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c49776-7c55-42c9-9765-18af1ce57ce4-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:17.558421 master-0 kubenswrapper[29612]: I0319 12:29:17.558381 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b4b84764-w6p2r_37c49776-7c55-42c9-9765-18af1ce57ce4/neutron-api/0.log" Mar 19 12:29:17.558621 master-0 kubenswrapper[29612]: I0319 12:29:17.558518 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b4b84764-w6p2r" Mar 19 12:29:17.559470 master-0 kubenswrapper[29612]: I0319 12:29:17.558785 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b4b84764-w6p2r" event={"ID":"37c49776-7c55-42c9-9765-18af1ce57ce4","Type":"ContainerDied","Data":"32c6bd9a55e502753c61c7a18babe7ad035c98b11835a9fa8cfe5f8fe5ae9610"} Mar 19 12:29:17.559470 master-0 kubenswrapper[29612]: I0319 12:29:17.558855 29612 scope.go:117] "RemoveContainer" containerID="e928c6e701b6e0763a24d43a54cfeafbc0cc4b69443a3947128265ae31ca4d04" Mar 19 12:29:17.562129 master-0 kubenswrapper[29612]: I0319 12:29:17.562064 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-rnrf5" event={"ID":"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a","Type":"ContainerStarted","Data":"46a4c0bc3dffa290a3b6c7a36bd06d14c97502cf7029bd1182de097d9d948fb0"} Mar 19 12:29:17.564745 master-0 kubenswrapper[29612]: I0319 12:29:17.564684 29612 generic.go:334] "Generic (PLEG): container finished" podID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" containerID="4902263a87f1037c2d5f01561807c0efbf819f76dac185c72d8320772c9ec2a0" exitCode=0 Mar 19 12:29:17.564745 master-0 kubenswrapper[29612]: I0319 12:29:17.564713 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bcfcb874-5q7wl" event={"ID":"966779b3-45a4-4c95-8dfb-5e2b42f42cc1","Type":"ContainerDied","Data":"4902263a87f1037c2d5f01561807c0efbf819f76dac185c72d8320772c9ec2a0"} Mar 19 12:29:17.583023 master-0 kubenswrapper[29612]: I0319 12:29:17.582976 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8d17-account-create-update-twzr4"] Mar 19 12:29:17.583620 master-0 kubenswrapper[29612]: E0319 12:29:17.583592 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerName="neutron-api" Mar 19 12:29:17.583620 master-0 kubenswrapper[29612]: I0319 12:29:17.583614 29612 
state_mem.go:107] "Deleted CPUSet assignment" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerName="neutron-api" Mar 19 12:29:17.583724 master-0 kubenswrapper[29612]: E0319 12:29:17.583664 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerName="neutron-httpd" Mar 19 12:29:17.583724 master-0 kubenswrapper[29612]: I0319 12:29:17.583672 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerName="neutron-httpd" Mar 19 12:29:17.584741 master-0 kubenswrapper[29612]: I0319 12:29:17.584620 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerName="neutron-httpd" Mar 19 12:29:17.584741 master-0 kubenswrapper[29612]: I0319 12:29:17.584670 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" containerName="neutron-api" Mar 19 12:29:17.585882 master-0 kubenswrapper[29612]: I0319 12:29:17.585837 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8d17-account-create-update-twzr4" Mar 19 12:29:17.588646 master-0 kubenswrapper[29612]: I0319 12:29:17.588608 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 19 12:29:17.652321 master-0 kubenswrapper[29612]: I0319 12:29:17.648411 29612 scope.go:117] "RemoveContainer" containerID="2231b252621b7676769dc4e9ccb64339a152dc5298136911320816ff7ed5d573" Mar 19 12:29:17.741510 master-0 kubenswrapper[29612]: I0319 12:29:17.741464 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-operator-scripts\") pod \"nova-cell0-8d17-account-create-update-twzr4\" (UID: \"275eb2ad-4f11-47c4-93dd-b07ba6ef648e\") " pod="openstack/nova-cell0-8d17-account-create-update-twzr4" Mar 19 12:29:17.741589 master-0 kubenswrapper[29612]: I0319 12:29:17.741570 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnx66\" (UniqueName: \"kubernetes.io/projected/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-kube-api-access-cnx66\") pod \"nova-cell0-8d17-account-create-update-twzr4\" (UID: \"275eb2ad-4f11-47c4-93dd-b07ba6ef648e\") " pod="openstack/nova-cell0-8d17-account-create-update-twzr4" Mar 19 12:29:17.843865 master-0 kubenswrapper[29612]: I0319 12:29:17.843666 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-operator-scripts\") pod \"nova-cell0-8d17-account-create-update-twzr4\" (UID: \"275eb2ad-4f11-47c4-93dd-b07ba6ef648e\") " pod="openstack/nova-cell0-8d17-account-create-update-twzr4" Mar 19 12:29:17.844084 master-0 kubenswrapper[29612]: I0319 12:29:17.843861 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnx66\" 
(UniqueName: \"kubernetes.io/projected/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-kube-api-access-cnx66\") pod \"nova-cell0-8d17-account-create-update-twzr4\" (UID: \"275eb2ad-4f11-47c4-93dd-b07ba6ef648e\") " pod="openstack/nova-cell0-8d17-account-create-update-twzr4" Mar 19 12:29:17.845726 master-0 kubenswrapper[29612]: I0319 12:29:17.845686 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-operator-scripts\") pod \"nova-cell0-8d17-account-create-update-twzr4\" (UID: \"275eb2ad-4f11-47c4-93dd-b07ba6ef648e\") " pod="openstack/nova-cell0-8d17-account-create-update-twzr4" Mar 19 12:29:17.930196 master-0 kubenswrapper[29612]: I0319 12:29:17.930129 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-6ksst"] Mar 19 12:29:17.931711 master-0 kubenswrapper[29612]: I0319 12:29:17.931685 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6ksst" Mar 19 12:29:17.940185 master-0 kubenswrapper[29612]: I0319 12:29:17.940125 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-74f4876497-jcnql" Mar 19 12:29:18.049122 master-0 kubenswrapper[29612]: I0319 12:29:18.049072 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dff3f9d-f585-4561-a85e-51a6ba900d33-operator-scripts\") pod \"nova-cell1-db-create-6ksst\" (UID: \"0dff3f9d-f585-4561-a85e-51a6ba900d33\") " pod="openstack/nova-cell1-db-create-6ksst" Mar 19 12:29:18.049460 master-0 kubenswrapper[29612]: I0319 12:29:18.049437 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkccq\" (UniqueName: \"kubernetes.io/projected/0dff3f9d-f585-4561-a85e-51a6ba900d33-kube-api-access-lkccq\") pod 
\"nova-cell1-db-create-6ksst\" (UID: \"0dff3f9d-f585-4561-a85e-51a6ba900d33\") " pod="openstack/nova-cell1-db-create-6ksst" Mar 19 12:29:18.152228 master-0 kubenswrapper[29612]: I0319 12:29:18.151997 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dff3f9d-f585-4561-a85e-51a6ba900d33-operator-scripts\") pod \"nova-cell1-db-create-6ksst\" (UID: \"0dff3f9d-f585-4561-a85e-51a6ba900d33\") " pod="openstack/nova-cell1-db-create-6ksst" Mar 19 12:29:18.152551 master-0 kubenswrapper[29612]: I0319 12:29:18.152501 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkccq\" (UniqueName: \"kubernetes.io/projected/0dff3f9d-f585-4561-a85e-51a6ba900d33-kube-api-access-lkccq\") pod \"nova-cell1-db-create-6ksst\" (UID: \"0dff3f9d-f585-4561-a85e-51a6ba900d33\") " pod="openstack/nova-cell1-db-create-6ksst" Mar 19 12:29:18.153342 master-0 kubenswrapper[29612]: I0319 12:29:18.153307 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dff3f9d-f585-4561-a85e-51a6ba900d33-operator-scripts\") pod \"nova-cell1-db-create-6ksst\" (UID: \"0dff3f9d-f585-4561-a85e-51a6ba900d33\") " pod="openstack/nova-cell1-db-create-6ksst" Mar 19 12:29:18.990151 master-0 kubenswrapper[29612]: I0319 12:29:18.990082 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8d17-account-create-update-twzr4"] Mar 19 12:29:18.994740 master-0 kubenswrapper[29612]: I0319 12:29:18.994659 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76f6f\" (UniqueName: \"kubernetes.io/projected/57ef2db4-4e57-40b2-9293-8691f71495bd-kube-api-access-76f6f\") pod \"nova-cell0-db-create-c8pd5\" (UID: \"57ef2db4-4e57-40b2-9293-8691f71495bd\") " pod="openstack/nova-cell0-db-create-c8pd5" Mar 19 12:29:19.059469 master-0 kubenswrapper[29612]: I0319 
12:29:19.059407 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6ksst"] Mar 19 12:29:19.197554 master-0 kubenswrapper[29612]: I0319 12:29:19.197440 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c8pd5" Mar 19 12:29:20.291889 master-0 kubenswrapper[29612]: W0319 12:29:20.290676 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82b120a0_2c7e_43de_982d_0edc209ebafa.slice/crio-6400bf0e9a0d90a70ff38cf462480ba170ea25415ae8060174c55de3a6eb7775 WatchSource:0}: Error finding container 6400bf0e9a0d90a70ff38cf462480ba170ea25415ae8060174c55de3a6eb7775: Status 404 returned error can't find the container with id 6400bf0e9a0d90a70ff38cf462480ba170ea25415ae8060174c55de3a6eb7775 Mar 19 12:29:20.301800 master-0 kubenswrapper[29612]: I0319 12:29:20.301732 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkccq\" (UniqueName: \"kubernetes.io/projected/0dff3f9d-f585-4561-a85e-51a6ba900d33-kube-api-access-lkccq\") pod \"nova-cell1-db-create-6ksst\" (UID: \"0dff3f9d-f585-4561-a85e-51a6ba900d33\") " pod="openstack/nova-cell1-db-create-6ksst" Mar 19 12:29:20.305991 master-0 kubenswrapper[29612]: I0319 12:29:20.305401 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-97d484c8-dtsbv"] Mar 19 12:29:20.306608 master-0 kubenswrapper[29612]: I0319 12:29:20.306544 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnx66\" (UniqueName: \"kubernetes.io/projected/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-kube-api-access-cnx66\") pod \"nova-cell0-8d17-account-create-update-twzr4\" (UID: \"275eb2ad-4f11-47c4-93dd-b07ba6ef648e\") " pod="openstack/nova-cell0-8d17-account-create-update-twzr4" Mar 19 12:29:20.353019 master-0 kubenswrapper[29612]: I0319 12:29:20.352853 29612 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6ksst" Mar 19 12:29:20.364684 master-0 kubenswrapper[29612]: I0319 12:29:20.360467 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mncvw"] Mar 19 12:29:20.606941 master-0 kubenswrapper[29612]: I0319 12:29:20.606889 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8d17-account-create-update-twzr4" Mar 19 12:29:20.625665 master-0 kubenswrapper[29612]: I0319 12:29:20.625595 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mncvw" event={"ID":"82b120a0-2c7e-43de-982d-0edc209ebafa","Type":"ContainerStarted","Data":"6400bf0e9a0d90a70ff38cf462480ba170ea25415ae8060174c55de3a6eb7775"} Mar 19 12:29:20.627490 master-0 kubenswrapper[29612]: I0319 12:29:20.627440 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-97d484c8-dtsbv" event={"ID":"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3","Type":"ContainerStarted","Data":"3c44683d7ad62f6c6640660dcf9f4819eaf5493d1dc4f2ad2fc8f1edc02b3a7d"} Mar 19 12:29:21.263145 master-0 kubenswrapper[29612]: I0319 12:29:21.262902 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b4b84764-w6p2r"] Mar 19 12:29:21.282728 master-0 kubenswrapper[29612]: I0319 12:29:21.281412 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-facf-account-create-update-f5254"] Mar 19 12:29:21.284015 master-0 kubenswrapper[29612]: I0319 12:29:21.283273 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-facf-account-create-update-f5254" Mar 19 12:29:21.285519 master-0 kubenswrapper[29612]: I0319 12:29:21.285465 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 19 12:29:21.305308 master-0 kubenswrapper[29612]: I0319 12:29:21.305247 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-c8pd5"] Mar 19 12:29:21.353569 master-0 kubenswrapper[29612]: I0319 12:29:21.353524 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6ksst"] Mar 19 12:29:21.373425 master-0 kubenswrapper[29612]: I0319 12:29:21.373357 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-operator-scripts\") pod \"nova-api-facf-account-create-update-f5254\" (UID: \"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8\") " pod="openstack/nova-api-facf-account-create-update-f5254" Mar 19 12:29:21.373656 master-0 kubenswrapper[29612]: I0319 12:29:21.373601 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ksxr\" (UniqueName: \"kubernetes.io/projected/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-kube-api-access-4ksxr\") pod \"nova-api-facf-account-create-update-f5254\" (UID: \"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8\") " pod="openstack/nova-api-facf-account-create-update-f5254" Mar 19 12:29:21.415442 master-0 kubenswrapper[29612]: I0319 12:29:21.414406 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-facf-account-create-update-f5254"] Mar 19 12:29:21.452546 master-0 kubenswrapper[29612]: I0319 12:29:21.452479 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5b4b84764-w6p2r"] Mar 19 12:29:21.476290 master-0 kubenswrapper[29612]: I0319 12:29:21.476227 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4ksxr\" (UniqueName: \"kubernetes.io/projected/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-kube-api-access-4ksxr\") pod \"nova-api-facf-account-create-update-f5254\" (UID: \"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8\") " pod="openstack/nova-api-facf-account-create-update-f5254" Mar 19 12:29:21.476493 master-0 kubenswrapper[29612]: I0319 12:29:21.476463 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-operator-scripts\") pod \"nova-api-facf-account-create-update-f5254\" (UID: \"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8\") " pod="openstack/nova-api-facf-account-create-update-f5254" Mar 19 12:29:21.477476 master-0 kubenswrapper[29612]: I0319 12:29:21.477444 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-operator-scripts\") pod \"nova-api-facf-account-create-update-f5254\" (UID: \"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8\") " pod="openstack/nova-api-facf-account-create-update-f5254" Mar 19 12:29:21.524151 master-0 kubenswrapper[29612]: I0319 12:29:21.524094 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ksxr\" (UniqueName: \"kubernetes.io/projected/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-kube-api-access-4ksxr\") pod \"nova-api-facf-account-create-update-f5254\" (UID: \"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8\") " pod="openstack/nova-api-facf-account-create-update-f5254" Mar 19 12:29:21.641137 master-0 kubenswrapper[29612]: I0319 12:29:21.641090 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-97d484c8-dtsbv" event={"ID":"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3","Type":"ContainerStarted","Data":"4b491768532eed3b5f712ab016d03a5a8176a27d021f7cea78769f761bf8deaf"} Mar 19 12:29:21.642795 master-0 
kubenswrapper[29612]: I0319 12:29:21.642776 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mncvw" event={"ID":"82b120a0-2c7e-43de-982d-0edc209ebafa","Type":"ContainerStarted","Data":"d450d01262459a2f9f687bea1cbd3a45bacd3a7695b6dd2345aeeda304fbba22"} Mar 19 12:29:21.669155 master-0 kubenswrapper[29612]: I0319 12:29:21.669115 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-facf-account-create-update-f5254" Mar 19 12:29:21.772056 master-0 kubenswrapper[29612]: I0319 12:29:21.771836 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7e01-account-create-update-9bxmr"] Mar 19 12:29:21.773225 master-0 kubenswrapper[29612]: I0319 12:29:21.773201 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" Mar 19 12:29:21.777054 master-0 kubenswrapper[29612]: I0319 12:29:21.777005 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 19 12:29:21.888424 master-0 kubenswrapper[29612]: I0319 12:29:21.888197 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dd055fb-058b-4a5b-ba29-b376e81cada7-operator-scripts\") pod \"nova-cell1-7e01-account-create-update-9bxmr\" (UID: \"2dd055fb-058b-4a5b-ba29-b376e81cada7\") " pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" Mar 19 12:29:21.888424 master-0 kubenswrapper[29612]: I0319 12:29:21.888311 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs928\" (UniqueName: \"kubernetes.io/projected/2dd055fb-058b-4a5b-ba29-b376e81cada7-kube-api-access-qs928\") pod \"nova-cell1-7e01-account-create-update-9bxmr\" (UID: \"2dd055fb-058b-4a5b-ba29-b376e81cada7\") " 
pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" Mar 19 12:29:21.991187 master-0 kubenswrapper[29612]: I0319 12:29:21.990704 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs928\" (UniqueName: \"kubernetes.io/projected/2dd055fb-058b-4a5b-ba29-b376e81cada7-kube-api-access-qs928\") pod \"nova-cell1-7e01-account-create-update-9bxmr\" (UID: \"2dd055fb-058b-4a5b-ba29-b376e81cada7\") " pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" Mar 19 12:29:21.991187 master-0 kubenswrapper[29612]: I0319 12:29:21.990922 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dd055fb-058b-4a5b-ba29-b376e81cada7-operator-scripts\") pod \"nova-cell1-7e01-account-create-update-9bxmr\" (UID: \"2dd055fb-058b-4a5b-ba29-b376e81cada7\") " pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" Mar 19 12:29:21.991724 master-0 kubenswrapper[29612]: I0319 12:29:21.991662 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dd055fb-058b-4a5b-ba29-b376e81cada7-operator-scripts\") pod \"nova-cell1-7e01-account-create-update-9bxmr\" (UID: \"2dd055fb-058b-4a5b-ba29-b376e81cada7\") " pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" Mar 19 12:29:22.227871 master-0 kubenswrapper[29612]: I0319 12:29:22.227773 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7e01-account-create-update-9bxmr"] Mar 19 12:29:22.408235 master-0 kubenswrapper[29612]: I0319 12:29:22.407407 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs928\" (UniqueName: \"kubernetes.io/projected/2dd055fb-058b-4a5b-ba29-b376e81cada7-kube-api-access-qs928\") pod \"nova-cell1-7e01-account-create-update-9bxmr\" (UID: \"2dd055fb-058b-4a5b-ba29-b376e81cada7\") " 
pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" Mar 19 12:29:22.584371 master-0 kubenswrapper[29612]: I0319 12:29:22.583766 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-6d6786d9f7-6fz2d"] Mar 19 12:29:22.584371 master-0 kubenswrapper[29612]: I0319 12:29:22.584112 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-6d6786d9f7-6fz2d" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="ironic-api-log" containerID="cri-o://436b9e409e3c03f1804e33525028cdf07ca820feb1e1199bc639f5f9b01f3e38" gracePeriod=60 Mar 19 12:29:22.625080 master-0 kubenswrapper[29612]: I0319 12:29:22.624969 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-mncvw" podStartSLOduration=6.6249473 podStartE2EDuration="6.6249473s" podCreationTimestamp="2026-03-19 12:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:22.607967327 +0000 UTC m=+1169.922136348" watchObservedRunningTime="2026-03-19 12:29:22.6249473 +0000 UTC m=+1169.939116321" Mar 19 12:29:22.692759 master-0 kubenswrapper[29612]: I0319 12:29:22.690919 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" Mar 19 12:29:22.928004 master-0 kubenswrapper[29612]: I0319 12:29:22.927939 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37c49776-7c55-42c9-9765-18af1ce57ce4" path="/var/lib/kubelet/pods/37c49776-7c55-42c9-9765-18af1ce57ce4/volumes" Mar 19 12:29:27.907090 master-0 kubenswrapper[29612]: I0319 12:29:27.907016 29612 scope.go:117] "RemoveContainer" containerID="b7555116d7d97d5e39ab90913282f9e6d09ee7bc33bd088a6b4197346688856f" Mar 19 12:29:28.176120 master-0 kubenswrapper[29612]: W0319 12:29:28.176073 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57ef2db4_4e57_40b2_9293_8691f71495bd.slice/crio-7eeaac0af73dd3a85ce7c90bc53367517d5660bf3801ca5c17278fdbdf64066a WatchSource:0}: Error finding container 7eeaac0af73dd3a85ce7c90bc53367517d5660bf3801ca5c17278fdbdf64066a: Status 404 returned error can't find the container with id 7eeaac0af73dd3a85ce7c90bc53367517d5660bf3801ca5c17278fdbdf64066a Mar 19 12:29:28.799063 master-0 kubenswrapper[29612]: I0319 12:29:28.786727 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6ksst" event={"ID":"0dff3f9d-f585-4561-a85e-51a6ba900d33","Type":"ContainerStarted","Data":"00b75a2f4a51f44b2c0a93e7e4005e5b061f6b8790dd9b3e4c88ecb7c8562bd9"} Mar 19 12:29:28.799063 master-0 kubenswrapper[29612]: I0319 12:29:28.789345 29612 generic.go:334] "Generic (PLEG): container finished" podID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" containerID="e49fb06ec118d446a356cd045b9abb7f736c5d0b5fb9674ec53fc16be70f30d4" exitCode=0 Mar 19 12:29:28.799063 master-0 kubenswrapper[29612]: I0319 12:29:28.789398 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bcfcb874-5q7wl" 
event={"ID":"966779b3-45a4-4c95-8dfb-5e2b42f42cc1","Type":"ContainerDied","Data":"e49fb06ec118d446a356cd045b9abb7f736c5d0b5fb9674ec53fc16be70f30d4"} Mar 19 12:29:28.799063 master-0 kubenswrapper[29612]: I0319 12:29:28.792289 29612 generic.go:334] "Generic (PLEG): container finished" podID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerID="436b9e409e3c03f1804e33525028cdf07ca820feb1e1199bc639f5f9b01f3e38" exitCode=143 Mar 19 12:29:28.799063 master-0 kubenswrapper[29612]: I0319 12:29:28.792361 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6d6786d9f7-6fz2d" event={"ID":"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2","Type":"ContainerDied","Data":"436b9e409e3c03f1804e33525028cdf07ca820feb1e1199bc639f5f9b01f3e38"} Mar 19 12:29:28.799063 master-0 kubenswrapper[29612]: I0319 12:29:28.794190 29612 generic.go:334] "Generic (PLEG): container finished" podID="82b120a0-2c7e-43de-982d-0edc209ebafa" containerID="d450d01262459a2f9f687bea1cbd3a45bacd3a7695b6dd2345aeeda304fbba22" exitCode=0 Mar 19 12:29:28.799063 master-0 kubenswrapper[29612]: I0319 12:29:28.794237 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mncvw" event={"ID":"82b120a0-2c7e-43de-982d-0edc209ebafa","Type":"ContainerDied","Data":"d450d01262459a2f9f687bea1cbd3a45bacd3a7695b6dd2345aeeda304fbba22"} Mar 19 12:29:28.799063 master-0 kubenswrapper[29612]: I0319 12:29:28.797283 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c8pd5" event={"ID":"57ef2db4-4e57-40b2-9293-8691f71495bd","Type":"ContainerStarted","Data":"7eeaac0af73dd3a85ce7c90bc53367517d5660bf3801ca5c17278fdbdf64066a"} Mar 19 12:29:30.621271 master-0 kubenswrapper[29612]: I0319 12:29:30.620067 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-6d6786d9f7-6fz2d" Mar 19 12:29:30.638063 master-0 kubenswrapper[29612]: I0319 12:29:30.637684 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-scripts\") pod \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " Mar 19 12:29:30.638063 master-0 kubenswrapper[29612]: I0319 12:29:30.637765 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-etc-podinfo\") pod \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " Mar 19 12:29:30.638063 master-0 kubenswrapper[29612]: I0319 12:29:30.637827 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data\") pod \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " Mar 19 12:29:30.638063 master-0 kubenswrapper[29612]: I0319 12:29:30.637865 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-combined-ca-bundle\") pod \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " Mar 19 12:29:30.638063 master-0 kubenswrapper[29612]: I0319 12:29:30.637911 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-logs\") pod \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " Mar 19 12:29:30.638063 master-0 kubenswrapper[29612]: I0319 12:29:30.637995 29612 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-custom\") pod \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " Mar 19 12:29:30.638591 master-0 kubenswrapper[29612]: I0319 12:29:30.638092 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-merged\") pod \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " Mar 19 12:29:30.638591 master-0 kubenswrapper[29612]: I0319 12:29:30.638242 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mrjg\" (UniqueName: \"kubernetes.io/projected/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-kube-api-access-4mrjg\") pod \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " Mar 19 12:29:30.639844 master-0 kubenswrapper[29612]: I0319 12:29:30.638865 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:29:30.641272 master-0 kubenswrapper[29612]: I0319 12:29:30.640769 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-logs" (OuterVolumeSpecName: "logs") pod "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" (UID: "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:29:30.645133 master-0 kubenswrapper[29612]: I0319 12:29:30.644218 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:30.648167 master-0 kubenswrapper[29612]: I0319 12:29:30.646995 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" (UID: "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:29:30.648250 master-0 kubenswrapper[29612]: I0319 12:29:30.648179 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mncvw" Mar 19 12:29:30.657316 master-0 kubenswrapper[29612]: I0319 12:29:30.650830 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-kube-api-access-4mrjg" (OuterVolumeSpecName: "kube-api-access-4mrjg") pod "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" (UID: "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2"). InnerVolumeSpecName "kube-api-access-4mrjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:30.746473 master-0 kubenswrapper[29612]: I0319 12:29:30.745334 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b120a0-2c7e-43de-982d-0edc209ebafa-operator-scripts\") pod \"82b120a0-2c7e-43de-982d-0edc209ebafa\" (UID: \"82b120a0-2c7e-43de-982d-0edc209ebafa\") " Mar 19 12:29:30.746473 master-0 kubenswrapper[29612]: I0319 12:29:30.745413 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-config\") pod \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " Mar 19 12:29:30.746473 master-0 kubenswrapper[29612]: I0319 12:29:30.745672 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls7cg\" (UniqueName: \"kubernetes.io/projected/82b120a0-2c7e-43de-982d-0edc209ebafa-kube-api-access-ls7cg\") pod \"82b120a0-2c7e-43de-982d-0edc209ebafa\" (UID: \"82b120a0-2c7e-43de-982d-0edc209ebafa\") " Mar 19 12:29:30.746473 master-0 kubenswrapper[29612]: I0319 12:29:30.745731 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-httpd-config\") pod \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " Mar 19 12:29:30.746473 master-0 kubenswrapper[29612]: I0319 12:29:30.745787 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-ovndb-tls-certs\") pod \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " Mar 19 12:29:30.747030 master-0 kubenswrapper[29612]: I0319 12:29:30.746574 29612 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-combined-ca-bundle\") pod \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " Mar 19 12:29:30.747030 master-0 kubenswrapper[29612]: I0319 12:29:30.746749 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vwtx\" (UniqueName: \"kubernetes.io/projected/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-kube-api-access-7vwtx\") pod \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\" (UID: \"966779b3-45a4-4c95-8dfb-5e2b42f42cc1\") " Mar 19 12:29:30.747573 master-0 kubenswrapper[29612]: I0319 12:29:30.747494 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82b120a0-2c7e-43de-982d-0edc209ebafa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82b120a0-2c7e-43de-982d-0edc209ebafa" (UID: "82b120a0-2c7e-43de-982d-0edc209ebafa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:30.747676 master-0 kubenswrapper[29612]: I0319 12:29:30.747629 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mrjg\" (UniqueName: \"kubernetes.io/projected/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-kube-api-access-4mrjg\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:30.747735 master-0 kubenswrapper[29612]: I0319 12:29:30.747676 29612 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:30.783313 master-0 kubenswrapper[29612]: I0319 12:29:30.783192 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" (UID: "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 12:29:30.801981 master-0 kubenswrapper[29612]: I0319 12:29:30.801855 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-facf-account-create-update-f5254"] Mar 19 12:29:30.827495 master-0 kubenswrapper[29612]: I0319 12:29:30.827380 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-scripts" (OuterVolumeSpecName: "scripts") pod "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" (UID: "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:30.828712 master-0 kubenswrapper[29612]: I0319 12:29:30.828245 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" (UID: "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:30.843984 master-0 kubenswrapper[29612]: I0319 12:29:30.842627 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "966779b3-45a4-4c95-8dfb-5e2b42f42cc1" (UID: "966779b3-45a4-4c95-8dfb-5e2b42f42cc1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:30.843984 master-0 kubenswrapper[29612]: I0319 12:29:30.843351 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-kube-api-access-7vwtx" (OuterVolumeSpecName: "kube-api-access-7vwtx") pod "966779b3-45a4-4c95-8dfb-5e2b42f42cc1" (UID: "966779b3-45a4-4c95-8dfb-5e2b42f42cc1"). InnerVolumeSpecName "kube-api-access-7vwtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:30.850968 master-0 kubenswrapper[29612]: I0319 12:29:30.849991 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b120a0-2c7e-43de-982d-0edc209ebafa-kube-api-access-ls7cg" (OuterVolumeSpecName: "kube-api-access-ls7cg") pod "82b120a0-2c7e-43de-982d-0edc209ebafa" (UID: "82b120a0-2c7e-43de-982d-0edc209ebafa"). InnerVolumeSpecName "kube-api-access-ls7cg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:30.854055 master-0 kubenswrapper[29612]: I0319 12:29:30.853412 29612 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:30.854055 master-0 kubenswrapper[29612]: I0319 12:29:30.853468 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vwtx\" (UniqueName: \"kubernetes.io/projected/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-kube-api-access-7vwtx\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:30.854055 master-0 kubenswrapper[29612]: I0319 12:29:30.853482 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b120a0-2c7e-43de-982d-0edc209ebafa-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:30.854055 master-0 kubenswrapper[29612]: I0319 12:29:30.853495 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls7cg\" (UniqueName: \"kubernetes.io/projected/82b120a0-2c7e-43de-982d-0edc209ebafa-kube-api-access-ls7cg\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:30.854055 master-0 kubenswrapper[29612]: I0319 12:29:30.853507 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:30.854055 master-0 kubenswrapper[29612]: I0319 12:29:30.853519 29612 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-httpd-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:30.854055 master-0 kubenswrapper[29612]: I0319 12:29:30.853534 29612 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:30.879643 master-0 kubenswrapper[29612]: I0319 12:29:30.879347 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77bcfcb874-5q7wl" event={"ID":"966779b3-45a4-4c95-8dfb-5e2b42f42cc1","Type":"ContainerDied","Data":"8589e552d744763a24469e1f2016f838bf656fe359ba14b852b568a5e90b6304"} Mar 19 12:29:30.879643 master-0 kubenswrapper[29612]: I0319 12:29:30.879584 29612 scope.go:117] "RemoveContainer" containerID="4902263a87f1037c2d5f01561807c0efbf819f76dac185c72d8320772c9ec2a0" Mar 19 12:29:30.884688 master-0 kubenswrapper[29612]: I0319 12:29:30.884005 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77bcfcb874-5q7wl" Mar 19 12:29:30.901183 master-0 kubenswrapper[29612]: I0319 12:29:30.901104 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7e01-account-create-update-9bxmr"] Mar 19 12:29:30.929156 master-0 kubenswrapper[29612]: I0319 12:29:30.929022 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-6d6786d9f7-6fz2d" Mar 19 12:29:30.938760 master-0 kubenswrapper[29612]: I0319 12:29:30.938715 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mncvw" Mar 19 12:29:30.992886 master-0 kubenswrapper[29612]: I0319 12:29:30.988347 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8d17-account-create-update-twzr4"] Mar 19 12:29:30.992886 master-0 kubenswrapper[29612]: I0319 12:29:30.988404 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6d6786d9f7-6fz2d" event={"ID":"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2","Type":"ContainerDied","Data":"3c152436dd8bdc133c4952dffd6d22a6532bea2c8564b73f1b33a7d104acb73b"} Mar 19 12:29:30.992886 master-0 kubenswrapper[29612]: I0319 12:29:30.988439 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mncvw" event={"ID":"82b120a0-2c7e-43de-982d-0edc209ebafa","Type":"ContainerDied","Data":"6400bf0e9a0d90a70ff38cf462480ba170ea25415ae8060174c55de3a6eb7775"} Mar 19 12:29:30.992886 master-0 kubenswrapper[29612]: I0319 12:29:30.988459 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6400bf0e9a0d90a70ff38cf462480ba170ea25415ae8060174c55de3a6eb7775" Mar 19 12:29:31.345673 master-0 kubenswrapper[29612]: I0319 12:29:31.345577 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data" (OuterVolumeSpecName: "config-data") pod "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" (UID: "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:31.370264 master-0 kubenswrapper[29612]: I0319 12:29:31.369895 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" (UID: "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:31.410368 master-0 kubenswrapper[29612]: I0319 12:29:31.409801 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-combined-ca-bundle\") pod \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\" (UID: \"2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2\") " Mar 19 12:29:31.412689 master-0 kubenswrapper[29612]: I0319 12:29:31.410762 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:31.412689 master-0 kubenswrapper[29612]: W0319 12:29:31.410876 29612 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2/volumes/kubernetes.io~secret/combined-ca-bundle Mar 19 12:29:31.412689 master-0 kubenswrapper[29612]: I0319 12:29:31.410890 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" (UID: "2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:31.412977 master-0 kubenswrapper[29612]: I0319 12:29:31.412872 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-config" (OuterVolumeSpecName: "config") pod "966779b3-45a4-4c95-8dfb-5e2b42f42cc1" (UID: "966779b3-45a4-4c95-8dfb-5e2b42f42cc1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:31.419688 master-0 kubenswrapper[29612]: I0319 12:29:31.418832 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "966779b3-45a4-4c95-8dfb-5e2b42f42cc1" (UID: "966779b3-45a4-4c95-8dfb-5e2b42f42cc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:31.472672 master-0 kubenswrapper[29612]: I0319 12:29:31.470394 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "966779b3-45a4-4c95-8dfb-5e2b42f42cc1" (UID: "966779b3-45a4-4c95-8dfb-5e2b42f42cc1"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:31.499607 master-0 kubenswrapper[29612]: I0319 12:29:31.496280 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"] Mar 19 12:29:31.499607 master-0 kubenswrapper[29612]: I0319 12:29:31.496515 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-3abc0-default-external-api-0" podUID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" containerName="glance-log" containerID="cri-o://bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188" gracePeriod=30 Mar 19 12:29:31.499607 master-0 kubenswrapper[29612]: I0319 12:29:31.496838 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-3abc0-default-external-api-0" podUID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" containerName="glance-httpd" containerID="cri-o://5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d" gracePeriod=30 Mar 19 12:29:31.516752 master-0 kubenswrapper[29612]: I0319 12:29:31.514852 29612 
reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:31.516752 master-0 kubenswrapper[29612]: I0319 12:29:31.514916 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:31.516752 master-0 kubenswrapper[29612]: I0319 12:29:31.514930 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:31.516752 master-0 kubenswrapper[29612]: I0319 12:29:31.514942 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/966779b3-45a4-4c95-8dfb-5e2b42f42cc1-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:31.646722 master-0 kubenswrapper[29612]: I0319 12:29:31.643784 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-6d6786d9f7-6fz2d"] Mar 19 12:29:31.671163 master-0 kubenswrapper[29612]: I0319 12:29:31.671075 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-6d6786d9f7-6fz2d"] Mar 19 12:29:31.687673 master-0 kubenswrapper[29612]: I0319 12:29:31.684766 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77bcfcb874-5q7wl"] Mar 19 12:29:31.700796 master-0 kubenswrapper[29612]: I0319 12:29:31.697968 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77bcfcb874-5q7wl"] Mar 19 12:29:31.968914 master-0 kubenswrapper[29612]: I0319 12:29:31.968472 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-rnrf5" 
event={"ID":"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a","Type":"ContainerStarted","Data":"baccff7a238191e9b496ab9f8ea06189590b4e026b36e8d87e59ff8ec07742c2"} Mar 19 12:29:31.971934 master-0 kubenswrapper[29612]: I0319 12:29:31.971460 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6ksst" event={"ID":"0dff3f9d-f585-4561-a85e-51a6ba900d33","Type":"ContainerStarted","Data":"e4b9a2b00f0d183bbaab95fb3a2754c533cc12558689257201027ab17a96e5a6"} Mar 19 12:29:31.975257 master-0 kubenswrapper[29612]: I0319 12:29:31.975211 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-97d484c8-dtsbv" event={"ID":"9836e2b7-486b-4c01-ad9a-f63c9a0f08e3","Type":"ContainerStarted","Data":"5f4258c0318df81eb291abfc8432e8710e93fa90972edd02ed9884da53aa6601"} Mar 19 12:29:31.976560 master-0 kubenswrapper[29612]: I0319 12:29:31.976080 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:31.976560 master-0 kubenswrapper[29612]: I0319 12:29:31.976318 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:31.984574 master-0 kubenswrapper[29612]: I0319 12:29:31.984509 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-97d484c8-dtsbv" podUID="9836e2b7-486b-4c01-ad9a-f63c9a0f08e3" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 12:29:31.985756 master-0 kubenswrapper[29612]: I0319 12:29:31.985716 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" event={"ID":"7257edc5-e7bf-41f7-8c7d-97b4d4d83b64","Type":"ContainerStarted","Data":"ecc9eef703e7b3c20e8d71e5143b6c2c995f4bddfb15d0f496c181d1132b2789"} Mar 19 12:29:31.986738 master-0 kubenswrapper[29612]: I0319 12:29:31.986629 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" Mar 19 12:29:32.003889 master-0 kubenswrapper[29612]: I0319 12:29:32.003776 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-rnrf5" podStartSLOduration=5.880541727 podStartE2EDuration="19.003749154s" podCreationTimestamp="2026-03-19 12:29:13 +0000 UTC" firstStartedPulling="2026-03-19 12:29:17.068136614 +0000 UTC m=+1164.382305635" lastFinishedPulling="2026-03-19 12:29:30.191344041 +0000 UTC m=+1177.505513062" observedRunningTime="2026-03-19 12:29:31.999591996 +0000 UTC m=+1179.313761007" watchObservedRunningTime="2026-03-19 12:29:32.003749154 +0000 UTC m=+1179.317918185" Mar 19 12:29:32.004119 master-0 kubenswrapper[29612]: I0319 12:29:32.003968 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0011ebb3-0c56-4a97-a5cd-ef8560ff56e7","Type":"ContainerStarted","Data":"be6d2b34ebfd292bed5aa1488863cf8ad38fd3594cfcbc58e570bcf7983ba780"} Mar 19 12:29:32.019659 master-0 kubenswrapper[29612]: I0319 12:29:32.018243 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c8pd5" event={"ID":"57ef2db4-4e57-40b2-9293-8691f71495bd","Type":"ContainerStarted","Data":"8966eec733035c3e385a44f3161f24e0d3a36a3cbc079c5cfe5d131649e9dd21"} Mar 19 12:29:32.172726 master-0 kubenswrapper[29612]: I0319 12:29:32.172653 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-77b78c7bbd-2n7mk" Mar 19 12:29:32.185078 master-0 kubenswrapper[29612]: I0319 12:29:32.185023 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-77b78c7bbd-2n7mk" Mar 19 12:29:32.243993 master-0 kubenswrapper[29612]: I0319 12:29:32.243791 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-97d484c8-dtsbv" podStartSLOduration=17.243746614 podStartE2EDuration="17.243746614s" 
podCreationTimestamp="2026-03-19 12:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:32.238090553 +0000 UTC m=+1179.552259574" watchObservedRunningTime="2026-03-19 12:29:32.243746614 +0000 UTC m=+1179.557915655" Mar 19 12:29:32.438555 master-0 kubenswrapper[29612]: I0319 12:29:32.438467 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-6ksst" podStartSLOduration=15.438444486 podStartE2EDuration="15.438444486s" podCreationTimestamp="2026-03-19 12:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:32.410594675 +0000 UTC m=+1179.724763696" watchObservedRunningTime="2026-03-19 12:29:32.438444486 +0000 UTC m=+1179.752613517" Mar 19 12:29:32.645204 master-0 kubenswrapper[29612]: I0319 12:29:32.645056 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.780690914 podStartE2EDuration="24.645037177s" podCreationTimestamp="2026-03-19 12:29:08 +0000 UTC" firstStartedPulling="2026-03-19 12:29:09.306613198 +0000 UTC m=+1156.620782219" lastFinishedPulling="2026-03-19 12:29:30.170959461 +0000 UTC m=+1177.485128482" observedRunningTime="2026-03-19 12:29:32.637222765 +0000 UTC m=+1179.951391796" watchObservedRunningTime="2026-03-19 12:29:32.645037177 +0000 UTC m=+1179.959206198" Mar 19 12:29:32.920252 master-0 kubenswrapper[29612]: I0319 12:29:32.920192 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" path="/var/lib/kubelet/pods/2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2/volumes" Mar 19 12:29:32.921698 master-0 kubenswrapper[29612]: I0319 12:29:32.921682 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" 
path="/var/lib/kubelet/pods/966779b3-45a4-4c95-8dfb-5e2b42f42cc1/volumes" Mar 19 12:29:33.040120 master-0 kubenswrapper[29612]: I0319 12:29:33.039883 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-97d484c8-dtsbv" podUID="9836e2b7-486b-4c01-ad9a-f63c9a0f08e3" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 12:29:33.360022 master-0 kubenswrapper[29612]: I0319 12:29:33.359737 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-c8pd5" podStartSLOduration=17.359719565 podStartE2EDuration="17.359719565s" podCreationTimestamp="2026-03-19 12:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:33.358259684 +0000 UTC m=+1180.672428705" watchObservedRunningTime="2026-03-19 12:29:33.359719565 +0000 UTC m=+1180.673888576" Mar 19 12:29:34.005662 master-0 kubenswrapper[29612]: I0319 12:29:34.005507 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-66dc6f86db-rpr5q"] Mar 19 12:29:34.006316 master-0 kubenswrapper[29612]: I0319 12:29:34.005878 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-66dc6f86db-rpr5q" podUID="fcee29fd-64dc-454f-830c-d3bd3280a07a" containerName="placement-log" containerID="cri-o://c98228c403bfc9077719614a45ae48650c4de1acac68426a85b93ba6d1e3a33c" gracePeriod=30 Mar 19 12:29:34.006316 master-0 kubenswrapper[29612]: I0319 12:29:34.006015 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-66dc6f86db-rpr5q" podUID="fcee29fd-64dc-454f-830c-d3bd3280a07a" containerName="placement-api" containerID="cri-o://3e50e7160b7df932582542cb9ca9ede83092cfd6f875e1b4dd83c766c098bee1" gracePeriod=30 Mar 19 12:29:34.086681 master-0 kubenswrapper[29612]: I0319 12:29:34.085865 29612 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/swift-proxy-97d484c8-dtsbv" podUID="9836e2b7-486b-4c01-ad9a-f63c9a0f08e3" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 12:29:34.143233 master-0 kubenswrapper[29612]: I0319 12:29:34.142898 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-c6b86788c-dtf24" Mar 19 12:29:35.893627 master-0 kubenswrapper[29612]: I0319 12:29:35.893452 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"] Mar 19 12:29:35.896933 master-0 kubenswrapper[29612]: I0319 12:29:35.893803 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-3abc0-default-internal-api-0" podUID="535fe30f-e49c-4659-a342-978cf8d49d25" containerName="glance-log" containerID="cri-o://354b6849b85b9548735949c583ed8f9ab43a9f9009c0afe1b9d88606bace8530" gracePeriod=30 Mar 19 12:29:35.896933 master-0 kubenswrapper[29612]: I0319 12:29:35.894411 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-3abc0-default-internal-api-0" podUID="535fe30f-e49c-4659-a342-978cf8d49d25" containerName="glance-httpd" containerID="cri-o://b6b9739daa5abc26704e140533520ffa7ea17ad8ac5c63a19d594116782b5d8a" gracePeriod=30 Mar 19 12:29:36.662665 master-0 kubenswrapper[29612]: I0319 12:29:36.662589 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:36.668662 master-0 kubenswrapper[29612]: I0319 12:29:36.668404 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-97d484c8-dtsbv" Mar 19 12:29:37.491005 master-0 kubenswrapper[29612]: W0319 12:29:37.490950 29612 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dd055fb_058b_4a5b_ba29_b376e81cada7.slice/crio-56fc6adaa9c0d55105f7c0ad8844e3ef81600dc5e6601ae73a61f4fc49034287 WatchSource:0}: Error finding container 56fc6adaa9c0d55105f7c0ad8844e3ef81600dc5e6601ae73a61f4fc49034287: Status 404 returned error can't find the container with id 56fc6adaa9c0d55105f7c0ad8844e3ef81600dc5e6601ae73a61f4fc49034287 Mar 19 12:29:37.567860 master-0 kubenswrapper[29612]: I0319 12:29:37.561838 29612 scope.go:117] "RemoveContainer" containerID="e49fb06ec118d446a356cd045b9abb7f736c5d0b5fb9674ec53fc16be70f30d4" Mar 19 12:29:37.713527 master-0 kubenswrapper[29612]: I0319 12:29:37.713476 29612 scope.go:117] "RemoveContainer" containerID="f3871672910c47bbaee8a2f86c2df16fa38270d9b2a678ea64d4250375975575" Mar 19 12:29:37.819942 master-0 kubenswrapper[29612]: I0319 12:29:37.819365 29612 scope.go:117] "RemoveContainer" containerID="436b9e409e3c03f1804e33525028cdf07ca820feb1e1199bc639f5f9b01f3e38" Mar 19 12:29:37.867099 master-0 kubenswrapper[29612]: I0319 12:29:37.863655 29612 scope.go:117] "RemoveContainer" containerID="6d9a0090ce988131f4855c5b4248cb5067f742ec7925c02c435a1445857223af" Mar 19 12:29:37.878036 master-0 kubenswrapper[29612]: E0319 12:29:37.876929 29612 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcee29fd_64dc_454f_830c_d3bd3280a07a.slice/crio-3e50e7160b7df932582542cb9ca9ede83092cfd6f875e1b4dd83c766c098bee1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcee29fd_64dc_454f_830c_d3bd3280a07a.slice/crio-conmon-3e50e7160b7df932582542cb9ca9ede83092cfd6f875e1b4dd83c766c098bee1.scope\": RecentStats: unable to find data in memory cache]" Mar 19 12:29:37.878036 master-0 kubenswrapper[29612]: E0319 12:29:37.877101 29612 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcee29fd_64dc_454f_830c_d3bd3280a07a.slice/crio-conmon-3e50e7160b7df932582542cb9ca9ede83092cfd6f875e1b4dd83c766c098bee1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcee29fd_64dc_454f_830c_d3bd3280a07a.slice/crio-3e50e7160b7df932582542cb9ca9ede83092cfd6f875e1b4dd83c766c098bee1.scope\": RecentStats: unable to find data in memory cache]" Mar 19 12:29:38.224588 master-0 kubenswrapper[29612]: I0319 12:29:38.221990 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" event={"ID":"2dd055fb-058b-4a5b-ba29-b376e81cada7","Type":"ContainerStarted","Data":"4ce9390bef8fe34b52179e6581d3377693475ba86b4c9202dfe8ff755dee5624"} Mar 19 12:29:38.224588 master-0 kubenswrapper[29612]: I0319 12:29:38.222054 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" event={"ID":"2dd055fb-058b-4a5b-ba29-b376e81cada7","Type":"ContainerStarted","Data":"56fc6adaa9c0d55105f7c0ad8844e3ef81600dc5e6601ae73a61f4fc49034287"} Mar 19 12:29:38.237334 master-0 kubenswrapper[29612]: I0319 12:29:38.236386 29612 generic.go:334] "Generic (PLEG): container finished" podID="fcee29fd-64dc-454f-830c-d3bd3280a07a" containerID="3e50e7160b7df932582542cb9ca9ede83092cfd6f875e1b4dd83c766c098bee1" exitCode=0 Mar 19 12:29:38.237334 master-0 kubenswrapper[29612]: I0319 12:29:38.236428 29612 generic.go:334] "Generic (PLEG): container finished" podID="fcee29fd-64dc-454f-830c-d3bd3280a07a" containerID="c98228c403bfc9077719614a45ae48650c4de1acac68426a85b93ba6d1e3a33c" exitCode=143 Mar 19 12:29:38.237334 master-0 kubenswrapper[29612]: I0319 12:29:38.236472 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66dc6f86db-rpr5q" 
event={"ID":"fcee29fd-64dc-454f-830c-d3bd3280a07a","Type":"ContainerDied","Data":"3e50e7160b7df932582542cb9ca9ede83092cfd6f875e1b4dd83c766c098bee1"} Mar 19 12:29:38.237334 master-0 kubenswrapper[29612]: I0319 12:29:38.236498 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66dc6f86db-rpr5q" event={"ID":"fcee29fd-64dc-454f-830c-d3bd3280a07a","Type":"ContainerDied","Data":"c98228c403bfc9077719614a45ae48650c4de1acac68426a85b93ba6d1e3a33c"} Mar 19 12:29:38.248737 master-0 kubenswrapper[29612]: I0319 12:29:38.247179 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" podStartSLOduration=17.247134469 podStartE2EDuration="17.247134469s" podCreationTimestamp="2026-03-19 12:29:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:38.245997607 +0000 UTC m=+1185.560166618" watchObservedRunningTime="2026-03-19 12:29:38.247134469 +0000 UTC m=+1185.561303510" Mar 19 12:29:38.248737 master-0 kubenswrapper[29612]: I0319 12:29:38.248150 29612 generic.go:334] "Generic (PLEG): container finished" podID="9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" containerID="baccff7a238191e9b496ab9f8ea06189590b4e026b36e8d87e59ff8ec07742c2" exitCode=0 Mar 19 12:29:38.248737 master-0 kubenswrapper[29612]: I0319 12:29:38.248217 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-rnrf5" event={"ID":"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a","Type":"ContainerDied","Data":"baccff7a238191e9b496ab9f8ea06189590b4e026b36e8d87e59ff8ec07742c2"} Mar 19 12:29:38.277575 master-0 kubenswrapper[29612]: I0319 12:29:38.276792 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:29:38.277575 master-0 kubenswrapper[29612]: I0319 12:29:38.276923 29612 generic.go:334] "Generic (PLEG): container finished" podID="535fe30f-e49c-4659-a342-978cf8d49d25" containerID="354b6849b85b9548735949c583ed8f9ab43a9f9009c0afe1b9d88606bace8530" exitCode=143 Mar 19 12:29:38.277575 master-0 kubenswrapper[29612]: I0319 12:29:38.276985 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-internal-api-0" event={"ID":"535fe30f-e49c-4659-a342-978cf8d49d25","Type":"ContainerDied","Data":"354b6849b85b9548735949c583ed8f9ab43a9f9009c0afe1b9d88606bace8530"} Mar 19 12:29:38.299856 master-0 kubenswrapper[29612]: I0319 12:29:38.294161 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8d17-account-create-update-twzr4" event={"ID":"275eb2ad-4f11-47c4-93dd-b07ba6ef648e","Type":"ContainerStarted","Data":"6ec4a994a578bdb11b174bef7c0f6b9a964818981d668e7dd954b2e0e98fbbd0"} Mar 19 12:29:38.299856 master-0 kubenswrapper[29612]: I0319 12:29:38.294215 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8d17-account-create-update-twzr4" event={"ID":"275eb2ad-4f11-47c4-93dd-b07ba6ef648e","Type":"ContainerStarted","Data":"29baba9fdcd42511d57f346b87e8c468c892273d395dead0ddd28ca7e8bfe580"} Mar 19 12:29:38.303205 master-0 kubenswrapper[29612]: I0319 12:29:38.303148 29612 generic.go:334] "Generic (PLEG): container finished" podID="57ef2db4-4e57-40b2-9293-8691f71495bd" containerID="8966eec733035c3e385a44f3161f24e0d3a36a3cbc079c5cfe5d131649e9dd21" exitCode=0 Mar 19 12:29:38.303317 master-0 kubenswrapper[29612]: I0319 12:29:38.303208 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c8pd5" event={"ID":"57ef2db4-4e57-40b2-9293-8691f71495bd","Type":"ContainerDied","Data":"8966eec733035c3e385a44f3161f24e0d3a36a3cbc079c5cfe5d131649e9dd21"} Mar 19 12:29:38.304771 master-0 
kubenswrapper[29612]: I0319 12:29:38.304730 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-facf-account-create-update-f5254" event={"ID":"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8","Type":"ContainerStarted","Data":"e310f07b97bdde59a52c5d747219a39bdd939f67d5d91a2a7f77e5026b8edfab"} Mar 19 12:29:38.304771 master-0 kubenswrapper[29612]: I0319 12:29:38.304759 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-facf-account-create-update-f5254" event={"ID":"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8","Type":"ContainerStarted","Data":"fb4a5318a4b5160477218fb46f72896f4308428d0ac9537c1081c8a13301cf8c"} Mar 19 12:29:38.312225 master-0 kubenswrapper[29612]: I0319 12:29:38.310042 29612 generic.go:334] "Generic (PLEG): container finished" podID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" containerID="5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d" exitCode=0 Mar 19 12:29:38.312225 master-0 kubenswrapper[29612]: I0319 12:29:38.310072 29612 generic.go:334] "Generic (PLEG): container finished" podID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" containerID="bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188" exitCode=143 Mar 19 12:29:38.312225 master-0 kubenswrapper[29612]: I0319 12:29:38.310103 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"f8a1e1d5-9787-4dfc-96fe-367b409dcf64","Type":"ContainerDied","Data":"5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d"} Mar 19 12:29:38.312225 master-0 kubenswrapper[29612]: I0319 12:29:38.310123 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"f8a1e1d5-9787-4dfc-96fe-367b409dcf64","Type":"ContainerDied","Data":"bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188"} Mar 19 12:29:38.312225 master-0 kubenswrapper[29612]: I0319 12:29:38.310138 29612 scope.go:117] "RemoveContainer" 
containerID="5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d" Mar 19 12:29:38.312225 master-0 kubenswrapper[29612]: I0319 12:29:38.310231 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:29:38.356685 master-0 kubenswrapper[29612]: I0319 12:29:38.353834 29612 generic.go:334] "Generic (PLEG): container finished" podID="0dff3f9d-f585-4561-a85e-51a6ba900d33" containerID="e4b9a2b00f0d183bbaab95fb3a2754c533cc12558689257201027ab17a96e5a6" exitCode=0 Mar 19 12:29:38.356685 master-0 kubenswrapper[29612]: I0319 12:29:38.353886 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6ksst" event={"ID":"0dff3f9d-f585-4561-a85e-51a6ba900d33","Type":"ContainerDied","Data":"e4b9a2b00f0d183bbaab95fb3a2754c533cc12558689257201027ab17a96e5a6"} Mar 19 12:29:38.393787 master-0 kubenswrapper[29612]: I0319 12:29:38.391169 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-facf-account-create-update-f5254" podStartSLOduration=18.391148131 podStartE2EDuration="18.391148131s" podCreationTimestamp="2026-03-19 12:29:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:38.387778206 +0000 UTC m=+1185.701947227" watchObservedRunningTime="2026-03-19 12:29:38.391148131 +0000 UTC m=+1185.705317152" Mar 19 12:29:38.402980 master-0 kubenswrapper[29612]: I0319 12:29:38.402512 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-66dc6f86db-rpr5q" Mar 19 12:29:38.431946 master-0 kubenswrapper[29612]: I0319 12:29:38.431261 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s82wt\" (UniqueName: \"kubernetes.io/projected/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-kube-api-access-s82wt\") pod \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " Mar 19 12:29:38.431946 master-0 kubenswrapper[29612]: I0319 12:29:38.431423 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-combined-ca-bundle\") pod \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " Mar 19 12:29:38.431946 master-0 kubenswrapper[29612]: I0319 12:29:38.431494 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-config-data\") pod \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " Mar 19 12:29:38.431946 master-0 kubenswrapper[29612]: I0319 12:29:38.431535 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-logs\") pod \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " Mar 19 12:29:38.435723 master-0 kubenswrapper[29612]: I0319 12:29:38.434537 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-scripts\") pod \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " Mar 19 12:29:38.435723 master-0 kubenswrapper[29612]: I0319 12:29:38.434905 29612 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-public-tls-certs\") pod \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " Mar 19 12:29:38.435723 master-0 kubenswrapper[29612]: I0319 12:29:38.434940 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-httpd-run\") pod \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " Mar 19 12:29:38.435723 master-0 kubenswrapper[29612]: I0319 12:29:38.435124 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\" (UID: \"f8a1e1d5-9787-4dfc-96fe-367b409dcf64\") " Mar 19 12:29:38.438093 master-0 kubenswrapper[29612]: I0319 12:29:38.437508 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-logs" (OuterVolumeSpecName: "logs") pod "f8a1e1d5-9787-4dfc-96fe-367b409dcf64" (UID: "f8a1e1d5-9787-4dfc-96fe-367b409dcf64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:29:38.440733 master-0 kubenswrapper[29612]: I0319 12:29:38.440089 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f8a1e1d5-9787-4dfc-96fe-367b409dcf64" (UID: "f8a1e1d5-9787-4dfc-96fe-367b409dcf64"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:29:38.451022 master-0 kubenswrapper[29612]: I0319 12:29:38.450945 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-scripts" (OuterVolumeSpecName: "scripts") pod "f8a1e1d5-9787-4dfc-96fe-367b409dcf64" (UID: "f8a1e1d5-9787-4dfc-96fe-367b409dcf64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:38.456972 master-0 kubenswrapper[29612]: I0319 12:29:38.456907 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-kube-api-access-s82wt" (OuterVolumeSpecName: "kube-api-access-s82wt") pod "f8a1e1d5-9787-4dfc-96fe-367b409dcf64" (UID: "f8a1e1d5-9787-4dfc-96fe-367b409dcf64"). InnerVolumeSpecName "kube-api-access-s82wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:38.458258 master-0 kubenswrapper[29612]: I0319 12:29:38.458001 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-8d17-account-create-update-twzr4" podStartSLOduration=21.457982791 podStartE2EDuration="21.457982791s" podCreationTimestamp="2026-03-19 12:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:38.443828918 +0000 UTC m=+1185.757997949" watchObservedRunningTime="2026-03-19 12:29:38.457982791 +0000 UTC m=+1185.772151812" Mar 19 12:29:38.513227 master-0 kubenswrapper[29612]: I0319 12:29:38.512211 29612 scope.go:117] "RemoveContainer" containerID="bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188" Mar 19 12:29:38.516621 master-0 kubenswrapper[29612]: I0319 12:29:38.516563 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81" (OuterVolumeSpecName: 
"glance") pod "f8a1e1d5-9787-4dfc-96fe-367b409dcf64" (UID: "f8a1e1d5-9787-4dfc-96fe-367b409dcf64"). InnerVolumeSpecName "pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 12:29:38.531090 master-0 kubenswrapper[29612]: I0319 12:29:38.530982 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8a1e1d5-9787-4dfc-96fe-367b409dcf64" (UID: "f8a1e1d5-9787-4dfc-96fe-367b409dcf64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.538307 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9r5x\" (UniqueName: \"kubernetes.io/projected/fcee29fd-64dc-454f-830c-d3bd3280a07a-kube-api-access-f9r5x\") pod \"fcee29fd-64dc-454f-830c-d3bd3280a07a\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.538464 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-internal-tls-certs\") pod \"fcee29fd-64dc-454f-830c-d3bd3280a07a\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.538750 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-config-data\") pod \"fcee29fd-64dc-454f-830c-d3bd3280a07a\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.538783 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fcee29fd-64dc-454f-830c-d3bd3280a07a-logs\") pod \"fcee29fd-64dc-454f-830c-d3bd3280a07a\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.538829 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-public-tls-certs\") pod \"fcee29fd-64dc-454f-830c-d3bd3280a07a\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.538918 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-combined-ca-bundle\") pod \"fcee29fd-64dc-454f-830c-d3bd3280a07a\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.538988 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-scripts\") pod \"fcee29fd-64dc-454f-830c-d3bd3280a07a\" (UID: \"fcee29fd-64dc-454f-830c-d3bd3280a07a\") " Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.540207 29612 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.540283 29612 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") on node \"master-0\" " Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.540299 29612 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-s82wt\" (UniqueName: \"kubernetes.io/projected/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-kube-api-access-s82wt\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.540309 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.540318 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.545906 master-0 kubenswrapper[29612]: I0319 12:29:38.540328 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.562250 master-0 kubenswrapper[29612]: I0319 12:29:38.546179 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcee29fd-64dc-454f-830c-d3bd3280a07a-logs" (OuterVolumeSpecName: "logs") pod "fcee29fd-64dc-454f-830c-d3bd3280a07a" (UID: "fcee29fd-64dc-454f-830c-d3bd3280a07a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:29:38.562250 master-0 kubenswrapper[29612]: I0319 12:29:38.553113 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-scripts" (OuterVolumeSpecName: "scripts") pod "fcee29fd-64dc-454f-830c-d3bd3280a07a" (UID: "fcee29fd-64dc-454f-830c-d3bd3280a07a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:29:38.573432 master-0 kubenswrapper[29612]: I0319 12:29:38.573370 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcee29fd-64dc-454f-830c-d3bd3280a07a-kube-api-access-f9r5x" (OuterVolumeSpecName: "kube-api-access-f9r5x") pod "fcee29fd-64dc-454f-830c-d3bd3280a07a" (UID: "fcee29fd-64dc-454f-830c-d3bd3280a07a"). InnerVolumeSpecName "kube-api-access-f9r5x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:29:38.597613 master-0 kubenswrapper[29612]: I0319 12:29:38.597563 29612 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 19 12:29:38.597862 master-0 kubenswrapper[29612]: I0319 12:29:38.597822 29612 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf" (UniqueName: "kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81") on node "master-0"
Mar 19 12:29:38.603602 master-0 kubenswrapper[29612]: I0319 12:29:38.603556 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-config-data" (OuterVolumeSpecName: "config-data") pod "f8a1e1d5-9787-4dfc-96fe-367b409dcf64" (UID: "f8a1e1d5-9787-4dfc-96fe-367b409dcf64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:29:38.604948 master-0 kubenswrapper[29612]: I0319 12:29:38.604912 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f8a1e1d5-9787-4dfc-96fe-367b409dcf64" (UID: "f8a1e1d5-9787-4dfc-96fe-367b409dcf64"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:29:38.636028 master-0 kubenswrapper[29612]: I0319 12:29:38.635436 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcee29fd-64dc-454f-830c-d3bd3280a07a" (UID: "fcee29fd-64dc-454f-830c-d3bd3280a07a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:29:38.642952 master-0 kubenswrapper[29612]: I0319 12:29:38.642888 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-config-data" (OuterVolumeSpecName: "config-data") pod "fcee29fd-64dc-454f-830c-d3bd3280a07a" (UID: "fcee29fd-64dc-454f-830c-d3bd3280a07a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:29:38.644209 master-0 kubenswrapper[29612]: I0319 12:29:38.644157 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.644209 master-0 kubenswrapper[29612]: I0319 12:29:38.644194 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9r5x\" (UniqueName: \"kubernetes.io/projected/fcee29fd-64dc-454f-830c-d3bd3280a07a-kube-api-access-f9r5x\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.644335 master-0 kubenswrapper[29612]: I0319 12:29:38.644211 29612 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.644335 master-0 kubenswrapper[29612]: I0319 12:29:38.644228 29612 reconciler_common.go:293] "Volume detached for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.644335 master-0 kubenswrapper[29612]: I0319 12:29:38.644241 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.644335 master-0 kubenswrapper[29612]: I0319 12:29:38.644252 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcee29fd-64dc-454f-830c-d3bd3280a07a-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.644335 master-0 kubenswrapper[29612]: I0319 12:29:38.644263 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8a1e1d5-9787-4dfc-96fe-367b409dcf64-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.644335 master-0 kubenswrapper[29612]: I0319 12:29:38.644274 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.675904 master-0 kubenswrapper[29612]: I0319 12:29:38.675794 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fcee29fd-64dc-454f-830c-d3bd3280a07a" (UID: "fcee29fd-64dc-454f-830c-d3bd3280a07a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:29:38.691361 master-0 kubenswrapper[29612]: I0319 12:29:38.691230 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fcee29fd-64dc-454f-830c-d3bd3280a07a" (UID: "fcee29fd-64dc-454f-830c-d3bd3280a07a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:29:38.746145 master-0 kubenswrapper[29612]: I0319 12:29:38.746092 29612 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-internal-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.746145 master-0 kubenswrapper[29612]: I0319 12:29:38.746142 29612 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcee29fd-64dc-454f-830c-d3bd3280a07a-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:38.954308 master-0 kubenswrapper[29612]: I0319 12:29:38.954065 29612 scope.go:117] "RemoveContainer" containerID="5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d"
Mar 19 12:29:38.955462 master-0 kubenswrapper[29612]: E0319 12:29:38.955395 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d\": container with ID starting with 5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d not found: ID does not exist" containerID="5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d"
Mar 19 12:29:38.955562 master-0 kubenswrapper[29612]: I0319 12:29:38.955463 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d"} err="failed to get container status \"5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d\": rpc error: code = NotFound desc = could not find container \"5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d\": container with ID starting with 5d3b1de04861666ed6f2ce60f81c025fad0b6e8f41ce3501b15bc622a2aba49d not found: ID does not exist"
Mar 19 12:29:38.955562 master-0 kubenswrapper[29612]: I0319 12:29:38.955502 29612 scope.go:117] "RemoveContainer" containerID="bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188"
Mar 19 12:29:38.957868 master-0 kubenswrapper[29612]: E0319 12:29:38.957802 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188\": container with ID starting with bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188 not found: ID does not exist" containerID="bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188"
Mar 19 12:29:38.957970 master-0 kubenswrapper[29612]: I0319 12:29:38.957863 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188"} err="failed to get container status \"bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188\": rpc error: code = NotFound desc = could not find container \"bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188\": container with ID starting with bbeacd1b6896b8791b6e280a11dc843ec84a7274044b4104d3c1cc511e434188 not found: ID does not exist"
Mar 19 12:29:38.989384 master-0 kubenswrapper[29612]: I0319 12:29:38.989301 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"]
Mar 19 12:29:38.999110 master-0 kubenswrapper[29612]: I0319 12:29:38.999019 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"]
Mar 19 12:29:39.031268 master-0 kubenswrapper[29612]: I0319 12:29:39.031174 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3abc0-default-external-api-0"]
Mar 19 12:29:39.031684 master-0 kubenswrapper[29612]: E0319 12:29:39.031606 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="ironic-api"
Mar 19 12:29:39.031684 master-0 kubenswrapper[29612]: I0319 12:29:39.031659 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="ironic-api"
Mar 19 12:29:39.031684 master-0 kubenswrapper[29612]: E0319 12:29:39.031675 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="ironic-api"
Mar 19 12:29:39.031684 master-0 kubenswrapper[29612]: I0319 12:29:39.031681 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="ironic-api"
Mar 19 12:29:39.031684 master-0 kubenswrapper[29612]: E0319 12:29:39.031691 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b120a0-2c7e-43de-982d-0edc209ebafa" containerName="mariadb-database-create"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.031699 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b120a0-2c7e-43de-982d-0edc209ebafa" containerName="mariadb-database-create"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: E0319 12:29:39.031727 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcee29fd-64dc-454f-830c-d3bd3280a07a" containerName="placement-log"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.031733 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcee29fd-64dc-454f-830c-d3bd3280a07a" containerName="placement-log"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: E0319 12:29:39.031758 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="ironic-api-log"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.031764 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="ironic-api-log"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: E0319 12:29:39.031775 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" containerName="neutron-api"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.031781 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" containerName="neutron-api"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: E0319 12:29:39.031793 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" containerName="neutron-httpd"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.031799 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" containerName="neutron-httpd"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: E0319 12:29:39.031808 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="init"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.031814 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="init"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: E0319 12:29:39.031829 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcee29fd-64dc-454f-830c-d3bd3280a07a" containerName="placement-api"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.031835 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcee29fd-64dc-454f-830c-d3bd3280a07a" containerName="placement-api"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: E0319 12:29:39.031843 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" containerName="glance-log"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.031848 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" containerName="glance-log"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: E0319 12:29:39.031860 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" containerName="glance-httpd"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.031866 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" containerName="glance-httpd"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.032076 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="ironic-api"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.032087 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" containerName="neutron-httpd"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.032096 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcee29fd-64dc-454f-830c-d3bd3280a07a" containerName="placement-log"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.032108 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="ironic-api-log"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.032118 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" containerName="glance-log"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.032131 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="966779b3-45a4-4c95-8dfb-5e2b42f42cc1" containerName="neutron-api"
Mar 19 12:29:39.032101 master-0 kubenswrapper[29612]: I0319 12:29:39.032144 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f4fdfe1-2708-47d8-a72f-1a9f4a03caa2" containerName="ironic-api"
Mar 19 12:29:39.033226 master-0 kubenswrapper[29612]: I0319 12:29:39.032154 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" containerName="glance-httpd"
Mar 19 12:29:39.033226 master-0 kubenswrapper[29612]: I0319 12:29:39.032168 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b120a0-2c7e-43de-982d-0edc209ebafa" containerName="mariadb-database-create"
Mar 19 12:29:39.033226 master-0 kubenswrapper[29612]: I0319 12:29:39.032178 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcee29fd-64dc-454f-830c-d3bd3280a07a" containerName="placement-api"
Mar 19 12:29:39.034327 master-0 kubenswrapper[29612]: I0319 12:29:39.033416 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.055745 master-0 kubenswrapper[29612]: I0319 12:29:39.055692 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 19 12:29:39.056056 master-0 kubenswrapper[29612]: I0319 12:29:39.056033 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3abc0-default-external-config-data"
Mar 19 12:29:39.080736 master-0 kubenswrapper[29612]: I0319 12:29:39.079004 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"]
Mar 19 12:29:39.159753 master-0 kubenswrapper[29612]: I0319 12:29:39.159680 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a449777-70b3-4d96-a11a-65e87c128c98-logs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.159994 master-0 kubenswrapper[29612]: I0319 12:29:39.159849 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-public-tls-certs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.159994 master-0 kubenswrapper[29612]: I0319 12:29:39.159958 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-combined-ca-bundle\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.160077 master-0 kubenswrapper[29612]: I0319 12:29:39.160004 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a449777-70b3-4d96-a11a-65e87c128c98-httpd-run\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.160383 master-0 kubenswrapper[29612]: I0319 12:29:39.160353 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgqll\" (UniqueName: \"kubernetes.io/projected/2a449777-70b3-4d96-a11a-65e87c128c98-kube-api-access-rgqll\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.160435 master-0 kubenswrapper[29612]: I0319 12:29:39.160406 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.160489 master-0 kubenswrapper[29612]: I0319 12:29:39.160472 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-config-data\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.160596 master-0 kubenswrapper[29612]: I0319 12:29:39.160578 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-scripts\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.263226 master-0 kubenswrapper[29612]: I0319 12:29:39.263133 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-scripts\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.263527 master-0 kubenswrapper[29612]: I0319 12:29:39.263253 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a449777-70b3-4d96-a11a-65e87c128c98-logs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.263527 master-0 kubenswrapper[29612]: I0319 12:29:39.263313 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-public-tls-certs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.263527 master-0 kubenswrapper[29612]: I0319 12:29:39.263350 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-combined-ca-bundle\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.263527 master-0 kubenswrapper[29612]: I0319 12:29:39.263380 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a449777-70b3-4d96-a11a-65e87c128c98-httpd-run\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.263882 master-0 kubenswrapper[29612]: I0319 12:29:39.263596 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgqll\" (UniqueName: \"kubernetes.io/projected/2a449777-70b3-4d96-a11a-65e87c128c98-kube-api-access-rgqll\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.263882 master-0 kubenswrapper[29612]: I0319 12:29:39.263689 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.263882 master-0 kubenswrapper[29612]: I0319 12:29:39.263742 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-config-data\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.265743 master-0 kubenswrapper[29612]: I0319 12:29:39.264824 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a449777-70b3-4d96-a11a-65e87c128c98-logs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.265743 master-0 kubenswrapper[29612]: I0319 12:29:39.264875 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2a449777-70b3-4d96-a11a-65e87c128c98-httpd-run\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.276319 master-0 kubenswrapper[29612]: I0319 12:29:39.274135 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 12:29:39.276319 master-0 kubenswrapper[29612]: I0319 12:29:39.274216 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/9a90be70eddb52d75ac120c1599a27499c5565c88a66c71224351799b28411da/globalmount\"" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.276319 master-0 kubenswrapper[29612]: I0319 12:29:39.274394 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-combined-ca-bundle\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.279267 master-0 kubenswrapper[29612]: I0319 12:29:39.278610 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-config-data\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.280833 master-0 kubenswrapper[29612]: I0319 12:29:39.280788 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-scripts\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.281132 master-0 kubenswrapper[29612]: I0319 12:29:39.281079 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a449777-70b3-4d96-a11a-65e87c128c98-public-tls-certs\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.302747 master-0 kubenswrapper[29612]: I0319 12:29:39.302420 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgqll\" (UniqueName: \"kubernetes.io/projected/2a449777-70b3-4d96-a11a-65e87c128c98-kube-api-access-rgqll\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:39.388492 master-0 kubenswrapper[29612]: I0319 12:29:39.386042 29612 generic.go:334] "Generic (PLEG): container finished" podID="2dd055fb-058b-4a5b-ba29-b376e81cada7" containerID="4ce9390bef8fe34b52179e6581d3377693475ba86b4c9202dfe8ff755dee5624" exitCode=0
Mar 19 12:29:39.388492 master-0 kubenswrapper[29612]: I0319 12:29:39.386150 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" event={"ID":"2dd055fb-058b-4a5b-ba29-b376e81cada7","Type":"ContainerDied","Data":"4ce9390bef8fe34b52179e6581d3377693475ba86b4c9202dfe8ff755dee5624"}
Mar 19 12:29:39.396684 master-0 kubenswrapper[29612]: I0319 12:29:39.392338 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-66dc6f86db-rpr5q" event={"ID":"fcee29fd-64dc-454f-830c-d3bd3280a07a","Type":"ContainerDied","Data":"6f0614e06d08d9c42c114c7926bd1a843f22224314cb8afef323681aefb94623"}
Mar 19 12:29:39.396684 master-0 kubenswrapper[29612]: I0319 12:29:39.392409 29612 scope.go:117] "RemoveContainer" containerID="3e50e7160b7df932582542cb9ca9ede83092cfd6f875e1b4dd83c766c098bee1"
Mar 19 12:29:39.396684 master-0 kubenswrapper[29612]: I0319 12:29:39.392574 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-66dc6f86db-rpr5q"
Mar 19 12:29:39.419659 master-0 kubenswrapper[29612]: I0319 12:29:39.415483 29612 generic.go:334] "Generic (PLEG): container finished" podID="f1ca4272-7dc1-43fc-b90e-eddd21dadbd8" containerID="e310f07b97bdde59a52c5d747219a39bdd939f67d5d91a2a7f77e5026b8edfab" exitCode=0
Mar 19 12:29:39.419659 master-0 kubenswrapper[29612]: I0319 12:29:39.415577 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-facf-account-create-update-f5254" event={"ID":"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8","Type":"ContainerDied","Data":"e310f07b97bdde59a52c5d747219a39bdd939f67d5d91a2a7f77e5026b8edfab"}
Mar 19 12:29:39.449659 master-0 kubenswrapper[29612]: I0319 12:29:39.448278 29612 generic.go:334] "Generic (PLEG): container finished" podID="535fe30f-e49c-4659-a342-978cf8d49d25" containerID="b6b9739daa5abc26704e140533520ffa7ea17ad8ac5c63a19d594116782b5d8a" exitCode=0
Mar 19 12:29:39.449659 master-0 kubenswrapper[29612]: I0319 12:29:39.448393 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-internal-api-0" event={"ID":"535fe30f-e49c-4659-a342-978cf8d49d25","Type":"ContainerDied","Data":"b6b9739daa5abc26704e140533520ffa7ea17ad8ac5c63a19d594116782b5d8a"}
Mar 19 12:29:39.453093 master-0 kubenswrapper[29612]: I0319 12:29:39.452146 29612 scope.go:117] "RemoveContainer" containerID="c98228c403bfc9077719614a45ae48650c4de1acac68426a85b93ba6d1e3a33c"
Mar 19 12:29:39.472615 master-0 kubenswrapper[29612]: I0319 12:29:39.467859 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-66dc6f86db-rpr5q"]
Mar 19 12:29:39.472615 master-0 kubenswrapper[29612]: I0319 12:29:39.468033 29612 generic.go:334] "Generic (PLEG): container finished" podID="275eb2ad-4f11-47c4-93dd-b07ba6ef648e" containerID="6ec4a994a578bdb11b174bef7c0f6b9a964818981d668e7dd954b2e0e98fbbd0" exitCode=0
Mar 19 12:29:39.472615 master-0 kubenswrapper[29612]: I0319 12:29:39.468085 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8d17-account-create-update-twzr4" event={"ID":"275eb2ad-4f11-47c4-93dd-b07ba6ef648e","Type":"ContainerDied","Data":"6ec4a994a578bdb11b174bef7c0f6b9a964818981d668e7dd954b2e0e98fbbd0"}
Mar 19 12:29:39.518751 master-0 kubenswrapper[29612]: I0319 12:29:39.485840 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-66dc6f86db-rpr5q"]
Mar 19 12:29:39.518751 master-0 kubenswrapper[29612]: I0319 12:29:39.490422 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"8ed03592-eec9-4173-8239-217452bc4579","Type":"ContainerStarted","Data":"53c6f7445a82045913711cabdb41fe05eac5b40d67f51a362c2178d0c3090eb5"}
Mar 19 12:29:39.921032 master-0 kubenswrapper[29612]: I0319 12:29:39.920983 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:40.002296 master-0 kubenswrapper[29612]: I0319 12:29:40.002228 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-combined-ca-bundle\") pod \"535fe30f-e49c-4659-a342-978cf8d49d25\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") "
Mar 19 12:29:40.002527 master-0 kubenswrapper[29612]: I0319 12:29:40.002361 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-logs\") pod \"535fe30f-e49c-4659-a342-978cf8d49d25\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") "
Mar 19 12:29:40.002527 master-0 kubenswrapper[29612]: I0319 12:29:40.002408 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-internal-tls-certs\") pod \"535fe30f-e49c-4659-a342-978cf8d49d25\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") "
Mar 19 12:29:40.003398 master-0 kubenswrapper[29612]: I0319 12:29:40.003010 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"535fe30f-e49c-4659-a342-978cf8d49d25\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") "
Mar 19 12:29:40.003398 master-0 kubenswrapper[29612]: I0319 12:29:40.003294 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-scripts\") pod \"535fe30f-e49c-4659-a342-978cf8d49d25\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") "
Mar 19 12:29:40.011220 master-0 kubenswrapper[29612]: I0319 12:29:40.004265 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-config-data\") pod \"535fe30f-e49c-4659-a342-978cf8d49d25\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") "
Mar 19 12:29:40.011220 master-0 kubenswrapper[29612]: I0319 12:29:40.004324 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-httpd-run\") pod \"535fe30f-e49c-4659-a342-978cf8d49d25\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") "
Mar 19 12:29:40.011220 master-0 kubenswrapper[29612]: I0319 12:29:40.004347 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpfrh\" (UniqueName: \"kubernetes.io/projected/535fe30f-e49c-4659-a342-978cf8d49d25-kube-api-access-cpfrh\") pod \"535fe30f-e49c-4659-a342-978cf8d49d25\" (UID: \"535fe30f-e49c-4659-a342-978cf8d49d25\") "
Mar 19 12:29:40.011220 master-0 kubenswrapper[29612]: I0319 12:29:40.004722 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-logs" (OuterVolumeSpecName: "logs") pod "535fe30f-e49c-4659-a342-978cf8d49d25" (UID: "535fe30f-e49c-4659-a342-978cf8d49d25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:29:40.011220 master-0 kubenswrapper[29612]: I0319 12:29:40.005588 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-logs\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:40.011220 master-0 kubenswrapper[29612]: I0319 12:29:40.006599 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "535fe30f-e49c-4659-a342-978cf8d49d25" (UID: "535fe30f-e49c-4659-a342-978cf8d49d25"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:29:40.011220 master-0 kubenswrapper[29612]: I0319 12:29:40.008705 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-scripts" (OuterVolumeSpecName: "scripts") pod "535fe30f-e49c-4659-a342-978cf8d49d25" (UID: "535fe30f-e49c-4659-a342-978cf8d49d25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:29:40.011220 master-0 kubenswrapper[29612]: I0319 12:29:40.008595 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/535fe30f-e49c-4659-a342-978cf8d49d25-kube-api-access-cpfrh" (OuterVolumeSpecName: "kube-api-access-cpfrh") pod "535fe30f-e49c-4659-a342-978cf8d49d25" (UID: "535fe30f-e49c-4659-a342-978cf8d49d25"). InnerVolumeSpecName "kube-api-access-cpfrh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:29:40.039859 master-0 kubenswrapper[29612]: I0319 12:29:40.039789 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "535fe30f-e49c-4659-a342-978cf8d49d25" (UID: "535fe30f-e49c-4659-a342-978cf8d49d25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:29:40.096354 master-0 kubenswrapper[29612]: I0319 12:29:40.096143 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-config-data" (OuterVolumeSpecName: "config-data") pod "535fe30f-e49c-4659-a342-978cf8d49d25" (UID: "535fe30f-e49c-4659-a342-978cf8d49d25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:29:40.108849 master-0 kubenswrapper[29612]: I0319 12:29:40.108102 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:40.108849 master-0 kubenswrapper[29612]: I0319 12:29:40.108148 29612 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/535fe30f-e49c-4659-a342-978cf8d49d25-httpd-run\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:40.108849 master-0 kubenswrapper[29612]: I0319 12:29:40.108164 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpfrh\" (UniqueName: \"kubernetes.io/projected/535fe30f-e49c-4659-a342-978cf8d49d25-kube-api-access-cpfrh\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:40.108849 master-0 kubenswrapper[29612]: I0319 12:29:40.108400 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.108849 master-0 kubenswrapper[29612]: I0319 12:29:40.108412 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.122697 master-0 kubenswrapper[29612]: I0319 12:29:40.122002 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "535fe30f-e49c-4659-a342-978cf8d49d25" (UID: "535fe30f-e49c-4659-a342-978cf8d49d25"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:40.220831 master-0 kubenswrapper[29612]: I0319 12:29:40.219442 29612 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/535fe30f-e49c-4659-a342-978cf8d49d25-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.221750 master-0 kubenswrapper[29612]: I0319 12:29:40.221415 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6ksst" Mar 19 12:29:40.236324 master-0 kubenswrapper[29612]: I0319 12:29:40.236250 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.321054 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dff3f9d-f585-4561-a85e-51a6ba900d33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0dff3f9d-f585-4561-a85e-51a6ba900d33" (UID: "0dff3f9d-f585-4561-a85e-51a6ba900d33"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.323192 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dff3f9d-f585-4561-a85e-51a6ba900d33-operator-scripts\") pod \"0dff3f9d-f585-4561-a85e-51a6ba900d33\" (UID: \"0dff3f9d-f585-4561-a85e-51a6ba900d33\") " Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.323378 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-combined-ca-bundle\") pod \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.323424 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-etc-podinfo\") pod \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.323592 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-scripts\") pod \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.323668 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sflxl\" (UniqueName: \"kubernetes.io/projected/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-kube-api-access-sflxl\") pod \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.323721 29612 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkccq\" (UniqueName: \"kubernetes.io/projected/0dff3f9d-f585-4561-a85e-51a6ba900d33-kube-api-access-lkccq\") pod \"0dff3f9d-f585-4561-a85e-51a6ba900d33\" (UID: \"0dff3f9d-f585-4561-a85e-51a6ba900d33\") " Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.323774 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.323804 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-config\") pod \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.324000 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic\") pod \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\" (UID: \"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a\") " Mar 19 12:29:40.325659 master-0 kubenswrapper[29612]: I0319 12:29:40.325247 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dff3f9d-f585-4561-a85e-51a6ba900d33-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.328366 master-0 kubenswrapper[29612]: I0319 12:29:40.326537 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod 
"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" (UID: "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:29:40.328366 master-0 kubenswrapper[29612]: I0319 12:29:40.326771 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" (UID: "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:29:40.348729 master-0 kubenswrapper[29612]: I0319 12:29:40.338039 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-scripts" (OuterVolumeSpecName: "scripts") pod "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" (UID: "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:40.348729 master-0 kubenswrapper[29612]: I0319 12:29:40.339396 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" (UID: "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 12:29:40.348729 master-0 kubenswrapper[29612]: I0319 12:29:40.344160 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-kube-api-access-sflxl" (OuterVolumeSpecName: "kube-api-access-sflxl") pod "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" (UID: "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a"). 
InnerVolumeSpecName "kube-api-access-sflxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:40.348729 master-0 kubenswrapper[29612]: I0319 12:29:40.346938 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dff3f9d-f585-4561-a85e-51a6ba900d33-kube-api-access-lkccq" (OuterVolumeSpecName: "kube-api-access-lkccq") pod "0dff3f9d-f585-4561-a85e-51a6ba900d33" (UID: "0dff3f9d-f585-4561-a85e-51a6ba900d33"). InnerVolumeSpecName "kube-api-access-lkccq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:40.371567 master-0 kubenswrapper[29612]: I0319 12:29:40.360128 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" (UID: "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:40.371567 master-0 kubenswrapper[29612]: I0319 12:29:40.369743 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-config" (OuterVolumeSpecName: "config") pod "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" (UID: "9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:40.443948 master-0 kubenswrapper[29612]: I0319 12:29:40.435043 29612 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.443948 master-0 kubenswrapper[29612]: I0319 12:29:40.435082 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.443948 master-0 kubenswrapper[29612]: I0319 12:29:40.435092 29612 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.443948 master-0 kubenswrapper[29612]: I0319 12:29:40.435102 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.443948 master-0 kubenswrapper[29612]: I0319 12:29:40.435111 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sflxl\" (UniqueName: \"kubernetes.io/projected/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-kube-api-access-sflxl\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.443948 master-0 kubenswrapper[29612]: I0319 12:29:40.435122 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkccq\" (UniqueName: \"kubernetes.io/projected/0dff3f9d-f585-4561-a85e-51a6ba900d33-kube-api-access-lkccq\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.443948 master-0 kubenswrapper[29612]: I0319 12:29:40.435131 29612 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: 
\"kubernetes.io/empty-dir/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.443948 master-0 kubenswrapper[29612]: I0319 12:29:40.435141 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.448335 master-0 kubenswrapper[29612]: I0319 12:29:40.447984 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c8pd5" Mar 19 12:29:40.543674 master-0 kubenswrapper[29612]: I0319 12:29:40.543285 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8d764b8bf-7q659"] Mar 19 12:29:40.543674 master-0 kubenswrapper[29612]: I0319 12:29:40.543375 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57ef2db4-4e57-40b2-9293-8691f71495bd-operator-scripts\") pod \"57ef2db4-4e57-40b2-9293-8691f71495bd\" (UID: \"57ef2db4-4e57-40b2-9293-8691f71495bd\") " Mar 19 12:29:40.543674 master-0 kubenswrapper[29612]: I0319 12:29:40.543667 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76f6f\" (UniqueName: \"kubernetes.io/projected/57ef2db4-4e57-40b2-9293-8691f71495bd-kube-api-access-76f6f\") pod \"57ef2db4-4e57-40b2-9293-8691f71495bd\" (UID: \"57ef2db4-4e57-40b2-9293-8691f71495bd\") " Mar 19 12:29:40.549984 master-0 kubenswrapper[29612]: E0319 12:29:40.544522 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dff3f9d-f585-4561-a85e-51a6ba900d33" containerName="mariadb-database-create" Mar 19 12:29:40.549984 master-0 kubenswrapper[29612]: I0319 12:29:40.544550 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dff3f9d-f585-4561-a85e-51a6ba900d33" containerName="mariadb-database-create" Mar 19 
12:29:40.549984 master-0 kubenswrapper[29612]: E0319 12:29:40.544576 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535fe30f-e49c-4659-a342-978cf8d49d25" containerName="glance-httpd" Mar 19 12:29:40.549984 master-0 kubenswrapper[29612]: I0319 12:29:40.544584 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="535fe30f-e49c-4659-a342-978cf8d49d25" containerName="glance-httpd" Mar 19 12:29:40.549984 master-0 kubenswrapper[29612]: E0319 12:29:40.544603 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535fe30f-e49c-4659-a342-978cf8d49d25" containerName="glance-log" Mar 19 12:29:40.549984 master-0 kubenswrapper[29612]: I0319 12:29:40.544609 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="535fe30f-e49c-4659-a342-978cf8d49d25" containerName="glance-log" Mar 19 12:29:40.549984 master-0 kubenswrapper[29612]: E0319 12:29:40.544621 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ef2db4-4e57-40b2-9293-8691f71495bd" containerName="mariadb-database-create" Mar 19 12:29:40.549984 master-0 kubenswrapper[29612]: I0319 12:29:40.544627 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ef2db4-4e57-40b2-9293-8691f71495bd" containerName="mariadb-database-create" Mar 19 12:29:40.549984 master-0 kubenswrapper[29612]: E0319 12:29:40.544657 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" containerName="ironic-inspector-db-sync" Mar 19 12:29:40.549984 master-0 kubenswrapper[29612]: I0319 12:29:40.544664 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" containerName="ironic-inspector-db-sync" Mar 19 12:29:40.550472 master-0 kubenswrapper[29612]: I0319 12:29:40.550124 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57ef2db4-4e57-40b2-9293-8691f71495bd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"57ef2db4-4e57-40b2-9293-8691f71495bd" (UID: "57ef2db4-4e57-40b2-9293-8691f71495bd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:40.556785 master-0 kubenswrapper[29612]: I0319 12:29:40.551226 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="535fe30f-e49c-4659-a342-978cf8d49d25" containerName="glance-log" Mar 19 12:29:40.556785 master-0 kubenswrapper[29612]: I0319 12:29:40.551288 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="57ef2db4-4e57-40b2-9293-8691f71495bd" containerName="mariadb-database-create" Mar 19 12:29:40.556785 master-0 kubenswrapper[29612]: I0319 12:29:40.551325 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" containerName="ironic-inspector-db-sync" Mar 19 12:29:40.556785 master-0 kubenswrapper[29612]: I0319 12:29:40.551339 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="535fe30f-e49c-4659-a342-978cf8d49d25" containerName="glance-httpd" Mar 19 12:29:40.556785 master-0 kubenswrapper[29612]: I0319 12:29:40.551357 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dff3f9d-f585-4561-a85e-51a6ba900d33" containerName="mariadb-database-create" Mar 19 12:29:40.568670 master-0 kubenswrapper[29612]: I0319 12:29:40.557508 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.581713 master-0 kubenswrapper[29612]: I0319 12:29:40.577042 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ef2db4-4e57-40b2-9293-8691f71495bd-kube-api-access-76f6f" (OuterVolumeSpecName: "kube-api-access-76f6f") pod "57ef2db4-4e57-40b2-9293-8691f71495bd" (UID: "57ef2db4-4e57-40b2-9293-8691f71495bd"). InnerVolumeSpecName "kube-api-access-76f6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:40.609035 master-0 kubenswrapper[29612]: I0319 12:29:40.607703 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8d764b8bf-7q659"] Mar 19 12:29:40.610057 master-0 kubenswrapper[29612]: I0319 12:29:40.610025 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-rnrf5" event={"ID":"9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a","Type":"ContainerDied","Data":"46a4c0bc3dffa290a3b6c7a36bd06d14c97502cf7029bd1182de097d9d948fb0"} Mar 19 12:29:40.610142 master-0 kubenswrapper[29612]: I0319 12:29:40.610129 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46a4c0bc3dffa290a3b6c7a36bd06d14c97502cf7029bd1182de097d9d948fb0" Mar 19 12:29:40.610268 master-0 kubenswrapper[29612]: I0319 12:29:40.610255 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-rnrf5" Mar 19 12:29:40.651955 master-0 kubenswrapper[29612]: I0319 12:29:40.648492 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-svc\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.651955 master-0 kubenswrapper[29612]: I0319 12:29:40.648534 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-nb\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.651955 master-0 kubenswrapper[29612]: I0319 12:29:40.648620 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-sb\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.651955 master-0 kubenswrapper[29612]: I0319 12:29:40.648684 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-config\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.651955 master-0 kubenswrapper[29612]: I0319 12:29:40.648737 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvjmf\" (UniqueName: \"kubernetes.io/projected/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-kube-api-access-cvjmf\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.651955 master-0 kubenswrapper[29612]: I0319 12:29:40.648773 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-swift-storage-0\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.651955 master-0 kubenswrapper[29612]: I0319 12:29:40.648904 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57ef2db4-4e57-40b2-9293-8691f71495bd-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.651955 master-0 kubenswrapper[29612]: I0319 12:29:40.648920 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76f6f\" (UniqueName: 
\"kubernetes.io/projected/57ef2db4-4e57-40b2-9293-8691f71495bd-kube-api-access-76f6f\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:40.669644 master-0 kubenswrapper[29612]: I0319 12:29:40.669582 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 19 12:29:40.676765 master-0 kubenswrapper[29612]: I0319 12:29:40.676695 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-internal-api-0" event={"ID":"535fe30f-e49c-4659-a342-978cf8d49d25","Type":"ContainerDied","Data":"33e5f865b62f8232a156c76c5ff3d4aa2f8f3757778588e1beb58eda1424032b"} Mar 19 12:29:40.676765 master-0 kubenswrapper[29612]: I0319 12:29:40.676769 29612 scope.go:117] "RemoveContainer" containerID="b6b9739daa5abc26704e140533520ffa7ea17ad8ac5c63a19d594116782b5d8a" Mar 19 12:29:40.677116 master-0 kubenswrapper[29612]: I0319 12:29:40.677068 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 19 12:29:40.677204 master-0 kubenswrapper[29612]: I0319 12:29:40.677100 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:29:40.686818 master-0 kubenswrapper[29612]: I0319 12:29:40.686773 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 19 12:29:40.687529 master-0 kubenswrapper[29612]: I0319 12:29:40.687507 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 19 12:29:40.688673 master-0 kubenswrapper[29612]: I0319 12:29:40.688626 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 19 12:29:40.699007 master-0 kubenswrapper[29612]: I0319 12:29:40.697825 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 19 12:29:40.716824 master-0 kubenswrapper[29612]: I0319 12:29:40.715725 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6ksst" event={"ID":"0dff3f9d-f585-4561-a85e-51a6ba900d33","Type":"ContainerDied","Data":"00b75a2f4a51f44b2c0a93e7e4005e5b061f6b8790dd9b3e4c88ecb7c8562bd9"} Mar 19 12:29:40.716824 master-0 kubenswrapper[29612]: I0319 12:29:40.715772 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00b75a2f4a51f44b2c0a93e7e4005e5b061f6b8790dd9b3e4c88ecb7c8562bd9" Mar 19 12:29:40.716824 master-0 kubenswrapper[29612]: I0319 12:29:40.715830 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-6ksst" Mar 19 12:29:40.739121 master-0 kubenswrapper[29612]: I0319 12:29:40.738966 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c8pd5" event={"ID":"57ef2db4-4e57-40b2-9293-8691f71495bd","Type":"ContainerDied","Data":"7eeaac0af73dd3a85ce7c90bc53367517d5660bf3801ca5c17278fdbdf64066a"} Mar 19 12:29:40.739121 master-0 kubenswrapper[29612]: I0319 12:29:40.739017 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eeaac0af73dd3a85ce7c90bc53367517d5660bf3801ca5c17278fdbdf64066a" Mar 19 12:29:40.749664 master-0 kubenswrapper[29612]: I0319 12:29:40.746720 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c8pd5" Mar 19 12:29:40.751176 master-0 kubenswrapper[29612]: I0319 12:29:40.751120 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-scripts\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.751276 master-0 kubenswrapper[29612]: I0319 12:29:40.751201 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-sb\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.751276 master-0 kubenswrapper[29612]: I0319 12:29:40.751234 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " 
pod="openstack/ironic-inspector-0" Mar 19 12:29:40.751406 master-0 kubenswrapper[29612]: I0319 12:29:40.751300 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-config\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.751406 master-0 kubenswrapper[29612]: I0319 12:29:40.751350 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.751406 master-0 kubenswrapper[29612]: I0319 12:29:40.751397 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvjmf\" (UniqueName: \"kubernetes.io/projected/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-kube-api-access-cvjmf\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.751554 master-0 kubenswrapper[29612]: I0319 12:29:40.751444 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-swift-storage-0\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.751604 master-0 kubenswrapper[29612]: I0319 12:29:40.751579 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-svc\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " 
pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.751672 master-0 kubenswrapper[29612]: I0319 12:29:40.751607 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-nb\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.751672 master-0 kubenswrapper[29612]: I0319 12:29:40.751656 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b268a64f-8029-4f0d-92ec-de0782d81970-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.751786 master-0 kubenswrapper[29612]: I0319 12:29:40.751703 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-config\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.751786 master-0 kubenswrapper[29612]: I0319 12:29:40.751738 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz2vr\" (UniqueName: \"kubernetes.io/projected/b268a64f-8029-4f0d-92ec-de0782d81970-kube-api-access-zz2vr\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.751786 master-0 kubenswrapper[29612]: I0319 12:29:40.751770 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic-inspector-dhcp-hostsdir\") pod 
\"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.758784 master-0 kubenswrapper[29612]: I0319 12:29:40.754096 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-config\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.758784 master-0 kubenswrapper[29612]: I0319 12:29:40.754569 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-sb\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.758784 master-0 kubenswrapper[29612]: I0319 12:29:40.755264 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-nb\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.758784 master-0 kubenswrapper[29612]: I0319 12:29:40.756649 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-svc\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.765127 master-0 kubenswrapper[29612]: I0319 12:29:40.765084 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-swift-storage-0\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: 
\"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.773386 master-0 kubenswrapper[29612]: I0319 12:29:40.768524 29612 scope.go:117] "RemoveContainer" containerID="354b6849b85b9548735949c583ed8f9ab43a9f9009c0afe1b9d88606bace8530" Mar 19 12:29:40.787881 master-0 kubenswrapper[29612]: I0319 12:29:40.787821 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvjmf\" (UniqueName: \"kubernetes.io/projected/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-kube-api-access-cvjmf\") pod \"dnsmasq-dns-8d764b8bf-7q659\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:40.858966 master-0 kubenswrapper[29612]: I0319 12:29:40.853897 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-scripts\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.858966 master-0 kubenswrapper[29612]: I0319 12:29:40.853978 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.858966 master-0 kubenswrapper[29612]: I0319 12:29:40.854075 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.858966 master-0 kubenswrapper[29612]: I0319 12:29:40.855761 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" 
(UniqueName: \"kubernetes.io/downward-api/b268a64f-8029-4f0d-92ec-de0782d81970-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.858966 master-0 kubenswrapper[29612]: I0319 12:29:40.855830 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-config\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.858966 master-0 kubenswrapper[29612]: I0319 12:29:40.855864 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz2vr\" (UniqueName: \"kubernetes.io/projected/b268a64f-8029-4f0d-92ec-de0782d81970-kube-api-access-zz2vr\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.858966 master-0 kubenswrapper[29612]: I0319 12:29:40.855894 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.858966 master-0 kubenswrapper[29612]: I0319 12:29:40.856409 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.859999 master-0 kubenswrapper[29612]: I0319 12:29:40.859932 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" 
(UniqueName: \"kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.864551 master-0 kubenswrapper[29612]: I0319 12:29:40.864515 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b268a64f-8029-4f0d-92ec-de0782d81970-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.876364 master-0 kubenswrapper[29612]: I0319 12:29:40.873441 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-config\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.879813 master-0 kubenswrapper[29612]: I0319 12:29:40.879736 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.890420 master-0 kubenswrapper[29612]: I0319 12:29:40.890370 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-scripts\") pod \"ironic-inspector-0\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.917872 master-0 kubenswrapper[29612]: I0319 12:29:40.917806 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz2vr\" (UniqueName: \"kubernetes.io/projected/b268a64f-8029-4f0d-92ec-de0782d81970-kube-api-access-zz2vr\") pod \"ironic-inspector-0\" (UID: 
\"b268a64f-8029-4f0d-92ec-de0782d81970\") " pod="openstack/ironic-inspector-0" Mar 19 12:29:40.930278 master-0 kubenswrapper[29612]: I0319 12:29:40.930223 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a1e1d5-9787-4dfc-96fe-367b409dcf64" path="/var/lib/kubelet/pods/f8a1e1d5-9787-4dfc-96fe-367b409dcf64/volumes" Mar 19 12:29:40.931466 master-0 kubenswrapper[29612]: I0319 12:29:40.931441 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcee29fd-64dc-454f-830c-d3bd3280a07a" path="/var/lib/kubelet/pods/fcee29fd-64dc-454f-830c-d3bd3280a07a/volumes" Mar 19 12:29:40.941831 master-0 kubenswrapper[29612]: I0319 12:29:40.941782 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:29:41.075263 master-0 kubenswrapper[29612]: I0319 12:29:41.075214 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 19 12:29:41.172251 master-0 kubenswrapper[29612]: I0319 12:29:41.172204 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" Mar 19 12:29:41.269059 master-0 kubenswrapper[29612]: I0319 12:29:41.265964 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs928\" (UniqueName: \"kubernetes.io/projected/2dd055fb-058b-4a5b-ba29-b376e81cada7-kube-api-access-qs928\") pod \"2dd055fb-058b-4a5b-ba29-b376e81cada7\" (UID: \"2dd055fb-058b-4a5b-ba29-b376e81cada7\") " Mar 19 12:29:41.269059 master-0 kubenswrapper[29612]: I0319 12:29:41.266461 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dd055fb-058b-4a5b-ba29-b376e81cada7-operator-scripts\") pod \"2dd055fb-058b-4a5b-ba29-b376e81cada7\" (UID: \"2dd055fb-058b-4a5b-ba29-b376e81cada7\") " Mar 19 12:29:41.270522 master-0 kubenswrapper[29612]: I0319 12:29:41.269614 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dd055fb-058b-4a5b-ba29-b376e81cada7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2dd055fb-058b-4a5b-ba29-b376e81cada7" (UID: "2dd055fb-058b-4a5b-ba29-b376e81cada7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:41.271267 master-0 kubenswrapper[29612]: I0319 12:29:41.271229 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2dd055fb-058b-4a5b-ba29-b376e81cada7-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:41.316896 master-0 kubenswrapper[29612]: I0319 12:29:41.316156 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dd055fb-058b-4a5b-ba29-b376e81cada7-kube-api-access-qs928" (OuterVolumeSpecName: "kube-api-access-qs928") pod "2dd055fb-058b-4a5b-ba29-b376e81cada7" (UID: "2dd055fb-058b-4a5b-ba29-b376e81cada7"). 
InnerVolumeSpecName "kube-api-access-qs928". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:41.373333 master-0 kubenswrapper[29612]: I0319 12:29:41.373206 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs928\" (UniqueName: \"kubernetes.io/projected/2dd055fb-058b-4a5b-ba29-b376e81cada7-kube-api-access-qs928\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:41.786962 master-0 kubenswrapper[29612]: I0319 12:29:41.786907 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8d17-account-create-update-twzr4" Mar 19 12:29:41.823841 master-0 kubenswrapper[29612]: I0319 12:29:41.823775 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-facf-account-create-update-f5254" Mar 19 12:29:41.827751 master-0 kubenswrapper[29612]: I0319 12:29:41.826819 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-facf-account-create-update-f5254" event={"ID":"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8","Type":"ContainerDied","Data":"fb4a5318a4b5160477218fb46f72896f4308428d0ac9537c1081c8a13301cf8c"} Mar 19 12:29:41.827751 master-0 kubenswrapper[29612]: I0319 12:29:41.826917 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb4a5318a4b5160477218fb46f72896f4308428d0ac9537c1081c8a13301cf8c" Mar 19 12:29:41.833553 master-0 kubenswrapper[29612]: I0319 12:29:41.833495 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8d17-account-create-update-twzr4" event={"ID":"275eb2ad-4f11-47c4-93dd-b07ba6ef648e","Type":"ContainerDied","Data":"29baba9fdcd42511d57f346b87e8c468c892273d395dead0ddd28ca7e8bfe580"} Mar 19 12:29:41.833553 master-0 kubenswrapper[29612]: I0319 12:29:41.833549 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29baba9fdcd42511d57f346b87e8c468c892273d395dead0ddd28ca7e8bfe580" Mar 19 
12:29:41.833828 master-0 kubenswrapper[29612]: I0319 12:29:41.833652 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8d17-account-create-update-twzr4" Mar 19 12:29:41.840865 master-0 kubenswrapper[29612]: I0319 12:29:41.840813 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" event={"ID":"2dd055fb-058b-4a5b-ba29-b376e81cada7","Type":"ContainerDied","Data":"56fc6adaa9c0d55105f7c0ad8844e3ef81600dc5e6601ae73a61f4fc49034287"} Mar 19 12:29:41.841865 master-0 kubenswrapper[29612]: I0319 12:29:41.841841 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56fc6adaa9c0d55105f7c0ad8844e3ef81600dc5e6601ae73a61f4fc49034287" Mar 19 12:29:41.842188 master-0 kubenswrapper[29612]: I0319 12:29:41.842170 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7e01-account-create-update-9bxmr" Mar 19 12:29:41.909930 master-0 kubenswrapper[29612]: I0319 12:29:41.909267 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8d764b8bf-7q659"] Mar 19 12:29:41.917297 master-0 kubenswrapper[29612]: I0319 12:29:41.917154 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnx66\" (UniqueName: \"kubernetes.io/projected/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-kube-api-access-cnx66\") pod \"275eb2ad-4f11-47c4-93dd-b07ba6ef648e\" (UID: \"275eb2ad-4f11-47c4-93dd-b07ba6ef648e\") " Mar 19 12:29:41.917297 master-0 kubenswrapper[29612]: I0319 12:29:41.917290 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-operator-scripts\") pod \"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8\" (UID: \"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8\") " Mar 19 12:29:41.917480 master-0 kubenswrapper[29612]: I0319 
12:29:41.917467 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-operator-scripts\") pod \"275eb2ad-4f11-47c4-93dd-b07ba6ef648e\" (UID: \"275eb2ad-4f11-47c4-93dd-b07ba6ef648e\") " Mar 19 12:29:41.917522 master-0 kubenswrapper[29612]: I0319 12:29:41.917503 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ksxr\" (UniqueName: \"kubernetes.io/projected/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-kube-api-access-4ksxr\") pod \"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8\" (UID: \"f1ca4272-7dc1-43fc-b90e-eddd21dadbd8\") " Mar 19 12:29:41.918750 master-0 kubenswrapper[29612]: I0319 12:29:41.918385 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1ca4272-7dc1-43fc-b90e-eddd21dadbd8" (UID: "f1ca4272-7dc1-43fc-b90e-eddd21dadbd8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:41.919487 master-0 kubenswrapper[29612]: I0319 12:29:41.919447 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "275eb2ad-4f11-47c4-93dd-b07ba6ef648e" (UID: "275eb2ad-4f11-47c4-93dd-b07ba6ef648e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:29:41.921935 master-0 kubenswrapper[29612]: I0319 12:29:41.921908 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-kube-api-access-cnx66" (OuterVolumeSpecName: "kube-api-access-cnx66") pod "275eb2ad-4f11-47c4-93dd-b07ba6ef648e" (UID: "275eb2ad-4f11-47c4-93dd-b07ba6ef648e"). 
InnerVolumeSpecName "kube-api-access-cnx66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:41.935747 master-0 kubenswrapper[29612]: I0319 12:29:41.935429 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-kube-api-access-4ksxr" (OuterVolumeSpecName: "kube-api-access-4ksxr") pod "f1ca4272-7dc1-43fc-b90e-eddd21dadbd8" (UID: "f1ca4272-7dc1-43fc-b90e-eddd21dadbd8"). InnerVolumeSpecName "kube-api-access-4ksxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:42.021005 master-0 kubenswrapper[29612]: I0319 12:29:42.020967 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnx66\" (UniqueName: \"kubernetes.io/projected/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-kube-api-access-cnx66\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:42.021229 master-0 kubenswrapper[29612]: I0319 12:29:42.021191 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:42.021390 master-0 kubenswrapper[29612]: I0319 12:29:42.021378 29612 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/275eb2ad-4f11-47c4-93dd-b07ba6ef648e-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:42.021474 master-0 kubenswrapper[29612]: I0319 12:29:42.021461 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ksxr\" (UniqueName: \"kubernetes.io/projected/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8-kube-api-access-4ksxr\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:42.098828 master-0 kubenswrapper[29612]: I0319 12:29:42.098391 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 19 12:29:42.149056 master-0 kubenswrapper[29612]: W0319 
12:29:42.148696 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb268a64f_8029_4f0d_92ec_de0782d81970.slice/crio-28e4269ab97e0e5b3c4166b82944d2420dce965d823079510099e5cdc2f116b4 WatchSource:0}: Error finding container 28e4269ab97e0e5b3c4166b82944d2420dce965d823079510099e5cdc2f116b4: Status 404 returned error can't find the container with id 28e4269ab97e0e5b3c4166b82944d2420dce965d823079510099e5cdc2f116b4 Mar 19 12:29:42.878973 master-0 kubenswrapper[29612]: I0319 12:29:42.878862 29612 generic.go:334] "Generic (PLEG): container finished" podID="b268a64f-8029-4f0d-92ec-de0782d81970" containerID="4dee89d3cc9fe17dc1b02f0c99dc5d10ce433e8dcb3b806c3e1b0a35f2657ae0" exitCode=0 Mar 19 12:29:42.881477 master-0 kubenswrapper[29612]: I0319 12:29:42.881132 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"b268a64f-8029-4f0d-92ec-de0782d81970","Type":"ContainerDied","Data":"4dee89d3cc9fe17dc1b02f0c99dc5d10ce433e8dcb3b806c3e1b0a35f2657ae0"} Mar 19 12:29:42.881477 master-0 kubenswrapper[29612]: I0319 12:29:42.881210 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"b268a64f-8029-4f0d-92ec-de0782d81970","Type":"ContainerStarted","Data":"28e4269ab97e0e5b3c4166b82944d2420dce965d823079510099e5cdc2f116b4"} Mar 19 12:29:42.890072 master-0 kubenswrapper[29612]: I0319 12:29:42.890030 29612 generic.go:334] "Generic (PLEG): container finished" podID="9ed9991a-9940-48ce-bae1-1e8c416ee8f9" containerID="98b53e175a3e8f0139f61cbc1baebb75af664b329568d876bcce9bb4b8886e6b" exitCode=0 Mar 19 12:29:42.890301 master-0 kubenswrapper[29612]: I0319 12:29:42.890288 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-facf-account-create-update-f5254" Mar 19 12:29:42.899928 master-0 kubenswrapper[29612]: I0319 12:29:42.899867 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" event={"ID":"9ed9991a-9940-48ce-bae1-1e8c416ee8f9","Type":"ContainerDied","Data":"98b53e175a3e8f0139f61cbc1baebb75af664b329568d876bcce9bb4b8886e6b"} Mar 19 12:29:42.899928 master-0 kubenswrapper[29612]: I0319 12:29:42.899909 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" event={"ID":"9ed9991a-9940-48ce-bae1-1e8c416ee8f9","Type":"ContainerStarted","Data":"8cfdab9fa6993646dd7110b9e5552f1f25e21e32411cfe2f139a2bf4329a1132"} Mar 19 12:29:43.478977 master-0 kubenswrapper[29612]: I0319 12:29:43.475818 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319" (OuterVolumeSpecName: "glance") pod "535fe30f-e49c-4659-a342-978cf8d49d25" (UID: "535fe30f-e49c-4659-a342-978cf8d49d25"). InnerVolumeSpecName "pvc-7c8db109-1510-46ed-a028-d218e99693f0". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 12:29:43.488756 master-0 kubenswrapper[29612]: I0319 12:29:43.487416 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a3d6ea3a-8949-4f6c-b3f8-f2c1aae3ddaf\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2ac9cc7-fe9d-4eea-948f-6695e33aed81\") pod \"glance-3abc0-default-external-api-0\" (UID: \"2a449777-70b3-4d96-a11a-65e87c128c98\") " pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:29:43.492692 master-0 kubenswrapper[29612]: I0319 12:29:43.491207 29612 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") on node \"master-0\" " Mar 19 12:29:43.523354 master-0 kubenswrapper[29612]: I0319 12:29:43.523312 29612 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 12:29:43.523628 master-0 kubenswrapper[29612]: I0319 12:29:43.523492 29612 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7c8db109-1510-46ed-a028-d218e99693f0" (UniqueName: "kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319") on node "master-0" Mar 19 12:29:43.595091 master-0 kubenswrapper[29612]: I0319 12:29:43.595035 29612 reconciler_common.go:293] "Volume detached for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:43.617138 master-0 kubenswrapper[29612]: I0319 12:29:43.617048 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3abc0-default-external-api-0" Mar 19 12:29:43.809443 master-0 kubenswrapper[29612]: I0319 12:29:43.809355 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"] Mar 19 12:29:43.834261 master-0 kubenswrapper[29612]: I0319 12:29:43.833296 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"] Mar 19 12:29:43.888203 master-0 kubenswrapper[29612]: I0319 12:29:43.875865 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"] Mar 19 12:29:43.888203 master-0 kubenswrapper[29612]: E0319 12:29:43.876448 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dd055fb-058b-4a5b-ba29-b376e81cada7" containerName="mariadb-account-create-update" Mar 19 12:29:43.888203 master-0 kubenswrapper[29612]: I0319 12:29:43.876463 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dd055fb-058b-4a5b-ba29-b376e81cada7" containerName="mariadb-account-create-update" Mar 19 12:29:43.888203 master-0 kubenswrapper[29612]: E0319 12:29:43.876478 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275eb2ad-4f11-47c4-93dd-b07ba6ef648e" containerName="mariadb-account-create-update" Mar 19 12:29:43.888203 master-0 kubenswrapper[29612]: I0319 12:29:43.876486 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="275eb2ad-4f11-47c4-93dd-b07ba6ef648e" containerName="mariadb-account-create-update" Mar 19 12:29:43.888203 master-0 kubenswrapper[29612]: E0319 12:29:43.876510 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ca4272-7dc1-43fc-b90e-eddd21dadbd8" containerName="mariadb-account-create-update" Mar 19 12:29:43.888203 master-0 kubenswrapper[29612]: I0319 12:29:43.876520 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ca4272-7dc1-43fc-b90e-eddd21dadbd8" containerName="mariadb-account-create-update" Mar 19 
12:29:43.888203 master-0 kubenswrapper[29612]: I0319 12:29:43.876780 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="275eb2ad-4f11-47c4-93dd-b07ba6ef648e" containerName="mariadb-account-create-update" Mar 19 12:29:43.888203 master-0 kubenswrapper[29612]: I0319 12:29:43.876814 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ca4272-7dc1-43fc-b90e-eddd21dadbd8" containerName="mariadb-account-create-update" Mar 19 12:29:43.888203 master-0 kubenswrapper[29612]: I0319 12:29:43.876853 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dd055fb-058b-4a5b-ba29-b376e81cada7" containerName="mariadb-account-create-update" Mar 19 12:29:43.888203 master-0 kubenswrapper[29612]: I0319 12:29:43.878264 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:29:43.923022 master-0 kubenswrapper[29612]: I0319 12:29:43.904417 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 12:29:43.923022 master-0 kubenswrapper[29612]: I0319 12:29:43.904674 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-3abc0-default-internal-config-data" Mar 19 12:29:43.975554 master-0 kubenswrapper[29612]: I0319 12:29:43.975487 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"] Mar 19 12:29:44.002939 master-0 kubenswrapper[29612]: I0319 12:29:44.002797 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" event={"ID":"9ed9991a-9940-48ce-bae1-1e8c416ee8f9","Type":"ContainerStarted","Data":"9d5de17295c12332ab0aa0daf6e9af45e52296f6d8f6634d6a5e84c6f055b84b"} Mar 19 12:29:44.003322 master-0 kubenswrapper[29612]: I0319 12:29:44.003294 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 
12:29:44.032835 master-0 kubenswrapper[29612]: I0319 12:29:44.032771 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-internal-tls-certs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:29:44.033052 master-0 kubenswrapper[29612]: I0319 12:29:44.032842 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-scripts\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:29:44.033052 master-0 kubenswrapper[29612]: I0319 12:29:44.032870 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/877290a6-cb83-4a91-837b-aa1c8f3da959-httpd-run\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:29:44.033052 master-0 kubenswrapper[29612]: I0319 12:29:44.032920 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56kjk\" (UniqueName: \"kubernetes.io/projected/877290a6-cb83-4a91-837b-aa1c8f3da959-kube-api-access-56kjk\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0" Mar 19 12:29:44.033052 master-0 kubenswrapper[29612]: I0319 12:29:44.032969 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-combined-ca-bundle\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.033052 master-0 kubenswrapper[29612]: I0319 12:29:44.033007 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.033198 master-0 kubenswrapper[29612]: I0319 12:29:44.033070 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877290a6-cb83-4a91-837b-aa1c8f3da959-logs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.033198 master-0 kubenswrapper[29612]: I0319 12:29:44.033117 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-config-data\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.090022 master-0 kubenswrapper[29612]: I0319 12:29:44.089934 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" podStartSLOduration=4.089910661 podStartE2EDuration="4.089910661s" podCreationTimestamp="2026-03-19 12:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:44.060259198 +0000 UTC m=+1191.374428219" watchObservedRunningTime="2026-03-19 12:29:44.089910661 +0000 UTC m=+1191.404079682"
Mar 19 12:29:44.142401 master-0 kubenswrapper[29612]: I0319 12:29:44.142339 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-config-data\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.142644 master-0 kubenswrapper[29612]: I0319 12:29:44.142605 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-internal-tls-certs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.142682 master-0 kubenswrapper[29612]: I0319 12:29:44.142658 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-scripts\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.142914 master-0 kubenswrapper[29612]: I0319 12:29:44.142702 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/877290a6-cb83-4a91-837b-aa1c8f3da959-httpd-run\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.144788 master-0 kubenswrapper[29612]: I0319 12:29:44.143973 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56kjk\" (UniqueName: \"kubernetes.io/projected/877290a6-cb83-4a91-837b-aa1c8f3da959-kube-api-access-56kjk\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.144788 master-0 kubenswrapper[29612]: I0319 12:29:44.144082 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-combined-ca-bundle\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.144788 master-0 kubenswrapper[29612]: I0319 12:29:44.144158 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.144788 master-0 kubenswrapper[29612]: I0319 12:29:44.144279 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877290a6-cb83-4a91-837b-aa1c8f3da959-logs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.146653 master-0 kubenswrapper[29612]: I0319 12:29:44.146600 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/877290a6-cb83-4a91-837b-aa1c8f3da959-httpd-run\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.148654 master-0 kubenswrapper[29612]: I0319 12:29:44.148614 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-internal-tls-certs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.149686 master-0 kubenswrapper[29612]: I0319 12:29:44.149655 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 19 12:29:44.152429 master-0 kubenswrapper[29612]: I0319 12:29:44.152406 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877290a6-cb83-4a91-837b-aa1c8f3da959-logs\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.157869 master-0 kubenswrapper[29612]: I0319 12:29:44.157750 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-scripts\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.158505 master-0 kubenswrapper[29612]: I0319 12:29:44.158482 29612 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 12:29:44.158601 master-0 kubenswrapper[29612]: I0319 12:29:44.158583 29612 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/d60c87c719cf4283aa56003c1e019d419c362ddaab3f9dd90c8b6d2b36b7086b/globalmount\"" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.169446 master-0 kubenswrapper[29612]: I0319 12:29:44.169411 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-combined-ca-bundle\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.173744 master-0 kubenswrapper[29612]: I0319 12:29:44.170341 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877290a6-cb83-4a91-837b-aa1c8f3da959-config-data\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.210579 master-0 kubenswrapper[29612]: I0319 12:29:44.210533 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56kjk\" (UniqueName: \"kubernetes.io/projected/877290a6-cb83-4a91-837b-aa1c8f3da959-kube-api-access-56kjk\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:44.352456 master-0 kubenswrapper[29612]: I0319 12:29:44.352326 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2td5k"]
Mar 19 12:29:44.357103 master-0 kubenswrapper[29612]: I0319 12:29:44.353958 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.378666 master-0 kubenswrapper[29612]: I0319 12:29:44.375792 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2td5k"]
Mar 19 12:29:44.382443 master-0 kubenswrapper[29612]: I0319 12:29:44.379062 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 19 12:29:44.382443 master-0 kubenswrapper[29612]: I0319 12:29:44.379282 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 19 12:29:44.459656 master-0 kubenswrapper[29612]: I0319 12:29:44.453840 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-config-data\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.459656 master-0 kubenswrapper[29612]: I0319 12:29:44.453906 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.459656 master-0 kubenswrapper[29612]: I0319 12:29:44.454025 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-scripts\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.459656 master-0 kubenswrapper[29612]: I0319 12:29:44.454107 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zmjr\" (UniqueName: \"kubernetes.io/projected/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-kube-api-access-9zmjr\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.555923 master-0 kubenswrapper[29612]: I0319 12:29:44.555866 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-config-data\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.555923 master-0 kubenswrapper[29612]: I0319 12:29:44.555928 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.556147 master-0 kubenswrapper[29612]: I0319 12:29:44.556050 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-scripts\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.556147 master-0 kubenswrapper[29612]: I0319 12:29:44.556129 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zmjr\" (UniqueName: \"kubernetes.io/projected/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-kube-api-access-9zmjr\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.560896 master-0 kubenswrapper[29612]: I0319 12:29:44.559970 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-config-data\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.563478 master-0 kubenswrapper[29612]: I0319 12:29:44.562741 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-scripts\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.564010 master-0 kubenswrapper[29612]: I0319 12:29:44.563969 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.573411 master-0 kubenswrapper[29612]: I0319 12:29:44.573349 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3abc0-default-external-api-0"]
Mar 19 12:29:44.578381 master-0 kubenswrapper[29612]: I0319 12:29:44.577138 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zmjr\" (UniqueName: \"kubernetes.io/projected/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-kube-api-access-9zmjr\") pod \"nova-cell0-conductor-db-sync-2td5k\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") " pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.744742 master-0 kubenswrapper[29612]: I0319 12:29:44.743936 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:29:44.955418 master-0 kubenswrapper[29612]: I0319 12:29:44.955333 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="535fe30f-e49c-4659-a342-978cf8d49d25" path="/var/lib/kubelet/pods/535fe30f-e49c-4659-a342-978cf8d49d25/volumes"
Mar 19 12:29:45.017691 master-0 kubenswrapper[29612]: I0319 12:29:45.017108 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"2a449777-70b3-4d96-a11a-65e87c128c98","Type":"ContainerStarted","Data":"c862031ddb5f7b6c900ca7f4c9356aa987bdb0ed5b207edc0b30dda1bac8b8a3"}
Mar 19 12:29:45.194732 master-0 kubenswrapper[29612]: I0319 12:29:45.194129 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7c8db109-1510-46ed-a028-d218e99693f0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4e47e527-afad-43c7-82b0-5b6998031319\") pod \"glance-3abc0-default-internal-api-0\" (UID: \"877290a6-cb83-4a91-837b-aa1c8f3da959\") " pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:45.262067 master-0 kubenswrapper[29612]: I0319 12:29:45.261877 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2td5k"]
Mar 19 12:29:45.476269 master-0 kubenswrapper[29612]: I0319 12:29:45.476216 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:46.041406 master-0 kubenswrapper[29612]: I0319 12:29:46.036537 29612 generic.go:334] "Generic (PLEG): container finished" podID="8ed03592-eec9-4173-8239-217452bc4579" containerID="53c6f7445a82045913711cabdb41fe05eac5b40d67f51a362c2178d0c3090eb5" exitCode=0
Mar 19 12:29:46.041406 master-0 kubenswrapper[29612]: I0319 12:29:46.036613 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"8ed03592-eec9-4173-8239-217452bc4579","Type":"ContainerDied","Data":"53c6f7445a82045913711cabdb41fe05eac5b40d67f51a362c2178d0c3090eb5"}
Mar 19 12:29:46.045569 master-0 kubenswrapper[29612]: I0319 12:29:46.045226 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2td5k" event={"ID":"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0","Type":"ContainerStarted","Data":"9912b2219de29456ea81cbf3ed721019eda1e07a4fa85fd6c70aae696de0b11d"}
Mar 19 12:29:46.049169 master-0 kubenswrapper[29612]: I0319 12:29:46.049115 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"2a449777-70b3-4d96-a11a-65e87c128c98","Type":"ContainerStarted","Data":"d25d831840f0b653ebbda88364552e041efdd7e5f86f98d8b31eacb4d8811e4e"}
Mar 19 12:29:46.049169 master-0 kubenswrapper[29612]: I0319 12:29:46.049159 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-external-api-0" event={"ID":"2a449777-70b3-4d96-a11a-65e87c128c98","Type":"ContainerStarted","Data":"fadc8899a1bef7f4c5215d44aeed0304082f32507e8309fe58987415ea20f67e"}
Mar 19 12:29:46.071027 master-0 kubenswrapper[29612]: I0319 12:29:46.070978 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3abc0-default-internal-api-0"]
Mar 19 12:29:46.135304 master-0 kubenswrapper[29612]: I0319 12:29:46.135213 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3abc0-default-external-api-0" podStartSLOduration=8.135192612 podStartE2EDuration="8.135192612s" podCreationTimestamp="2026-03-19 12:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:46.119685531 +0000 UTC m=+1193.433854572" watchObservedRunningTime="2026-03-19 12:29:46.135192612 +0000 UTC m=+1193.449361643"
Mar 19 12:29:47.068182 master-0 kubenswrapper[29612]: I0319 12:29:47.068110 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-internal-api-0" event={"ID":"877290a6-cb83-4a91-837b-aa1c8f3da959","Type":"ContainerStarted","Data":"19b68e96a803a952454b0dff0d3672aba402709919a0fe3b7537da52a18283d6"}
Mar 19 12:29:47.068182 master-0 kubenswrapper[29612]: I0319 12:29:47.068172 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-internal-api-0" event={"ID":"877290a6-cb83-4a91-837b-aa1c8f3da959","Type":"ContainerStarted","Data":"126dce7ddfc4e4d5bbf7c315eec3619dfe757e4e5d36c7782cd03f893811a92c"}
Mar 19 12:29:48.099102 master-0 kubenswrapper[29612]: I0319 12:29:48.099012 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3abc0-default-internal-api-0" event={"ID":"877290a6-cb83-4a91-837b-aa1c8f3da959","Type":"ContainerStarted","Data":"fad3f71a93a4941dbf9ebd639e39f17d1f0bf88a41408efa558bbc609c5b82ba"}
Mar 19 12:29:48.141672 master-0 kubenswrapper[29612]: I0319 12:29:48.138178 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3abc0-default-internal-api-0" podStartSLOduration=5.138160179 podStartE2EDuration="5.138160179s" podCreationTimestamp="2026-03-19 12:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:29:48.131825689 +0000 UTC m=+1195.445994700" watchObservedRunningTime="2026-03-19 12:29:48.138160179 +0000 UTC m=+1195.452329200"
Mar 19 12:29:50.943465 master-0 kubenswrapper[29612]: I0319 12:29:50.943410 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8d764b8bf-7q659"
Mar 19 12:29:51.823560 master-0 kubenswrapper[29612]: I0319 12:29:51.823195 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b89bc666f-bgkf6"]
Mar 19 12:29:51.824104 master-0 kubenswrapper[29612]: I0319 12:29:51.823510 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6" podUID="738eae96-f628-426c-ab32-da270c608a71" containerName="dnsmasq-dns" containerID="cri-o://d48ec6f1c78748f3ef00cd927a36d4f723e0f677a5c92ce684dccad1dc7b5539" gracePeriod=10
Mar 19 12:29:53.618030 master-0 kubenswrapper[29612]: I0319 12:29:53.617926 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:53.618030 master-0 kubenswrapper[29612]: I0319 12:29:53.618022 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:53.653813 master-0 kubenswrapper[29612]: I0319 12:29:53.653344 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:53.665781 master-0 kubenswrapper[29612]: I0319 12:29:53.665652 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:54.183486 master-0 kubenswrapper[29612]: I0319 12:29:54.182120 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:54.183486 master-0 kubenswrapper[29612]: I0319 12:29:54.182187 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:54.861480 master-0 kubenswrapper[29612]: I0319 12:29:54.861401 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6" podUID="738eae96-f628-426c-ab32-da270c608a71" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.236:5353: connect: connection refused"
Mar 19 12:29:55.479006 master-0 kubenswrapper[29612]: I0319 12:29:55.477987 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:55.479346 master-0 kubenswrapper[29612]: I0319 12:29:55.479038 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:55.515272 master-0 kubenswrapper[29612]: I0319 12:29:55.515222 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:55.571557 master-0 kubenswrapper[29612]: I0319 12:29:55.571435 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:56.218574 master-0 kubenswrapper[29612]: I0319 12:29:56.218502 29612 generic.go:334] "Generic (PLEG): container finished" podID="738eae96-f628-426c-ab32-da270c608a71" containerID="d48ec6f1c78748f3ef00cd927a36d4f723e0f677a5c92ce684dccad1dc7b5539" exitCode=0
Mar 19 12:29:56.219073 master-0 kubenswrapper[29612]: I0319 12:29:56.218564 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6" event={"ID":"738eae96-f628-426c-ab32-da270c608a71","Type":"ContainerDied","Data":"d48ec6f1c78748f3ef00cd927a36d4f723e0f677a5c92ce684dccad1dc7b5539"}
Mar 19 12:29:56.219458 master-0 kubenswrapper[29612]: I0319 12:29:56.219414 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:56.219539 master-0 kubenswrapper[29612]: I0319 12:29:56.219462 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:56.726416 master-0 kubenswrapper[29612]: I0319 12:29:56.726345 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:56.726695 master-0 kubenswrapper[29612]: I0319 12:29:56.726532 29612 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 12:29:56.752670 master-0 kubenswrapper[29612]: I0319 12:29:56.752598 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport"
Mar 19 12:29:56.855399 master-0 kubenswrapper[29612]: I0319 12:29:56.854972 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3abc0-default-external-api-0"
Mar 19 12:29:57.240022 master-0 kubenswrapper[29612]: I0319 12:29:57.236780 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:29:57.266558 master-0 kubenswrapper[29612]: I0319 12:29:57.266430 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6"
Mar 19 12:29:57.267801 master-0 kubenswrapper[29612]: I0319 12:29:57.266972 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b89bc666f-bgkf6" event={"ID":"738eae96-f628-426c-ab32-da270c608a71","Type":"ContainerDied","Data":"0fb4ab032112f87207d930d0261c9b661727cb7db8b3d5e4bae62e728ebe07aa"}
Mar 19 12:29:57.267801 master-0 kubenswrapper[29612]: I0319 12:29:57.267336 29612 scope.go:117] "RemoveContainer" containerID="d48ec6f1c78748f3ef00cd927a36d4f723e0f677a5c92ce684dccad1dc7b5539"
Mar 19 12:29:57.339394 master-0 kubenswrapper[29612]: I0319 12:29:57.339305 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9rwr\" (UniqueName: \"kubernetes.io/projected/738eae96-f628-426c-ab32-da270c608a71-kube-api-access-r9rwr\") pod \"738eae96-f628-426c-ab32-da270c608a71\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") "
Mar 19 12:29:57.339607 master-0 kubenswrapper[29612]: I0319 12:29:57.339435 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-sb\") pod \"738eae96-f628-426c-ab32-da270c608a71\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") "
Mar 19 12:29:57.339607 master-0 kubenswrapper[29612]: I0319 12:29:57.339491 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-config\") pod \"738eae96-f628-426c-ab32-da270c608a71\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") "
Mar 19 12:29:57.339607 master-0 kubenswrapper[29612]: I0319 12:29:57.339592 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-svc\") pod \"738eae96-f628-426c-ab32-da270c608a71\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") "
Mar 19 12:29:57.339739 master-0 kubenswrapper[29612]: I0319 12:29:57.339618 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-swift-storage-0\") pod \"738eae96-f628-426c-ab32-da270c608a71\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") "
Mar 19 12:29:57.339779 master-0 kubenswrapper[29612]: I0319 12:29:57.339758 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-nb\") pod \"738eae96-f628-426c-ab32-da270c608a71\" (UID: \"738eae96-f628-426c-ab32-da270c608a71\") "
Mar 19 12:29:57.344459 master-0 kubenswrapper[29612]: I0319 12:29:57.344393 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/738eae96-f628-426c-ab32-da270c608a71-kube-api-access-r9rwr" (OuterVolumeSpecName: "kube-api-access-r9rwr") pod "738eae96-f628-426c-ab32-da270c608a71" (UID: "738eae96-f628-426c-ab32-da270c608a71"). InnerVolumeSpecName "kube-api-access-r9rwr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:29:57.452660 master-0 kubenswrapper[29612]: I0319 12:29:57.447386 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9rwr\" (UniqueName: \"kubernetes.io/projected/738eae96-f628-426c-ab32-da270c608a71-kube-api-access-r9rwr\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:57.509930 master-0 kubenswrapper[29612]: I0319 12:29:57.509848 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-config" (OuterVolumeSpecName: "config") pod "738eae96-f628-426c-ab32-da270c608a71" (UID: "738eae96-f628-426c-ab32-da270c608a71"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:29:57.519060 master-0 kubenswrapper[29612]: I0319 12:29:57.518989 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "738eae96-f628-426c-ab32-da270c608a71" (UID: "738eae96-f628-426c-ab32-da270c608a71"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:29:57.523866 master-0 kubenswrapper[29612]: I0319 12:29:57.523807 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "738eae96-f628-426c-ab32-da270c608a71" (UID: "738eae96-f628-426c-ab32-da270c608a71"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:29:57.526700 master-0 kubenswrapper[29612]: I0319 12:29:57.526656 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "738eae96-f628-426c-ab32-da270c608a71" (UID: "738eae96-f628-426c-ab32-da270c608a71"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:29:57.532038 master-0 kubenswrapper[29612]: I0319 12:29:57.531979 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "738eae96-f628-426c-ab32-da270c608a71" (UID: "738eae96-f628-426c-ab32-da270c608a71"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:29:57.549605 master-0 kubenswrapper[29612]: I0319 12:29:57.549545 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:57.549605 master-0 kubenswrapper[29612]: I0319 12:29:57.549595 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:57.549605 master-0 kubenswrapper[29612]: I0319 12:29:57.549609 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:57.549605 master-0 kubenswrapper[29612]: I0319 12:29:57.549623 29612 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:57.549605 master-0 kubenswrapper[29612]: I0319 12:29:57.549646 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/738eae96-f628-426c-ab32-da270c608a71-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:57.644212 master-0 kubenswrapper[29612]: I0319 12:29:57.644022 29612 scope.go:117] "RemoveContainer" containerID="0f561c467183903d1b594215f035dfdaceacf55d11f050e3c21a2a1619a75155"
Mar 19 12:29:57.794890 master-0 kubenswrapper[29612]: I0319 12:29:57.794741 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b89bc666f-bgkf6"]
Mar 19 12:29:57.897752 master-0 kubenswrapper[29612]: I0319 12:29:57.897223 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b89bc666f-bgkf6"]
Mar 19 12:29:58.277921 master-0 kubenswrapper[29612]: I0319 12:29:58.277772 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"8ed03592-eec9-4173-8239-217452bc4579","Type":"ContainerStarted","Data":"b2fbcff5f2cfc6cb1a5b6a95976e318aff150d717c269f2c81d2a9490c7e933f"}
Mar 19 12:29:58.283288 master-0 kubenswrapper[29612]: I0319 12:29:58.283227 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2td5k" event={"ID":"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0","Type":"ContainerStarted","Data":"10e5d5ffd5c6df90a5bb673c28cc48402c45813674f49695945380ef5d9ac1cd"}
Mar 19 12:29:58.286858 master-0 kubenswrapper[29612]: I0319 12:29:58.286808 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"b268a64f-8029-4f0d-92ec-de0782d81970","Type":"ContainerStarted","Data":"ad83484d04c3c92b2599fdae744d767df742b0a1d25ea9f0937d1a1e6847c809"}
Mar 19 12:29:58.287045 master-0 kubenswrapper[29612]: I0319 12:29:58.287002 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-inspector-0" podUID="b268a64f-8029-4f0d-92ec-de0782d81970" containerName="inspector-pxe-init" containerID="cri-o://ad83484d04c3c92b2599fdae744d767df742b0a1d25ea9f0937d1a1e6847c809" gracePeriod=60
Mar 19 12:29:58.467046 master-0 kubenswrapper[29612]: I0319 12:29:58.466934 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-2td5k" podStartSLOduration=2.469298286 podStartE2EDuration="14.466920427s" podCreationTimestamp="2026-03-19 12:29:44 +0000 UTC" firstStartedPulling="2026-03-19 12:29:45.270887291 +0000 UTC m=+1192.585056312" lastFinishedPulling="2026-03-19 12:29:57.268509432 +0000 UTC m=+1204.582678453" observedRunningTime="2026-03-19 12:29:58.465523527 +0000 UTC m=+1205.779692548" watchObservedRunningTime="2026-03-19 12:29:58.466920427 +0000 UTC m=+1205.781089448"
Mar 19 12:29:58.640386 master-0 kubenswrapper[29612]: I0319 12:29:58.640265 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:58.640386 master-0 kubenswrapper[29612]: I0319 12:29:58.640371 29612 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 12:29:58.646276 master-0 kubenswrapper[29612]: I0319 12:29:58.646224 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-3abc0-default-internal-api-0"
Mar 19 12:29:58.938772 master-0 kubenswrapper[29612]: I0319 12:29:58.938603 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="738eae96-f628-426c-ab32-da270c608a71" path="/var/lib/kubelet/pods/738eae96-f628-426c-ab32-da270c608a71/volumes"
Mar 19 12:29:59.329843 master-0 kubenswrapper[29612]: I0319 12:29:59.329770 29612 generic.go:334] "Generic (PLEG): container finished" podID="b268a64f-8029-4f0d-92ec-de0782d81970" containerID="ad83484d04c3c92b2599fdae744d767df742b0a1d25ea9f0937d1a1e6847c809" exitCode=0
Mar 19 12:29:59.331676 master-0 kubenswrapper[29612]: I0319 12:29:59.329850 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"b268a64f-8029-4f0d-92ec-de0782d81970","Type":"ContainerDied","Data":"ad83484d04c3c92b2599fdae744d767df742b0a1d25ea9f0937d1a1e6847c809"}
Mar 19 12:29:59.419935 master-0 kubenswrapper[29612]: I0319 12:29:59.419889 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0"
Mar 19 12:29:59.533042 master-0 kubenswrapper[29612]: I0319 12:29:59.532955 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic\") pod \"b268a64f-8029-4f0d-92ec-de0782d81970\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") "
Mar 19 12:29:59.533290 master-0 kubenswrapper[29612]: I0319 12:29:59.533071 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b268a64f-8029-4f0d-92ec-de0782d81970-etc-podinfo\") pod \"b268a64f-8029-4f0d-92ec-de0782d81970\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") "
Mar 19 12:29:59.533815 master-0 kubenswrapper[29612]: I0319 12:29:59.533784 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-combined-ca-bundle\") pod \"b268a64f-8029-4f0d-92ec-de0782d81970\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") "
Mar 19 12:29:59.533889 master-0 kubenswrapper[29612]: I0319 12:29:59.533856 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz2vr\" (UniqueName: \"kubernetes.io/projected/b268a64f-8029-4f0d-92ec-de0782d81970-kube-api-access-zz2vr\") pod \"b268a64f-8029-4f0d-92ec-de0782d81970\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") "
Mar 19 12:29:59.543846 master-0 kubenswrapper[29612]: I0319 12:29:59.537993 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b268a64f-8029-4f0d-92ec-de0782d81970-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "b268a64f-8029-4f0d-92ec-de0782d81970" (UID: "b268a64f-8029-4f0d-92ec-de0782d81970"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 19 12:29:59.543846 master-0 kubenswrapper[29612]: I0319 12:29:59.539263 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "b268a64f-8029-4f0d-92ec-de0782d81970" (UID: "b268a64f-8029-4f0d-92ec-de0782d81970"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 12:29:59.543846 master-0 kubenswrapper[29612]: I0319 12:29:59.543725 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-scripts\") pod \"b268a64f-8029-4f0d-92ec-de0782d81970\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") "
Mar 19 12:29:59.544070 master-0 kubenswrapper[29612]: I0319 12:29:59.543909 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"b268a64f-8029-4f0d-92ec-de0782d81970\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") "
Mar 19 12:29:59.544070 master-0 kubenswrapper[29612]: I0319 12:29:59.544031 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-config\") pod \"b268a64f-8029-4f0d-92ec-de0782d81970\" (UID: \"b268a64f-8029-4f0d-92ec-de0782d81970\") "
Mar 19 12:29:59.547723 master-0 kubenswrapper[29612]: I0319 12:29:59.547610 29612 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic\") on node \"master-0\" DevicePath \"\""
Mar 19 12:29:59.547723 master-0 kubenswrapper[29612]: I0319 12:29:59.547661 29612
reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b268a64f-8029-4f0d-92ec-de0782d81970-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:59.552092 master-0 kubenswrapper[29612]: I0319 12:29:59.551949 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "b268a64f-8029-4f0d-92ec-de0782d81970" (UID: "b268a64f-8029-4f0d-92ec-de0782d81970"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:29:59.565187 master-0 kubenswrapper[29612]: I0319 12:29:59.565083 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-scripts" (OuterVolumeSpecName: "scripts") pod "b268a64f-8029-4f0d-92ec-de0782d81970" (UID: "b268a64f-8029-4f0d-92ec-de0782d81970"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:59.573476 master-0 kubenswrapper[29612]: I0319 12:29:59.573410 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b268a64f-8029-4f0d-92ec-de0782d81970-kube-api-access-zz2vr" (OuterVolumeSpecName: "kube-api-access-zz2vr") pod "b268a64f-8029-4f0d-92ec-de0782d81970" (UID: "b268a64f-8029-4f0d-92ec-de0782d81970"). InnerVolumeSpecName "kube-api-access-zz2vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:29:59.585730 master-0 kubenswrapper[29612]: I0319 12:29:59.582713 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-config" (OuterVolumeSpecName: "config") pod "b268a64f-8029-4f0d-92ec-de0782d81970" (UID: "b268a64f-8029-4f0d-92ec-de0782d81970"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:59.610381 master-0 kubenswrapper[29612]: I0319 12:29:59.610304 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b268a64f-8029-4f0d-92ec-de0782d81970" (UID: "b268a64f-8029-4f0d-92ec-de0782d81970"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:29:59.650243 master-0 kubenswrapper[29612]: I0319 12:29:59.650172 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz2vr\" (UniqueName: \"kubernetes.io/projected/b268a64f-8029-4f0d-92ec-de0782d81970-kube-api-access-zz2vr\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:59.650243 master-0 kubenswrapper[29612]: I0319 12:29:59.650229 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:59.650243 master-0 kubenswrapper[29612]: I0319 12:29:59.650245 29612 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/b268a64f-8029-4f0d-92ec-de0782d81970-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:59.650243 master-0 kubenswrapper[29612]: I0319 12:29:59.650257 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:29:59.650617 master-0 kubenswrapper[29612]: I0319 12:29:59.650267 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b268a64f-8029-4f0d-92ec-de0782d81970-combined-ca-bundle\") on node \"master-0\" 
DevicePath \"\"" Mar 19 12:30:00.352845 master-0 kubenswrapper[29612]: I0319 12:30:00.352776 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"b268a64f-8029-4f0d-92ec-de0782d81970","Type":"ContainerDied","Data":"28e4269ab97e0e5b3c4166b82944d2420dce965d823079510099e5cdc2f116b4"} Mar 19 12:30:00.352845 master-0 kubenswrapper[29612]: I0319 12:30:00.352842 29612 scope.go:117] "RemoveContainer" containerID="ad83484d04c3c92b2599fdae744d767df742b0a1d25ea9f0937d1a1e6847c809" Mar 19 12:30:00.354000 master-0 kubenswrapper[29612]: I0319 12:30:00.352918 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 19 12:30:00.385020 master-0 kubenswrapper[29612]: I0319 12:30:00.384971 29612 scope.go:117] "RemoveContainer" containerID="4dee89d3cc9fe17dc1b02f0c99dc5d10ce433e8dcb3b806c3e1b0a35f2657ae0" Mar 19 12:30:00.718554 master-0 kubenswrapper[29612]: I0319 12:30:00.718434 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 19 12:30:00.758135 master-0 kubenswrapper[29612]: I0319 12:30:00.758063 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: I0319 12:30:00.780868 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: E0319 12:30:00.781411 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738eae96-f628-426c-ab32-da270c608a71" containerName="dnsmasq-dns" Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: I0319 12:30:00.781428 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="738eae96-f628-426c-ab32-da270c608a71" containerName="dnsmasq-dns" Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: E0319 12:30:00.781466 29612 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b268a64f-8029-4f0d-92ec-de0782d81970" containerName="inspector-pxe-init" Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: I0319 12:30:00.781472 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="b268a64f-8029-4f0d-92ec-de0782d81970" containerName="inspector-pxe-init" Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: E0319 12:30:00.781503 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b268a64f-8029-4f0d-92ec-de0782d81970" containerName="ironic-python-agent-init" Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: I0319 12:30:00.781509 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="b268a64f-8029-4f0d-92ec-de0782d81970" containerName="ironic-python-agent-init" Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: E0319 12:30:00.781517 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="738eae96-f628-426c-ab32-da270c608a71" containerName="init" Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: I0319 12:30:00.781524 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="738eae96-f628-426c-ab32-da270c608a71" containerName="init" Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: I0319 12:30:00.781805 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="738eae96-f628-426c-ab32-da270c608a71" containerName="dnsmasq-dns" Mar 19 12:30:00.782805 master-0 kubenswrapper[29612]: I0319 12:30:00.781822 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="b268a64f-8029-4f0d-92ec-de0782d81970" containerName="inspector-pxe-init" Mar 19 12:30:00.794740 master-0 kubenswrapper[29612]: I0319 12:30:00.786048 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 19 12:30:00.794740 master-0 kubenswrapper[29612]: I0319 12:30:00.789295 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 19 12:30:00.794740 master-0 kubenswrapper[29612]: I0319 12:30:00.792255 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 19 12:30:00.794740 master-0 kubenswrapper[29612]: I0319 12:30:00.792463 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 19 12:30:00.794740 master-0 kubenswrapper[29612]: I0319 12:30:00.792592 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Mar 19 12:30:00.794740 master-0 kubenswrapper[29612]: I0319 12:30:00.792734 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Mar 19 12:30:00.829658 master-0 kubenswrapper[29612]: I0319 12:30:00.829208 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 19 12:30:00.878919 master-0 kubenswrapper[29612]: I0319 12:30:00.878845 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-config\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.878919 master-0 kubenswrapper[29612]: I0319 12:30:00.878923 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 
12:30:00.879234 master-0 kubenswrapper[29612]: I0319 12:30:00.878979 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/045556aa-c593-488e-b9e8-331f7acb3878-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.879234 master-0 kubenswrapper[29612]: I0319 12:30:00.879036 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8899\" (UniqueName: \"kubernetes.io/projected/045556aa-c593-488e-b9e8-331f7acb3878-kube-api-access-d8899\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.879234 master-0 kubenswrapper[29612]: I0319 12:30:00.879067 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.879234 master-0 kubenswrapper[29612]: I0319 12:30:00.879128 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.879234 master-0 kubenswrapper[29612]: I0319 12:30:00.879199 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-scripts\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " 
pod="openstack/ironic-inspector-0" Mar 19 12:30:00.879474 master-0 kubenswrapper[29612]: I0319 12:30:00.879238 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/045556aa-c593-488e-b9e8-331f7acb3878-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.879474 master-0 kubenswrapper[29612]: I0319 12:30:00.879281 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045556aa-c593-488e-b9e8-331f7acb3878-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.922578 master-0 kubenswrapper[29612]: I0319 12:30:00.922516 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b268a64f-8029-4f0d-92ec-de0782d81970" path="/var/lib/kubelet/pods/b268a64f-8029-4f0d-92ec-de0782d81970/volumes" Mar 19 12:30:00.981618 master-0 kubenswrapper[29612]: I0319 12:30:00.981469 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045556aa-c593-488e-b9e8-331f7acb3878-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.981860 master-0 kubenswrapper[29612]: I0319 12:30:00.981680 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-config\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.981860 master-0 kubenswrapper[29612]: I0319 12:30:00.981714 29612 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.981860 master-0 kubenswrapper[29612]: I0319 12:30:00.981764 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/045556aa-c593-488e-b9e8-331f7acb3878-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.981860 master-0 kubenswrapper[29612]: I0319 12:30:00.981825 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8899\" (UniqueName: \"kubernetes.io/projected/045556aa-c593-488e-b9e8-331f7acb3878-kube-api-access-d8899\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.981860 master-0 kubenswrapper[29612]: I0319 12:30:00.981850 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.982128 master-0 kubenswrapper[29612]: I0319 12:30:00.981920 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.982128 master-0 kubenswrapper[29612]: I0319 12:30:00.981980 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-scripts\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.982128 master-0 kubenswrapper[29612]: I0319 12:30:00.982011 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/045556aa-c593-488e-b9e8-331f7acb3878-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.982502 master-0 kubenswrapper[29612]: I0319 12:30:00.982466 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/045556aa-c593-488e-b9e8-331f7acb3878-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.986017 master-0 kubenswrapper[29612]: I0319 12:30:00.985981 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/045556aa-c593-488e-b9e8-331f7acb3878-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.990463 master-0 kubenswrapper[29612]: I0319 12:30:00.990405 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.991320 master-0 kubenswrapper[29612]: I0319 12:30:00.991271 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-scripts\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.992260 master-0 kubenswrapper[29612]: I0319 12:30:00.992223 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-config\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.993342 master-0 kubenswrapper[29612]: I0319 12:30:00.993294 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:00.993793 master-0 kubenswrapper[29612]: I0319 12:30:00.993757 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/045556aa-c593-488e-b9e8-331f7acb3878-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:01.001534 master-0 kubenswrapper[29612]: I0319 12:30:01.001445 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045556aa-c593-488e-b9e8-331f7acb3878-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:01.010895 master-0 kubenswrapper[29612]: I0319 12:30:01.010829 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8899\" (UniqueName: \"kubernetes.io/projected/045556aa-c593-488e-b9e8-331f7acb3878-kube-api-access-d8899\") pod \"ironic-inspector-0\" (UID: 
\"045556aa-c593-488e-b9e8-331f7acb3878\") " pod="openstack/ironic-inspector-0" Mar 19 12:30:01.107476 master-0 kubenswrapper[29612]: I0319 12:30:01.107373 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 19 12:30:02.160768 master-0 kubenswrapper[29612]: I0319 12:30:02.160725 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 19 12:30:02.380376 master-0 kubenswrapper[29612]: I0319 12:30:02.380227 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045556aa-c593-488e-b9e8-331f7acb3878","Type":"ContainerStarted","Data":"62f4e1748969a1c466c3f39534e958998f70674d82d91b546d78acaa3b442fad"} Mar 19 12:30:03.401668 master-0 kubenswrapper[29612]: I0319 12:30:03.401351 29612 generic.go:334] "Generic (PLEG): container finished" podID="045556aa-c593-488e-b9e8-331f7acb3878" containerID="fa98bc09e956fc4ea5464811d7d861ec54a143e501095e7e9c41758a1a465c97" exitCode=0 Mar 19 12:30:03.401668 master-0 kubenswrapper[29612]: I0319 12:30:03.401403 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045556aa-c593-488e-b9e8-331f7acb3878","Type":"ContainerDied","Data":"fa98bc09e956fc4ea5464811d7d861ec54a143e501095e7e9c41758a1a465c97"} Mar 19 12:30:04.417885 master-0 kubenswrapper[29612]: I0319 12:30:04.416529 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045556aa-c593-488e-b9e8-331f7acb3878","Type":"ContainerStarted","Data":"9d43f8d3799d969f228edff7ed12e076d4bcf9a654458e5f96a61738a9b3d06e"} Mar 19 12:30:05.428264 master-0 kubenswrapper[29612]: I0319 12:30:05.428186 29612 generic.go:334] "Generic (PLEG): container finished" podID="045556aa-c593-488e-b9e8-331f7acb3878" containerID="9d43f8d3799d969f228edff7ed12e076d4bcf9a654458e5f96a61738a9b3d06e" exitCode=0 Mar 19 12:30:05.428264 master-0 kubenswrapper[29612]: I0319 
12:30:05.428255 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045556aa-c593-488e-b9e8-331f7acb3878","Type":"ContainerDied","Data":"9d43f8d3799d969f228edff7ed12e076d4bcf9a654458e5f96a61738a9b3d06e"} Mar 19 12:30:06.454507 master-0 kubenswrapper[29612]: I0319 12:30:06.454433 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045556aa-c593-488e-b9e8-331f7acb3878","Type":"ContainerStarted","Data":"1de5c1382f36858091045174e44c1272b758fea11f25eaf67206e3f29ac5ecb9"} Mar 19 12:30:07.482388 master-0 kubenswrapper[29612]: I0319 12:30:07.482310 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045556aa-c593-488e-b9e8-331f7acb3878","Type":"ContainerStarted","Data":"21072ef76bc69d14d8880f1010a19dd452c93c7c87044518df708c54fab73a47"} Mar 19 12:30:07.482388 master-0 kubenswrapper[29612]: I0319 12:30:07.482382 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045556aa-c593-488e-b9e8-331f7acb3878","Type":"ContainerStarted","Data":"a8f35789afa6363439680f90991c1dedb3347cf8df5cf26fe07347892ce079af"} Mar 19 12:30:08.526922 master-0 kubenswrapper[29612]: I0319 12:30:08.526591 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045556aa-c593-488e-b9e8-331f7acb3878","Type":"ContainerStarted","Data":"83345b4265c3e30b3ecf87dbc8683d720e353ebaeffaf07aec509bdd7ee7b59d"} Mar 19 12:30:08.526922 master-0 kubenswrapper[29612]: I0319 12:30:08.526845 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 19 12:30:08.526922 master-0 kubenswrapper[29612]: I0319 12:30:08.526858 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" 
event={"ID":"045556aa-c593-488e-b9e8-331f7acb3878","Type":"ContainerStarted","Data":"e792017258c90f76a4f01577aee55803a71a01f7740f281046dd87fb1c3ec2b0"} Mar 19 12:30:08.526922 master-0 kubenswrapper[29612]: I0319 12:30:08.526870 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 19 12:30:08.654752 master-0 kubenswrapper[29612]: I0319 12:30:08.654608 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=8.654587265 podStartE2EDuration="8.654587265s" podCreationTimestamp="2026-03-19 12:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:30:08.651370174 +0000 UTC m=+1215.965539195" watchObservedRunningTime="2026-03-19 12:30:08.654587265 +0000 UTC m=+1215.968756296" Mar 19 12:30:11.108417 master-0 kubenswrapper[29612]: I0319 12:30:11.108335 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Mar 19 12:30:11.108417 master-0 kubenswrapper[29612]: I0319 12:30:11.108419 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 19 12:30:11.108417 master-0 kubenswrapper[29612]: I0319 12:30:11.108434 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Mar 19 12:30:11.109684 master-0 kubenswrapper[29612]: I0319 12:30:11.108448 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Mar 19 12:30:11.214152 master-0 kubenswrapper[29612]: I0319 12:30:11.214035 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Mar 19 12:30:11.219273 master-0 kubenswrapper[29612]: I0319 12:30:11.219181 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ironic-inspector-0"
Mar 19 12:30:11.223952 master-0 kubenswrapper[29612]: I0319 12:30:11.223912 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 19 12:30:11.565044 master-0 kubenswrapper[29612]: I0319 12:30:11.564975 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 19 12:30:11.567374 master-0 kubenswrapper[29612]: I0319 12:30:11.567329 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 19 12:30:12.568627 master-0 kubenswrapper[29612]: I0319 12:30:12.568578 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 19 12:30:16.644801 master-0 kubenswrapper[29612]: I0319 12:30:16.643490 29612 generic.go:334] "Generic (PLEG): container finished" podID="dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0" containerID="10e5d5ffd5c6df90a5bb673c28cc48402c45813674f49695945380ef5d9ac1cd" exitCode=0
Mar 19 12:30:16.644801 master-0 kubenswrapper[29612]: I0319 12:30:16.643550 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2td5k" event={"ID":"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0","Type":"ContainerDied","Data":"10e5d5ffd5c6df90a5bb673c28cc48402c45813674f49695945380ef5d9ac1cd"}
Mar 19 12:30:18.119629 master-0 kubenswrapper[29612]: I0319 12:30:18.119582 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:30:18.264502 master-0 kubenswrapper[29612]: I0319 12:30:18.264443 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-scripts\") pod \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") "
Mar 19 12:30:18.264764 master-0 kubenswrapper[29612]: I0319 12:30:18.264605 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zmjr\" (UniqueName: \"kubernetes.io/projected/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-kube-api-access-9zmjr\") pod \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") "
Mar 19 12:30:18.264764 master-0 kubenswrapper[29612]: I0319 12:30:18.264746 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-combined-ca-bundle\") pod \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") "
Mar 19 12:30:18.264867 master-0 kubenswrapper[29612]: I0319 12:30:18.264782 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-config-data\") pod \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\" (UID: \"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0\") "
Mar 19 12:30:18.267836 master-0 kubenswrapper[29612]: I0319 12:30:18.267796 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-kube-api-access-9zmjr" (OuterVolumeSpecName: "kube-api-access-9zmjr") pod "dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0" (UID: "dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0"). InnerVolumeSpecName "kube-api-access-9zmjr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:30:18.269706 master-0 kubenswrapper[29612]: I0319 12:30:18.269548 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-scripts" (OuterVolumeSpecName: "scripts") pod "dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0" (UID: "dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:30:18.298726 master-0 kubenswrapper[29612]: I0319 12:30:18.296949 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0" (UID: "dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:30:18.298726 master-0 kubenswrapper[29612]: I0319 12:30:18.298392 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-config-data" (OuterVolumeSpecName: "config-data") pod "dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0" (UID: "dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:30:18.367670 master-0 kubenswrapper[29612]: I0319 12:30:18.367597 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-scripts\") on node \"master-0\" DevicePath \"\""
Mar 19 12:30:18.367821 master-0 kubenswrapper[29612]: I0319 12:30:18.367731 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zmjr\" (UniqueName: \"kubernetes.io/projected/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-kube-api-access-9zmjr\") on node \"master-0\" DevicePath \"\""
Mar 19 12:30:18.367821 master-0 kubenswrapper[29612]: I0319 12:30:18.367753 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 19 12:30:18.367821 master-0 kubenswrapper[29612]: I0319 12:30:18.367765 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0-config-data\") on node \"master-0\" DevicePath \"\""
Mar 19 12:30:18.673746 master-0 kubenswrapper[29612]: I0319 12:30:18.673607 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-2td5k" event={"ID":"dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0","Type":"ContainerDied","Data":"9912b2219de29456ea81cbf3ed721019eda1e07a4fa85fd6c70aae696de0b11d"}
Mar 19 12:30:18.673746 master-0 kubenswrapper[29612]: I0319 12:30:18.673748 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9912b2219de29456ea81cbf3ed721019eda1e07a4fa85fd6c70aae696de0b11d"
Mar 19 12:30:18.673746 master-0 kubenswrapper[29612]: I0319 12:30:18.673681 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-2td5k"
Mar 19 12:30:18.962844 master-0 kubenswrapper[29612]: I0319 12:30:18.962712 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 19 12:30:18.963352 master-0 kubenswrapper[29612]: E0319 12:30:18.963321 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0" containerName="nova-cell0-conductor-db-sync"
Mar 19 12:30:18.963352 master-0 kubenswrapper[29612]: I0319 12:30:18.963346 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0" containerName="nova-cell0-conductor-db-sync"
Mar 19 12:30:18.963628 master-0 kubenswrapper[29612]: I0319 12:30:18.963596 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0" containerName="nova-cell0-conductor-db-sync"
Mar 19 12:30:18.964500 master-0 kubenswrapper[29612]: I0319 12:30:18.964471 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:18.966820 master-0 kubenswrapper[29612]: I0319 12:30:18.966783 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 19 12:30:18.977585 master-0 kubenswrapper[29612]: I0319 12:30:18.977522 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 19 12:30:19.087968 master-0 kubenswrapper[29612]: I0319 12:30:19.087907 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e5ed9c-6439-41a9-8970-d3adeb5d990c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e5e5ed9c-6439-41a9-8970-d3adeb5d990c\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:19.088178 master-0 kubenswrapper[29612]: I0319 12:30:19.088018 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e5ed9c-6439-41a9-8970-d3adeb5d990c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e5e5ed9c-6439-41a9-8970-d3adeb5d990c\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:19.088223 master-0 kubenswrapper[29612]: I0319 12:30:19.088184 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgtl6\" (UniqueName: \"kubernetes.io/projected/e5e5ed9c-6439-41a9-8970-d3adeb5d990c-kube-api-access-lgtl6\") pod \"nova-cell0-conductor-0\" (UID: \"e5e5ed9c-6439-41a9-8970-d3adeb5d990c\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:19.189988 master-0 kubenswrapper[29612]: I0319 12:30:19.189911 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e5ed9c-6439-41a9-8970-d3adeb5d990c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e5e5ed9c-6439-41a9-8970-d3adeb5d990c\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:19.190543 master-0 kubenswrapper[29612]: I0319 12:30:19.190006 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e5ed9c-6439-41a9-8970-d3adeb5d990c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e5e5ed9c-6439-41a9-8970-d3adeb5d990c\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:19.190543 master-0 kubenswrapper[29612]: I0319 12:30:19.190108 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgtl6\" (UniqueName: \"kubernetes.io/projected/e5e5ed9c-6439-41a9-8970-d3adeb5d990c-kube-api-access-lgtl6\") pod \"nova-cell0-conductor-0\" (UID: \"e5e5ed9c-6439-41a9-8970-d3adeb5d990c\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:19.202357 master-0 kubenswrapper[29612]: I0319 12:30:19.202299 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5e5ed9c-6439-41a9-8970-d3adeb5d990c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e5e5ed9c-6439-41a9-8970-d3adeb5d990c\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:19.202580 master-0 kubenswrapper[29612]: I0319 12:30:19.202388 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5e5ed9c-6439-41a9-8970-d3adeb5d990c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e5e5ed9c-6439-41a9-8970-d3adeb5d990c\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:19.214145 master-0 kubenswrapper[29612]: I0319 12:30:19.211846 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgtl6\" (UniqueName: \"kubernetes.io/projected/e5e5ed9c-6439-41a9-8970-d3adeb5d990c-kube-api-access-lgtl6\") pod \"nova-cell0-conductor-0\" (UID: \"e5e5ed9c-6439-41a9-8970-d3adeb5d990c\") " pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:19.284360 master-0 kubenswrapper[29612]: I0319 12:30:19.284293 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:20.137002 master-0 kubenswrapper[29612]: I0319 12:30:20.136935 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 19 12:30:20.697533 master-0 kubenswrapper[29612]: I0319 12:30:20.697481 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e5e5ed9c-6439-41a9-8970-d3adeb5d990c","Type":"ContainerStarted","Data":"ddc58b95a6c14ed0c9f9bac41e308fecda1abd3ebc7dcd1887a55cee6d64a977"}
Mar 19 12:30:20.697533 master-0 kubenswrapper[29612]: I0319 12:30:20.697537 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e5e5ed9c-6439-41a9-8970-d3adeb5d990c","Type":"ContainerStarted","Data":"a7024030ca0afd953a7c23b6ca26e248f088e78f9a54221aac590c0cc4cdcac2"}
Mar 19 12:30:20.698976 master-0 kubenswrapper[29612]: I0319 12:30:20.698950 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:20.726654 master-0 kubenswrapper[29612]: I0319 12:30:20.726293 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.726271792 podStartE2EDuration="2.726271792s" podCreationTimestamp="2026-03-19 12:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:30:20.719728506 +0000 UTC m=+1228.033897537" watchObservedRunningTime="2026-03-19 12:30:20.726271792 +0000 UTC m=+1228.040440823"
Mar 19 12:30:29.346103 master-0 kubenswrapper[29612]: I0319 12:30:29.345947 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 19 12:30:29.979334 master-0 kubenswrapper[29612]: I0319 12:30:29.979273 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-82rhf"]
Mar 19 12:30:29.981391 master-0 kubenswrapper[29612]: I0319 12:30:29.981339 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:29.984092 master-0 kubenswrapper[29612]: I0319 12:30:29.984043 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 19 12:30:29.984784 master-0 kubenswrapper[29612]: I0319 12:30:29.984748 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 19 12:30:30.003793 master-0 kubenswrapper[29612]: I0319 12:30:30.003720 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-82rhf"]
Mar 19 12:30:30.077116 master-0 kubenswrapper[29612]: I0319 12:30:30.072072 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-scripts\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.077116 master-0 kubenswrapper[29612]: I0319 12:30:30.072175 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-config-data\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.083102 master-0 kubenswrapper[29612]: I0319 12:30:30.082150 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.083102 master-0 kubenswrapper[29612]: I0319 12:30:30.082216 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kwr7\" (UniqueName: \"kubernetes.io/projected/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-kube-api-access-9kwr7\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.196301 master-0 kubenswrapper[29612]: I0319 12:30:30.196118 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-config-data\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.196301 master-0 kubenswrapper[29612]: I0319 12:30:30.196272 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.196301 master-0 kubenswrapper[29612]: I0319 12:30:30.196297 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kwr7\" (UniqueName: \"kubernetes.io/projected/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-kube-api-access-9kwr7\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.196690 master-0 kubenswrapper[29612]: I0319 12:30:30.196402 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-scripts\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.202446 master-0 kubenswrapper[29612]: I0319 12:30:30.202364 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"]
Mar 19 12:30:30.206507 master-0 kubenswrapper[29612]: I0319 12:30:30.205620 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.219384 master-0 kubenswrapper[29612]: I0319 12:30:30.218419 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-ironic-compute-config-data"
Mar 19 12:30:30.219384 master-0 kubenswrapper[29612]: I0319 12:30:30.218718 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-config-data\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.224728 master-0 kubenswrapper[29612]: I0319 12:30:30.221223 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.224728 master-0 kubenswrapper[29612]: I0319 12:30:30.221540 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-scripts\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.232488 master-0 kubenswrapper[29612]: I0319 12:30:30.232446 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kwr7\" (UniqueName: \"kubernetes.io/projected/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-kube-api-access-9kwr7\") pod \"nova-cell0-cell-mapping-82rhf\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.327748 master-0 kubenswrapper[29612]: I0319 12:30:30.322907 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-82rhf"
Mar 19 12:30:30.335874 master-0 kubenswrapper[29612]: I0319 12:30:30.332067 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"]
Mar 19 12:30:30.412000 master-0 kubenswrapper[29612]: I0319 12:30:30.411025 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46676b89-9cf1-4ceb-9feb-af82e8442ccb-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"46676b89-9cf1-4ceb-9feb-af82e8442ccb\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.412000 master-0 kubenswrapper[29612]: I0319 12:30:30.411121 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lz55\" (UniqueName: \"kubernetes.io/projected/46676b89-9cf1-4ceb-9feb-af82e8442ccb-kube-api-access-9lz55\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"46676b89-9cf1-4ceb-9feb-af82e8442ccb\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.412000 master-0 kubenswrapper[29612]: I0319 12:30:30.411282 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46676b89-9cf1-4ceb-9feb-af82e8442ccb-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"46676b89-9cf1-4ceb-9feb-af82e8442ccb\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.513418 master-0 kubenswrapper[29612]: I0319 12:30:30.513355 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46676b89-9cf1-4ceb-9feb-af82e8442ccb-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"46676b89-9cf1-4ceb-9feb-af82e8442ccb\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.513682 master-0 kubenswrapper[29612]: I0319 12:30:30.513448 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lz55\" (UniqueName: \"kubernetes.io/projected/46676b89-9cf1-4ceb-9feb-af82e8442ccb-kube-api-access-9lz55\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"46676b89-9cf1-4ceb-9feb-af82e8442ccb\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.513682 master-0 kubenswrapper[29612]: I0319 12:30:30.513574 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46676b89-9cf1-4ceb-9feb-af82e8442ccb-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"46676b89-9cf1-4ceb-9feb-af82e8442ccb\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.520742 master-0 kubenswrapper[29612]: I0319 12:30:30.520685 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46676b89-9cf1-4ceb-9feb-af82e8442ccb-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"46676b89-9cf1-4ceb-9feb-af82e8442ccb\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.548966 master-0 kubenswrapper[29612]: I0319 12:30:30.548923 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46676b89-9cf1-4ceb-9feb-af82e8442ccb-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"46676b89-9cf1-4ceb-9feb-af82e8442ccb\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.556246 master-0 kubenswrapper[29612]: I0319 12:30:30.556072 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lz55\" (UniqueName: \"kubernetes.io/projected/46676b89-9cf1-4ceb-9feb-af82e8442ccb-kube-api-access-9lz55\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"46676b89-9cf1-4ceb-9feb-af82e8442ccb\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.588544 master-0 kubenswrapper[29612]: I0319 12:30:30.585608 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 19 12:30:30.588544 master-0 kubenswrapper[29612]: I0319 12:30:30.587807 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 19 12:30:30.620176 master-0 kubenswrapper[29612]: I0319 12:30:30.619926 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 19 12:30:30.642873 master-0 kubenswrapper[29612]: I0319 12:30:30.638061 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 19 12:30:30.642873 master-0 kubenswrapper[29612]: I0319 12:30:30.638227 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 12:30:30.642873 master-0 kubenswrapper[29612]: I0319 12:30:30.640199 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 19 12:30:30.642873 master-0 kubenswrapper[29612]: I0319 12:30:30.642473 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 19 12:30:30.662771 master-0 kubenswrapper[29612]: I0319 12:30:30.662716 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 19 12:30:30.666806 master-0 kubenswrapper[29612]: I0319 12:30:30.666144 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 19 12:30:30.676934 master-0 kubenswrapper[29612]: I0319 12:30:30.675045 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 19 12:30:30.678264 master-0 kubenswrapper[29612]: I0319 12:30:30.678232 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 19 12:30:30.712082 master-0 kubenswrapper[29612]: I0319 12:30:30.707119 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 19 12:30:30.728791 master-0 kubenswrapper[29612]: I0319 12:30:30.728570 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlwrk\" (UniqueName: \"kubernetes.io/projected/f0982f57-e690-4df2-b3fe-23db650e0014-kube-api-access-qlwrk\") pod \"nova-scheduler-0\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " pod="openstack/nova-scheduler-0"
Mar 19 12:30:30.731620 master-0 kubenswrapper[29612]: I0319 12:30:30.731015 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b03fbb8c-07ca-4577-b654-25a4d25fab85-logs\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0"
Mar 19 12:30:30.735875 master-0 kubenswrapper[29612]: I0319 12:30:30.735733 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-config-data\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0"
Mar 19 12:30:30.735875 master-0 kubenswrapper[29612]: I0319 12:30:30.735795 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4b1340-dfba-4400-8dbe-452dc3de07c9-logs\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0"
Mar 19 12:30:30.735875 master-0 kubenswrapper[29612]: I0319 12:30:30.735868 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wwcp\" (UniqueName: \"kubernetes.io/projected/b03fbb8c-07ca-4577-b654-25a4d25fab85-kube-api-access-6wwcp\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0"
Mar 19 12:30:30.736047 master-0 kubenswrapper[29612]: I0319 12:30:30.735921 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " pod="openstack/nova-scheduler-0"
Mar 19 12:30:30.736047 master-0 kubenswrapper[29612]: I0319 12:30:30.736042 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0"
Mar 19 12:30:30.736120 master-0 kubenswrapper[29612]: I0319 12:30:30.736061 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh2gj\" (UniqueName: \"kubernetes.io/projected/df4b1340-dfba-4400-8dbe-452dc3de07c9-kube-api-access-nh2gj\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0"
Mar 19 12:30:30.736120 master-0 kubenswrapper[29612]: I0319 12:30:30.736088 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0"
Mar 19 12:30:30.737406 master-0 kubenswrapper[29612]: I0319 12:30:30.736235 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-config-data\") pod \"nova-scheduler-0\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " pod="openstack/nova-scheduler-0"
Mar 19 12:30:30.737406 master-0 kubenswrapper[29612]: I0319 12:30:30.736279 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-config-data\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0"
Mar 19 12:30:30.737406 master-0 kubenswrapper[29612]: I0319 12:30:30.731935 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58cf5bfdcc-55c55"]
Mar 19 12:30:30.742586 master-0 kubenswrapper[29612]: I0319 12:30:30.742058 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55"
Mar 19 12:30:30.758768 master-0 kubenswrapper[29612]: I0319 12:30:30.751363 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58cf5bfdcc-55c55"]
Mar 19 12:30:30.762418 master-0 kubenswrapper[29612]: I0319 12:30:30.762372 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 19 12:30:30.767778 master-0 kubenswrapper[29612]: I0319 12:30:30.767738 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 19 12:30:30.771027 master-0 kubenswrapper[29612]: I0319 12:30:30.769367 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 19 12:30:30.776038 master-0 kubenswrapper[29612]: I0319 12:30:30.775273 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 19 12:30:30.815653 master-0 kubenswrapper[29612]: I0319 12:30:30.813227 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839110 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839217 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b03fbb8c-07ca-4577-b654-25a4d25fab85-logs\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839254 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-swift-storage-0\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839286 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-config-data\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839319 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4b1340-dfba-4400-8dbe-452dc3de07c9-logs\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839372 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wwcp\" (UniqueName: \"kubernetes.io/projected/b03fbb8c-07ca-4577-b654-25a4d25fab85-kube-api-access-6wwcp\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839423 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " pod="openstack/nova-scheduler-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839513 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839547 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839575 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh2gj\" (UniqueName: \"kubernetes.io/projected/df4b1340-dfba-4400-8dbe-452dc3de07c9-kube-api-access-nh2gj\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839603 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-nb\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839652 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839707 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-svc\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839796 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-sb\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839836 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-config-data\") pod \"nova-scheduler-0\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " pod="openstack/nova-scheduler-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839877 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-config-data\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.839903 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlw27\" (UniqueName: \"kubernetes.io/projected/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-kube-api-access-mlw27\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.841410 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-config\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.841495 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlwrk\" (UniqueName: \"kubernetes.io/projected/f0982f57-e690-4df2-b3fe-23db650e0014-kube-api-access-qlwrk\") pod \"nova-scheduler-0\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " pod="openstack/nova-scheduler-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.841500 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b03fbb8c-07ca-4577-b654-25a4d25fab85-logs\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0"
Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.843486 29612 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt2cw\" (UniqueName: \"kubernetes.io/projected/8b676207-2926-4cb6-8f24-14859eee1d2f-kube-api-access-jt2cw\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.843938 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4b1340-dfba-4400-8dbe-452dc3de07c9-logs\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0" Mar 19 12:30:30.849720 master-0 kubenswrapper[29612]: I0319 12:30:30.848758 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0" Mar 19 12:30:30.860757 master-0 kubenswrapper[29612]: I0319 12:30:30.851265 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-config-data\") pod \"nova-scheduler-0\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:30.860757 master-0 kubenswrapper[29612]: I0319 12:30:30.851560 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-config-data\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0" Mar 19 12:30:30.860757 master-0 kubenswrapper[29612]: I0319 12:30:30.854034 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-config-data\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0" Mar 19 12:30:30.860757 master-0 kubenswrapper[29612]: I0319 12:30:30.860605 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlwrk\" (UniqueName: \"kubernetes.io/projected/f0982f57-e690-4df2-b3fe-23db650e0014-kube-api-access-qlwrk\") pod \"nova-scheduler-0\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:30.866260 master-0 kubenswrapper[29612]: I0319 12:30:30.866221 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh2gj\" (UniqueName: \"kubernetes.io/projected/df4b1340-dfba-4400-8dbe-452dc3de07c9-kube-api-access-nh2gj\") pod \"nova-metadata-0\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " pod="openstack/nova-metadata-0" Mar 19 12:30:30.866571 master-0 kubenswrapper[29612]: I0319 12:30:30.866530 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0" Mar 19 12:30:30.866614 master-0 kubenswrapper[29612]: I0319 12:30:30.866547 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wwcp\" (UniqueName: \"kubernetes.io/projected/b03fbb8c-07ca-4577-b654-25a4d25fab85-kube-api-access-6wwcp\") pod \"nova-api-0\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " pod="openstack/nova-api-0" Mar 19 12:30:30.867799 master-0 kubenswrapper[29612]: I0319 12:30:30.867476 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"f0982f57-e690-4df2-b3fe-23db650e0014\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:30.949751 master-0 kubenswrapper[29612]: I0319 12:30:30.949702 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:30:30.949751 master-0 kubenswrapper[29612]: I0319 12:30:30.949753 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-nb\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.949994 master-0 kubenswrapper[29612]: I0319 12:30:30.949791 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-svc\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.949994 master-0 kubenswrapper[29612]: I0319 12:30:30.949840 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-sb\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.949994 master-0 kubenswrapper[29612]: I0319 12:30:30.949883 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlw27\" (UniqueName: \"kubernetes.io/projected/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-kube-api-access-mlw27\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: 
\"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.949994 master-0 kubenswrapper[29612]: I0319 12:30:30.949922 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-config\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.949994 master-0 kubenswrapper[29612]: I0319 12:30:30.949988 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt2cw\" (UniqueName: \"kubernetes.io/projected/8b676207-2926-4cb6-8f24-14859eee1d2f-kube-api-access-jt2cw\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:30:30.950191 master-0 kubenswrapper[29612]: I0319 12:30:30.950010 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:30:30.950191 master-0 kubenswrapper[29612]: I0319 12:30:30.950033 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-swift-storage-0\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.950988 master-0 kubenswrapper[29612]: I0319 12:30:30.950952 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-swift-storage-0\") pod 
\"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.951208 master-0 kubenswrapper[29612]: I0319 12:30:30.951170 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-sb\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.953884 master-0 kubenswrapper[29612]: I0319 12:30:30.953849 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-nb\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.953987 master-0 kubenswrapper[29612]: I0319 12:30:30.953963 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-config\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.956013 master-0 kubenswrapper[29612]: I0319 12:30:30.955951 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:30:30.961036 master-0 kubenswrapper[29612]: I0319 12:30:30.957445 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"8b676207-2926-4cb6-8f24-14859eee1d2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:30:30.974702 master-0 kubenswrapper[29612]: I0319 12:30:30.974301 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-svc\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.980792 master-0 kubenswrapper[29612]: I0319 12:30:30.980759 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt2cw\" (UniqueName: \"kubernetes.io/projected/8b676207-2926-4cb6-8f24-14859eee1d2f-kube-api-access-jt2cw\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:30:30.981525 master-0 kubenswrapper[29612]: I0319 12:30:30.981480 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlw27\" (UniqueName: \"kubernetes.io/projected/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-kube-api-access-mlw27\") pod \"dnsmasq-dns-58cf5bfdcc-55c55\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:30.988933 master-0 kubenswrapper[29612]: I0319 12:30:30.988883 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:30:31.029672 master-0 kubenswrapper[29612]: I0319 12:30:31.026504 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 12:30:31.047687 master-0 kubenswrapper[29612]: I0319 12:30:31.045509 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 12:30:31.110940 master-0 kubenswrapper[29612]: I0319 12:30:31.104176 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:31.110940 master-0 kubenswrapper[29612]: I0319 12:30:31.110748 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:30:31.160592 master-0 kubenswrapper[29612]: I0319 12:30:31.158339 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-82rhf"] Mar 19 12:30:31.394063 master-0 kubenswrapper[29612]: I0319 12:30:31.393740 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bb8d6"] Mar 19 12:30:31.400814 master-0 kubenswrapper[29612]: I0319 12:30:31.400660 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.412852 master-0 kubenswrapper[29612]: I0319 12:30:31.408797 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 19 12:30:31.455409 master-0 kubenswrapper[29612]: I0319 12:30:31.442980 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 12:30:31.467747 master-0 kubenswrapper[29612]: I0319 12:30:31.460802 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bb8d6"] Mar 19 12:30:31.467747 master-0 kubenswrapper[29612]: W0319 12:30:31.463732 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46676b89_9cf1_4ceb_9feb_af82e8442ccb.slice/crio-87a97fee929040ffb58fc27f08454a1dc88d7ea11eea8eb28453bff487a79d74 WatchSource:0}: Error finding container 87a97fee929040ffb58fc27f08454a1dc88d7ea11eea8eb28453bff487a79d74: Status 404 returned error can't find the container with id 87a97fee929040ffb58fc27f08454a1dc88d7ea11eea8eb28453bff487a79d74 Mar 19 12:30:31.481892 master-0 kubenswrapper[29612]: 
I0319 12:30:31.474029 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 19 12:30:31.504812 master-0 kubenswrapper[29612]: I0319 12:30:31.504532 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.504812 master-0 kubenswrapper[29612]: I0319 12:30:31.504574 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c64fd\" (UniqueName: \"kubernetes.io/projected/f3d33dc3-9f8c-4950-98a3-532280705296-kube-api-access-c64fd\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.504812 master-0 kubenswrapper[29612]: I0319 12:30:31.504687 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-config-data\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.505119 master-0 kubenswrapper[29612]: I0319 12:30:31.504819 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-scripts\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.629481 master-0 kubenswrapper[29612]: I0319 12:30:31.623168 29612 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-scripts\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.629481 master-0 kubenswrapper[29612]: I0319 12:30:31.623246 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.629481 master-0 kubenswrapper[29612]: I0319 12:30:31.623279 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c64fd\" (UniqueName: \"kubernetes.io/projected/f3d33dc3-9f8c-4950-98a3-532280705296-kube-api-access-c64fd\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.629481 master-0 kubenswrapper[29612]: I0319 12:30:31.623364 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-config-data\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.667685 master-0 kubenswrapper[29612]: I0319 12:30:31.665397 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-scripts\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.675695 master-0 kubenswrapper[29612]: I0319 12:30:31.673442 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-config-data\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.691690 master-0 kubenswrapper[29612]: I0319 12:30:31.676354 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.691690 master-0 kubenswrapper[29612]: I0319 12:30:31.677039 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:30:31.691690 master-0 kubenswrapper[29612]: I0319 12:30:31.689458 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c64fd\" (UniqueName: \"kubernetes.io/projected/f3d33dc3-9f8c-4950-98a3-532280705296-kube-api-access-c64fd\") pod \"nova-cell1-conductor-db-sync-bb8d6\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.878059 master-0 kubenswrapper[29612]: I0319 12:30:31.850731 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:31.913742 master-0 kubenswrapper[29612]: I0319 12:30:31.913660 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-82rhf" event={"ID":"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba","Type":"ContainerStarted","Data":"fd6afa38d5c8ae84f5753bce341ccbe92ccfbc5eff76c0901c09849137965cfe"} Mar 19 12:30:31.913742 master-0 kubenswrapper[29612]: I0319 12:30:31.913720 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"46676b89-9cf1-4ceb-9feb-af82e8442ccb","Type":"ContainerStarted","Data":"87a97fee929040ffb58fc27f08454a1dc88d7ea11eea8eb28453bff487a79d74"} Mar 19 12:30:31.918333 master-0 kubenswrapper[29612]: I0319 12:30:31.918278 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b03fbb8c-07ca-4577-b654-25a4d25fab85","Type":"ContainerStarted","Data":"e74f59a077c406b8b44207f998c271129ec4790e51f1132fdf95c62e055f65b0"} Mar 19 12:30:32.002476 master-0 kubenswrapper[29612]: I0319 12:30:32.000491 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:30:32.200364 master-0 kubenswrapper[29612]: I0319 12:30:32.200310 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58cf5bfdcc-55c55"] Mar 19 12:30:32.373920 master-0 kubenswrapper[29612]: I0319 12:30:32.373855 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:30:32.392205 master-0 kubenswrapper[29612]: I0319 12:30:32.392162 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 12:30:32.395838 master-0 kubenswrapper[29612]: W0319 12:30:32.395226 29612 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b676207_2926_4cb6_8f24_14859eee1d2f.slice/crio-ceecd53084343b4b82dc1f26d6c592bbdc27b30b0ffa93d66d252f9a0c3d7fe8 WatchSource:0}: Error finding container ceecd53084343b4b82dc1f26d6c592bbdc27b30b0ffa93d66d252f9a0c3d7fe8: Status 404 returned error can't find the container with id ceecd53084343b4b82dc1f26d6c592bbdc27b30b0ffa93d66d252f9a0c3d7fe8 Mar 19 12:30:32.398416 master-0 kubenswrapper[29612]: W0319 12:30:32.398356 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0982f57_e690_4df2_b3fe_23db650e0014.slice/crio-e4a8491e13534c7abdf9f7342db8e5ca751b272572e7c0c25799966e7f4eb3ed WatchSource:0}: Error finding container e4a8491e13534c7abdf9f7342db8e5ca751b272572e7c0c25799966e7f4eb3ed: Status 404 returned error can't find the container with id e4a8491e13534c7abdf9f7342db8e5ca751b272572e7c0c25799966e7f4eb3ed Mar 19 12:30:32.685326 master-0 kubenswrapper[29612]: I0319 12:30:32.684425 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bb8d6"] Mar 19 12:30:32.697775 master-0 kubenswrapper[29612]: W0319 12:30:32.697714 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3d33dc3_9f8c_4950_98a3_532280705296.slice/crio-a0a342fc3dd3693d736355da89628560684f31175747d0c2b8477d3353f9c04f WatchSource:0}: Error finding container a0a342fc3dd3693d736355da89628560684f31175747d0c2b8477d3353f9c04f: Status 404 returned error can't find the container with id a0a342fc3dd3693d736355da89628560684f31175747d0c2b8477d3353f9c04f Mar 19 12:30:32.982685 master-0 kubenswrapper[29612]: I0319 12:30:32.980807 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"8b676207-2926-4cb6-8f24-14859eee1d2f","Type":"ContainerStarted","Data":"ceecd53084343b4b82dc1f26d6c592bbdc27b30b0ffa93d66d252f9a0c3d7fe8"} Mar 19 12:30:32.988661 master-0 kubenswrapper[29612]: I0319 12:30:32.986206 29612 generic.go:334] "Generic (PLEG): container finished" podID="a986f3f0-42d2-4d2b-a898-03c5b9f30c92" containerID="9fc1ef0a5cb2601acf3fb337bbaa5ec7875121dd63ec86a1dc5678e0992fa751" exitCode=0 Mar 19 12:30:32.988661 master-0 kubenswrapper[29612]: I0319 12:30:32.986290 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" event={"ID":"a986f3f0-42d2-4d2b-a898-03c5b9f30c92","Type":"ContainerDied","Data":"9fc1ef0a5cb2601acf3fb337bbaa5ec7875121dd63ec86a1dc5678e0992fa751"} Mar 19 12:30:32.988661 master-0 kubenswrapper[29612]: I0319 12:30:32.986324 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" event={"ID":"a986f3f0-42d2-4d2b-a898-03c5b9f30c92","Type":"ContainerStarted","Data":"149d48310e87f471e2ddc91a31d86f96be0d49019ef581b091f349091dc5ab41"} Mar 19 12:30:32.996662 master-0 kubenswrapper[29612]: I0319 12:30:32.993356 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f0982f57-e690-4df2-b3fe-23db650e0014","Type":"ContainerStarted","Data":"e4a8491e13534c7abdf9f7342db8e5ca751b272572e7c0c25799966e7f4eb3ed"} Mar 19 12:30:32.999693 master-0 kubenswrapper[29612]: I0319 12:30:32.998729 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df4b1340-dfba-4400-8dbe-452dc3de07c9","Type":"ContainerStarted","Data":"344496cc0876bee32e78de20cebcf99195e9d582588bf32a32218fa93483e024"} Mar 19 12:30:33.001984 master-0 kubenswrapper[29612]: I0319 12:30:33.001845 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-82rhf" 
event={"ID":"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba","Type":"ContainerStarted","Data":"8b6a726e51cb2d6fbcb7d9ab78407799f04167d090495c9fc9c5e36c32d0e732"} Mar 19 12:30:33.003818 master-0 kubenswrapper[29612]: I0319 12:30:33.003775 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bb8d6" event={"ID":"f3d33dc3-9f8c-4950-98a3-532280705296","Type":"ContainerStarted","Data":"a0a342fc3dd3693d736355da89628560684f31175747d0c2b8477d3353f9c04f"} Mar 19 12:30:33.471683 master-0 kubenswrapper[29612]: I0319 12:30:33.471564 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-82rhf" podStartSLOduration=4.47154525 podStartE2EDuration="4.47154525s" podCreationTimestamp="2026-03-19 12:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:30:33.340378952 +0000 UTC m=+1240.654547993" watchObservedRunningTime="2026-03-19 12:30:33.47154525 +0000 UTC m=+1240.785714271" Mar 19 12:30:34.064665 master-0 kubenswrapper[29612]: I0319 12:30:34.056156 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bb8d6" event={"ID":"f3d33dc3-9f8c-4950-98a3-532280705296","Type":"ContainerStarted","Data":"5fea02ff4fe0c8ee3f2cbe4c19c13b185f5919138864bb69844b4c446fac5556"} Mar 19 12:30:34.074912 master-0 kubenswrapper[29612]: I0319 12:30:34.072734 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" event={"ID":"a986f3f0-42d2-4d2b-a898-03c5b9f30c92","Type":"ContainerStarted","Data":"530b35a7818a248894bc17f8526f98e304a28482d0d84c9c4db2a2634e3cfc6b"} Mar 19 12:30:34.074912 master-0 kubenswrapper[29612]: I0319 12:30:34.072831 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:34.101231 master-0 kubenswrapper[29612]: I0319 12:30:34.101172 
29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:30:34.136785 master-0 kubenswrapper[29612]: I0319 12:30:34.136713 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 12:30:34.141409 master-0 kubenswrapper[29612]: I0319 12:30:34.141337 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bb8d6" podStartSLOduration=3.141318781 podStartE2EDuration="3.141318781s" podCreationTimestamp="2026-03-19 12:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:30:34.093202244 +0000 UTC m=+1241.407371275" watchObservedRunningTime="2026-03-19 12:30:34.141318781 +0000 UTC m=+1241.455487802" Mar 19 12:30:34.147528 master-0 kubenswrapper[29612]: I0319 12:30:34.147139 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" podStartSLOduration=4.147118916 podStartE2EDuration="4.147118916s" podCreationTimestamp="2026-03-19 12:30:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:30:34.117260918 +0000 UTC m=+1241.431429939" watchObservedRunningTime="2026-03-19 12:30:34.147118916 +0000 UTC m=+1241.461287947" Mar 19 12:30:37.144572 master-0 kubenswrapper[29612]: I0319 12:30:37.144510 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f0982f57-e690-4df2-b3fe-23db650e0014","Type":"ContainerStarted","Data":"0b9a9ca762a0e7e62a37160c08655fe2847e191229eb3bd61743bbdf1c0caad0"} Mar 19 12:30:37.157860 master-0 kubenswrapper[29612]: I0319 12:30:37.157787 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b03fbb8c-07ca-4577-b654-25a4d25fab85","Type":"ContainerStarted","Data":"499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a"} Mar 19 12:30:37.158017 master-0 kubenswrapper[29612]: I0319 12:30:37.157864 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b03fbb8c-07ca-4577-b654-25a4d25fab85","Type":"ContainerStarted","Data":"c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe"} Mar 19 12:30:37.176368 master-0 kubenswrapper[29612]: I0319 12:30:37.176297 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df4b1340-dfba-4400-8dbe-452dc3de07c9","Type":"ContainerStarted","Data":"2f7337e851369b1c5f42d5d6eb94e324968de5af0f17226f45b89236d33a1bf0"} Mar 19 12:30:37.176368 master-0 kubenswrapper[29612]: I0319 12:30:37.176367 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df4b1340-dfba-4400-8dbe-452dc3de07c9","Type":"ContainerStarted","Data":"05a46a57e057a1a613f08be378ad64d6ca4bc9a2a6c8882b1cd77d4c3a46651d"} Mar 19 12:30:37.176628 master-0 kubenswrapper[29612]: I0319 12:30:37.176528 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="df4b1340-dfba-4400-8dbe-452dc3de07c9" containerName="nova-metadata-log" containerID="cri-o://05a46a57e057a1a613f08be378ad64d6ca4bc9a2a6c8882b1cd77d4c3a46651d" gracePeriod=30 Mar 19 12:30:37.176756 master-0 kubenswrapper[29612]: I0319 12:30:37.176704 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="df4b1340-dfba-4400-8dbe-452dc3de07c9" containerName="nova-metadata-metadata" containerID="cri-o://2f7337e851369b1c5f42d5d6eb94e324968de5af0f17226f45b89236d33a1bf0" gracePeriod=30 Mar 19 12:30:37.180416 master-0 kubenswrapper[29612]: I0319 12:30:37.180237 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b676207-2926-4cb6-8f24-14859eee1d2f","Type":"ContainerStarted","Data":"72547e021b6a18944aa444e1fc1d70a6067ef6defd511fcb32e780050b71ce9a"} Mar 19 12:30:37.180416 master-0 kubenswrapper[29612]: I0319 12:30:37.180378 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8b676207-2926-4cb6-8f24-14859eee1d2f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://72547e021b6a18944aa444e1fc1d70a6067ef6defd511fcb32e780050b71ce9a" gracePeriod=30 Mar 19 12:30:37.539093 master-0 kubenswrapper[29612]: I0319 12:30:37.533560 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.786677154 podStartE2EDuration="7.533541927s" podCreationTimestamp="2026-03-19 12:30:30 +0000 UTC" firstStartedPulling="2026-03-19 12:30:32.40070817 +0000 UTC m=+1239.714877191" lastFinishedPulling="2026-03-19 12:30:36.147572943 +0000 UTC m=+1243.461741964" observedRunningTime="2026-03-19 12:30:37.518758527 +0000 UTC m=+1244.832927568" watchObservedRunningTime="2026-03-19 12:30:37.533541927 +0000 UTC m=+1244.847710948" Mar 19 12:30:37.556715 master-0 kubenswrapper[29612]: I0319 12:30:37.556626 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.405476541 podStartE2EDuration="7.556606642s" podCreationTimestamp="2026-03-19 12:30:30 +0000 UTC" firstStartedPulling="2026-03-19 12:30:31.993831128 +0000 UTC m=+1239.308000149" lastFinishedPulling="2026-03-19 12:30:36.144961229 +0000 UTC m=+1243.459130250" observedRunningTime="2026-03-19 12:30:37.540388721 +0000 UTC m=+1244.854557762" watchObservedRunningTime="2026-03-19 12:30:37.556606642 +0000 UTC m=+1244.870775663" Mar 19 12:30:37.593298 master-0 kubenswrapper[29612]: I0319 12:30:37.593112 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.842262324 podStartE2EDuration="7.593089619s" podCreationTimestamp="2026-03-19 12:30:30 +0000 UTC" firstStartedPulling="2026-03-19 12:30:32.398552659 +0000 UTC m=+1239.712721680" lastFinishedPulling="2026-03-19 12:30:36.149379954 +0000 UTC m=+1243.463548975" observedRunningTime="2026-03-19 12:30:37.563886239 +0000 UTC m=+1244.878055260" watchObservedRunningTime="2026-03-19 12:30:37.593089619 +0000 UTC m=+1244.907258640" Mar 19 12:30:38.201184 master-0 kubenswrapper[29612]: I0319 12:30:38.201119 29612 generic.go:334] "Generic (PLEG): container finished" podID="df4b1340-dfba-4400-8dbe-452dc3de07c9" containerID="05a46a57e057a1a613f08be378ad64d6ca4bc9a2a6c8882b1cd77d4c3a46651d" exitCode=143 Mar 19 12:30:38.202344 master-0 kubenswrapper[29612]: I0319 12:30:38.201200 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df4b1340-dfba-4400-8dbe-452dc3de07c9","Type":"ContainerDied","Data":"05a46a57e057a1a613f08be378ad64d6ca4bc9a2a6c8882b1cd77d4c3a46651d"} Mar 19 12:30:40.241973 master-0 kubenswrapper[29612]: I0319 12:30:40.241915 29612 generic.go:334] "Generic (PLEG): container finished" podID="5cb6a533-34b0-46dc-bf9e-d612ac6bfdba" containerID="8b6a726e51cb2d6fbcb7d9ab78407799f04167d090495c9fc9c5e36c32d0e732" exitCode=0 Mar 19 12:30:40.242850 master-0 kubenswrapper[29612]: I0319 12:30:40.241979 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-82rhf" event={"ID":"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba","Type":"ContainerDied","Data":"8b6a726e51cb2d6fbcb7d9ab78407799f04167d090495c9fc9c5e36c32d0e732"} Mar 19 12:30:40.266384 master-0 kubenswrapper[29612]: I0319 12:30:40.266303 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=5.865468447 podStartE2EDuration="10.266285873s" podCreationTimestamp="2026-03-19 12:30:30 +0000 UTC" firstStartedPulling="2026-03-19 
12:30:31.746793298 +0000 UTC m=+1239.060962319" lastFinishedPulling="2026-03-19 12:30:36.147610724 +0000 UTC m=+1243.461779745" observedRunningTime="2026-03-19 12:30:38.231289845 +0000 UTC m=+1245.545458876" watchObservedRunningTime="2026-03-19 12:30:40.266285873 +0000 UTC m=+1247.580454894" Mar 19 12:30:40.992542 master-0 kubenswrapper[29612]: I0319 12:30:40.992477 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 12:30:40.992542 master-0 kubenswrapper[29612]: I0319 12:30:40.992551 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 12:30:41.046784 master-0 kubenswrapper[29612]: I0319 12:30:41.046707 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 12:30:41.046784 master-0 kubenswrapper[29612]: I0319 12:30:41.046786 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 12:30:41.087194 master-0 kubenswrapper[29612]: I0319 12:30:41.087143 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 12:30:41.109857 master-0 kubenswrapper[29612]: I0319 12:30:41.109424 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:30:41.112076 master-0 kubenswrapper[29612]: I0319 12:30:41.111758 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:30:41.210439 master-0 kubenswrapper[29612]: I0319 12:30:41.209535 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8d764b8bf-7q659"] Mar 19 12:30:41.210439 master-0 kubenswrapper[29612]: I0319 12:30:41.209792 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" 
podUID="9ed9991a-9940-48ce-bae1-1e8c416ee8f9" containerName="dnsmasq-dns" containerID="cri-o://9d5de17295c12332ab0aa0daf6e9af45e52296f6d8f6634d6a5e84c6f055b84b" gracePeriod=10 Mar 19 12:30:41.431053 master-0 kubenswrapper[29612]: I0319 12:30:41.430923 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 12:30:42.072833 master-0 kubenswrapper[29612]: I0319 12:30:42.072779 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.3:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 12:30:42.073041 master-0 kubenswrapper[29612]: I0319 12:30:42.072824 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.3:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 12:30:45.942702 master-0 kubenswrapper[29612]: I0319 12:30:45.942401 29612 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" podUID="9ed9991a-9940-48ce-bae1-1e8c416ee8f9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.251:5353: connect: connection refused" Mar 19 12:30:47.131222 master-0 kubenswrapper[29612]: I0319 12:30:47.131052 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-82rhf" Mar 19 12:30:47.203253 master-0 kubenswrapper[29612]: I0319 12:30:47.201341 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-config-data\") pod \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " Mar 19 12:30:47.203253 master-0 kubenswrapper[29612]: I0319 12:30:47.201423 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-scripts\") pod \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " Mar 19 12:30:47.203253 master-0 kubenswrapper[29612]: I0319 12:30:47.201603 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kwr7\" (UniqueName: \"kubernetes.io/projected/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-kube-api-access-9kwr7\") pod \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " Mar 19 12:30:47.203253 master-0 kubenswrapper[29612]: I0319 12:30:47.201810 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-combined-ca-bundle\") pod \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\" (UID: \"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba\") " Mar 19 12:30:47.216908 master-0 kubenswrapper[29612]: I0319 12:30:47.216824 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-kube-api-access-9kwr7" (OuterVolumeSpecName: "kube-api-access-9kwr7") pod "5cb6a533-34b0-46dc-bf9e-d612ac6bfdba" (UID: "5cb6a533-34b0-46dc-bf9e-d612ac6bfdba"). InnerVolumeSpecName "kube-api-access-9kwr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:30:47.219223 master-0 kubenswrapper[29612]: I0319 12:30:47.219155 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-scripts" (OuterVolumeSpecName: "scripts") pod "5cb6a533-34b0-46dc-bf9e-d612ac6bfdba" (UID: "5cb6a533-34b0-46dc-bf9e-d612ac6bfdba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:30:47.278741 master-0 kubenswrapper[29612]: I0319 12:30:47.278624 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-config-data" (OuterVolumeSpecName: "config-data") pod "5cb6a533-34b0-46dc-bf9e-d612ac6bfdba" (UID: "5cb6a533-34b0-46dc-bf9e-d612ac6bfdba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:30:47.281262 master-0 kubenswrapper[29612]: I0319 12:30:47.281206 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cb6a533-34b0-46dc-bf9e-d612ac6bfdba" (UID: "5cb6a533-34b0-46dc-bf9e-d612ac6bfdba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:30:47.308102 master-0 kubenswrapper[29612]: I0319 12:30:47.308023 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kwr7\" (UniqueName: \"kubernetes.io/projected/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-kube-api-access-9kwr7\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:47.308102 master-0 kubenswrapper[29612]: I0319 12:30:47.308089 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:47.308102 master-0 kubenswrapper[29612]: I0319 12:30:47.308103 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:47.308102 master-0 kubenswrapper[29612]: I0319 12:30:47.308114 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:47.381971 master-0 kubenswrapper[29612]: I0319 12:30:47.381836 29612 generic.go:334] "Generic (PLEG): container finished" podID="f3d33dc3-9f8c-4950-98a3-532280705296" containerID="5fea02ff4fe0c8ee3f2cbe4c19c13b185f5919138864bb69844b4c446fac5556" exitCode=0 Mar 19 12:30:47.381971 master-0 kubenswrapper[29612]: I0319 12:30:47.381924 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bb8d6" event={"ID":"f3d33dc3-9f8c-4950-98a3-532280705296","Type":"ContainerDied","Data":"5fea02ff4fe0c8ee3f2cbe4c19c13b185f5919138864bb69844b4c446fac5556"} Mar 19 12:30:47.386673 master-0 kubenswrapper[29612]: I0319 12:30:47.386595 29612 generic.go:334] "Generic (PLEG): container finished" podID="8ed03592-eec9-4173-8239-217452bc4579" 
containerID="b2fbcff5f2cfc6cb1a5b6a95976e318aff150d717c269f2c81d2a9490c7e933f" exitCode=0 Mar 19 12:30:47.386946 master-0 kubenswrapper[29612]: I0319 12:30:47.386716 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"8ed03592-eec9-4173-8239-217452bc4579","Type":"ContainerDied","Data":"b2fbcff5f2cfc6cb1a5b6a95976e318aff150d717c269f2c81d2a9490c7e933f"} Mar 19 12:30:47.389093 master-0 kubenswrapper[29612]: I0319 12:30:47.389051 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"46676b89-9cf1-4ceb-9feb-af82e8442ccb","Type":"ContainerStarted","Data":"0996d41b7edc804b1ccc0c656dbf35cd9fcb8f2770c77fa89fe3256aef536cd0"} Mar 19 12:30:47.389283 master-0 kubenswrapper[29612]: I0319 12:30:47.389248 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 19 12:30:47.392062 master-0 kubenswrapper[29612]: I0319 12:30:47.391924 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-82rhf" event={"ID":"5cb6a533-34b0-46dc-bf9e-d612ac6bfdba","Type":"ContainerDied","Data":"fd6afa38d5c8ae84f5753bce341ccbe92ccfbc5eff76c0901c09849137965cfe"} Mar 19 12:30:47.392062 master-0 kubenswrapper[29612]: I0319 12:30:47.391952 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd6afa38d5c8ae84f5753bce341ccbe92ccfbc5eff76c0901c09849137965cfe" Mar 19 12:30:47.392062 master-0 kubenswrapper[29612]: I0319 12:30:47.392020 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-82rhf" Mar 19 12:30:47.403106 master-0 kubenswrapper[29612]: I0319 12:30:47.403002 29612 generic.go:334] "Generic (PLEG): container finished" podID="9ed9991a-9940-48ce-bae1-1e8c416ee8f9" containerID="9d5de17295c12332ab0aa0daf6e9af45e52296f6d8f6634d6a5e84c6f055b84b" exitCode=0 Mar 19 12:30:47.403106 master-0 kubenswrapper[29612]: I0319 12:30:47.403063 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" event={"ID":"9ed9991a-9940-48ce-bae1-1e8c416ee8f9","Type":"ContainerDied","Data":"9d5de17295c12332ab0aa0daf6e9af45e52296f6d8f6634d6a5e84c6f055b84b"} Mar 19 12:30:47.432344 master-0 kubenswrapper[29612]: I0319 12:30:47.431564 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:30:47.465144 master-0 kubenswrapper[29612]: I0319 12:30:47.465025 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 19 12:30:47.479217 master-0 kubenswrapper[29612]: I0319 12:30:47.479126 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-compute-ironic-compute-0" podStartSLOduration=2.011789257 podStartE2EDuration="17.479101156s" podCreationTimestamp="2026-03-19 12:30:30 +0000 UTC" firstStartedPulling="2026-03-19 12:30:31.521000081 +0000 UTC m=+1238.835169112" lastFinishedPulling="2026-03-19 12:30:46.98831199 +0000 UTC m=+1254.302481011" observedRunningTime="2026-03-19 12:30:47.466971842 +0000 UTC m=+1254.781140863" watchObservedRunningTime="2026-03-19 12:30:47.479101156 +0000 UTC m=+1254.793270187" Mar 19 12:30:47.511620 master-0 kubenswrapper[29612]: I0319 12:30:47.511543 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-config\") pod 
\"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " Mar 19 12:30:47.511620 master-0 kubenswrapper[29612]: I0319 12:30:47.511865 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvjmf\" (UniqueName: \"kubernetes.io/projected/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-kube-api-access-cvjmf\") pod \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " Mar 19 12:30:47.511620 master-0 kubenswrapper[29612]: I0319 12:30:47.511935 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-svc\") pod \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " Mar 19 12:30:47.512359 master-0 kubenswrapper[29612]: I0319 12:30:47.512191 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-nb\") pod \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " Mar 19 12:30:47.512359 master-0 kubenswrapper[29612]: I0319 12:30:47.512267 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-swift-storage-0\") pod \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " Mar 19 12:30:47.512432 master-0 kubenswrapper[29612]: I0319 12:30:47.512359 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-sb\") pod \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\" (UID: \"9ed9991a-9940-48ce-bae1-1e8c416ee8f9\") " Mar 19 12:30:47.515485 master-0 kubenswrapper[29612]: 
I0319 12:30:47.515445 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-kube-api-access-cvjmf" (OuterVolumeSpecName: "kube-api-access-cvjmf") pod "9ed9991a-9940-48ce-bae1-1e8c416ee8f9" (UID: "9ed9991a-9940-48ce-bae1-1e8c416ee8f9"). InnerVolumeSpecName "kube-api-access-cvjmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:30:47.599596 master-0 kubenswrapper[29612]: I0319 12:30:47.599514 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-config" (OuterVolumeSpecName: "config") pod "9ed9991a-9940-48ce-bae1-1e8c416ee8f9" (UID: "9ed9991a-9940-48ce-bae1-1e8c416ee8f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:30:47.619702 master-0 kubenswrapper[29612]: I0319 12:30:47.619617 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:47.619702 master-0 kubenswrapper[29612]: I0319 12:30:47.619695 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvjmf\" (UniqueName: \"kubernetes.io/projected/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-kube-api-access-cvjmf\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:47.629616 master-0 kubenswrapper[29612]: I0319 12:30:47.629479 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ed9991a-9940-48ce-bae1-1e8c416ee8f9" (UID: "9ed9991a-9940-48ce-bae1-1e8c416ee8f9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:30:47.640153 master-0 kubenswrapper[29612]: I0319 12:30:47.639980 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9ed9991a-9940-48ce-bae1-1e8c416ee8f9" (UID: "9ed9991a-9940-48ce-bae1-1e8c416ee8f9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:30:47.641947 master-0 kubenswrapper[29612]: I0319 12:30:47.641880 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ed9991a-9940-48ce-bae1-1e8c416ee8f9" (UID: "9ed9991a-9940-48ce-bae1-1e8c416ee8f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:30:47.668531 master-0 kubenswrapper[29612]: I0319 12:30:47.668461 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ed9991a-9940-48ce-bae1-1e8c416ee8f9" (UID: "9ed9991a-9940-48ce-bae1-1e8c416ee8f9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:30:47.721555 master-0 kubenswrapper[29612]: I0319 12:30:47.721498 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:47.721555 master-0 kubenswrapper[29612]: I0319 12:30:47.721534 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:47.721555 master-0 kubenswrapper[29612]: I0319 12:30:47.721545 29612 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:47.721555 master-0 kubenswrapper[29612]: I0319 12:30:47.721555 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ed9991a-9940-48ce-bae1-1e8c416ee8f9-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:48.329783 master-0 kubenswrapper[29612]: I0319 12:30:48.329571 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:30:48.330270 master-0 kubenswrapper[29612]: I0319 12:30:48.329895 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerName="nova-api-log" containerID="cri-o://c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe" gracePeriod=30 Mar 19 12:30:48.330270 master-0 kubenswrapper[29612]: I0319 12:30:48.329971 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerName="nova-api-api" 
containerID="cri-o://499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a" gracePeriod=30 Mar 19 12:30:48.350041 master-0 kubenswrapper[29612]: I0319 12:30:48.349992 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:30:48.350561 master-0 kubenswrapper[29612]: I0319 12:30:48.350532 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f0982f57-e690-4df2-b3fe-23db650e0014" containerName="nova-scheduler-scheduler" containerID="cri-o://0b9a9ca762a0e7e62a37160c08655fe2847e191229eb3bd61743bbdf1c0caad0" gracePeriod=30 Mar 19 12:30:48.435371 master-0 kubenswrapper[29612]: I0319 12:30:48.435305 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"8ed03592-eec9-4173-8239-217452bc4579","Type":"ContainerStarted","Data":"721dc010d7d6822aec8d186b644473c1bf875e76a9c5202ae0ae34f0240f7897"} Mar 19 12:30:48.443978 master-0 kubenswrapper[29612]: I0319 12:30:48.443903 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" event={"ID":"9ed9991a-9940-48ce-bae1-1e8c416ee8f9","Type":"ContainerDied","Data":"8cfdab9fa6993646dd7110b9e5552f1f25e21e32411cfe2f139a2bf4329a1132"} Mar 19 12:30:48.444216 master-0 kubenswrapper[29612]: I0319 12:30:48.443993 29612 scope.go:117] "RemoveContainer" containerID="9d5de17295c12332ab0aa0daf6e9af45e52296f6d8f6634d6a5e84c6f055b84b" Mar 19 12:30:48.444216 master-0 kubenswrapper[29612]: I0319 12:30:48.444073 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8d764b8bf-7q659" Mar 19 12:30:48.498586 master-0 kubenswrapper[29612]: I0319 12:30:48.493218 29612 scope.go:117] "RemoveContainer" containerID="98b53e175a3e8f0139f61cbc1baebb75af664b329568d876bcce9bb4b8886e6b" Mar 19 12:30:48.532895 master-0 kubenswrapper[29612]: I0319 12:30:48.530689 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8d764b8bf-7q659"] Mar 19 12:30:48.574535 master-0 kubenswrapper[29612]: I0319 12:30:48.574482 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8d764b8bf-7q659"] Mar 19 12:30:48.951045 master-0 kubenswrapper[29612]: I0319 12:30:48.948842 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ed9991a-9940-48ce-bae1-1e8c416ee8f9" path="/var/lib/kubelet/pods/9ed9991a-9940-48ce-bae1-1e8c416ee8f9/volumes" Mar 19 12:30:48.970687 master-0 kubenswrapper[29612]: I0319 12:30:48.967798 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:49.006656 master-0 kubenswrapper[29612]: I0319 12:30:48.998619 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 12:30:49.006656 master-0 kubenswrapper[29612]: I0319 12:30:48.998695 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 12:30:49.030658 master-0 kubenswrapper[29612]: I0319 12:30:49.029814 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 12:30:49.030658 master-0 kubenswrapper[29612]: I0319 12:30:49.029896 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 12:30:49.076697 master-0 kubenswrapper[29612]: I0319 12:30:49.076159 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-config-data\") pod \"f3d33dc3-9f8c-4950-98a3-532280705296\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " Mar 19 12:30:49.076697 master-0 kubenswrapper[29612]: I0319 12:30:49.076321 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-scripts\") pod \"f3d33dc3-9f8c-4950-98a3-532280705296\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " Mar 19 12:30:49.076697 master-0 kubenswrapper[29612]: I0319 12:30:49.076361 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-combined-ca-bundle\") pod \"f3d33dc3-9f8c-4950-98a3-532280705296\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " Mar 19 12:30:49.076697 master-0 kubenswrapper[29612]: I0319 12:30:49.076468 29612 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-c64fd\" (UniqueName: \"kubernetes.io/projected/f3d33dc3-9f8c-4950-98a3-532280705296-kube-api-access-c64fd\") pod \"f3d33dc3-9f8c-4950-98a3-532280705296\" (UID: \"f3d33dc3-9f8c-4950-98a3-532280705296\") " Mar 19 12:30:49.080660 master-0 kubenswrapper[29612]: I0319 12:30:49.079614 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d33dc3-9f8c-4950-98a3-532280705296-kube-api-access-c64fd" (OuterVolumeSpecName: "kube-api-access-c64fd") pod "f3d33dc3-9f8c-4950-98a3-532280705296" (UID: "f3d33dc3-9f8c-4950-98a3-532280705296"). InnerVolumeSpecName "kube-api-access-c64fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:30:49.080660 master-0 kubenswrapper[29612]: I0319 12:30:49.080108 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-scripts" (OuterVolumeSpecName: "scripts") pod "f3d33dc3-9f8c-4950-98a3-532280705296" (UID: "f3d33dc3-9f8c-4950-98a3-532280705296"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:30:49.114663 master-0 kubenswrapper[29612]: I0319 12:30:49.112294 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3d33dc3-9f8c-4950-98a3-532280705296" (UID: "f3d33dc3-9f8c-4950-98a3-532280705296"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:30:49.114663 master-0 kubenswrapper[29612]: I0319 12:30:49.114462 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-config-data" (OuterVolumeSpecName: "config-data") pod "f3d33dc3-9f8c-4950-98a3-532280705296" (UID: "f3d33dc3-9f8c-4950-98a3-532280705296"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:30:49.180738 master-0 kubenswrapper[29612]: I0319 12:30:49.180117 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:49.180738 master-0 kubenswrapper[29612]: I0319 12:30:49.180183 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:49.180738 master-0 kubenswrapper[29612]: I0319 12:30:49.180198 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c64fd\" (UniqueName: \"kubernetes.io/projected/f3d33dc3-9f8c-4950-98a3-532280705296-kube-api-access-c64fd\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:49.180738 master-0 kubenswrapper[29612]: I0319 12:30:49.180207 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d33dc3-9f8c-4950-98a3-532280705296-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:49.460802 master-0 kubenswrapper[29612]: I0319 12:30:49.460672 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bb8d6" Mar 19 12:30:49.461333 master-0 kubenswrapper[29612]: I0319 12:30:49.460680 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bb8d6" event={"ID":"f3d33dc3-9f8c-4950-98a3-532280705296","Type":"ContainerDied","Data":"a0a342fc3dd3693d736355da89628560684f31175747d0c2b8477d3353f9c04f"} Mar 19 12:30:49.461333 master-0 kubenswrapper[29612]: I0319 12:30:49.460876 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0a342fc3dd3693d736355da89628560684f31175747d0c2b8477d3353f9c04f" Mar 19 12:30:49.466479 master-0 kubenswrapper[29612]: I0319 12:30:49.466421 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"8ed03592-eec9-4173-8239-217452bc4579","Type":"ContainerStarted","Data":"dc292acfbbd997bd4ef5f999f5f40d336dd508637be9770a5dd0568c3ad7e7dd"} Mar 19 12:30:49.466479 master-0 kubenswrapper[29612]: I0319 12:30:49.466475 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"8ed03592-eec9-4173-8239-217452bc4579","Type":"ContainerStarted","Data":"ec708e46b4872d308f7bba083365fe5eb6dcf184e876f1e7907cbd6463f1e361"} Mar 19 12:30:49.466716 master-0 kubenswrapper[29612]: I0319 12:30:49.466681 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Mar 19 12:30:49.466777 master-0 kubenswrapper[29612]: I0319 12:30:49.466725 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Mar 19 12:30:49.470356 master-0 kubenswrapper[29612]: I0319 12:30:49.470306 29612 generic.go:334] "Generic (PLEG): container finished" podID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerID="c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe" exitCode=143 Mar 19 12:30:49.471182 master-0 kubenswrapper[29612]: I0319 12:30:49.471118 29612 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b03fbb8c-07ca-4577-b654-25a4d25fab85","Type":"ContainerDied","Data":"c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe"} Mar 19 12:30:49.522665 master-0 kubenswrapper[29612]: I0319 12:30:49.520600 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=65.208606009 podStartE2EDuration="1m51.520581398s" podCreationTimestamp="2026-03-19 12:28:58 +0000 UTC" firstStartedPulling="2026-03-19 12:29:10.227539458 +0000 UTC m=+1157.541708479" lastFinishedPulling="2026-03-19 12:29:56.539514847 +0000 UTC m=+1203.853683868" observedRunningTime="2026-03-19 12:30:49.506515588 +0000 UTC m=+1256.820684619" watchObservedRunningTime="2026-03-19 12:30:49.520581398 +0000 UTC m=+1256.834750419" Mar 19 12:30:49.586127 master-0 kubenswrapper[29612]: I0319 12:30:49.585984 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 12:30:49.586756 master-0 kubenswrapper[29612]: E0319 12:30:49.586733 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cb6a533-34b0-46dc-bf9e-d612ac6bfdba" containerName="nova-manage" Mar 19 12:30:49.586826 master-0 kubenswrapper[29612]: I0319 12:30:49.586765 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cb6a533-34b0-46dc-bf9e-d612ac6bfdba" containerName="nova-manage" Mar 19 12:30:49.586858 master-0 kubenswrapper[29612]: E0319 12:30:49.586831 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed9991a-9940-48ce-bae1-1e8c416ee8f9" containerName="init" Mar 19 12:30:49.586858 master-0 kubenswrapper[29612]: I0319 12:30:49.586838 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed9991a-9940-48ce-bae1-1e8c416ee8f9" containerName="init" Mar 19 12:30:49.586858 master-0 kubenswrapper[29612]: E0319 12:30:49.586849 29612 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ed9991a-9940-48ce-bae1-1e8c416ee8f9" containerName="dnsmasq-dns" Mar 19 12:30:49.586950 master-0 kubenswrapper[29612]: I0319 12:30:49.586858 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed9991a-9940-48ce-bae1-1e8c416ee8f9" containerName="dnsmasq-dns" Mar 19 12:30:49.586950 master-0 kubenswrapper[29612]: E0319 12:30:49.586874 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d33dc3-9f8c-4950-98a3-532280705296" containerName="nova-cell1-conductor-db-sync" Mar 19 12:30:49.586950 master-0 kubenswrapper[29612]: I0319 12:30:49.586880 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d33dc3-9f8c-4950-98a3-532280705296" containerName="nova-cell1-conductor-db-sync" Mar 19 12:30:49.587139 master-0 kubenswrapper[29612]: I0319 12:30:49.587122 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cb6a533-34b0-46dc-bf9e-d612ac6bfdba" containerName="nova-manage" Mar 19 12:30:49.587188 master-0 kubenswrapper[29612]: I0319 12:30:49.587149 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d33dc3-9f8c-4950-98a3-532280705296" containerName="nova-cell1-conductor-db-sync" Mar 19 12:30:49.587188 master-0 kubenswrapper[29612]: I0319 12:30:49.587169 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed9991a-9940-48ce-bae1-1e8c416ee8f9" containerName="dnsmasq-dns" Mar 19 12:30:49.588097 master-0 kubenswrapper[29612]: I0319 12:30:49.588073 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:49.591468 master-0 kubenswrapper[29612]: I0319 12:30:49.591431 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 12:30:49.609658 master-0 kubenswrapper[29612]: I0319 12:30:49.609383 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 12:30:49.794607 master-0 kubenswrapper[29612]: I0319 12:30:49.794451 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23cf591a-eb78-4fb2-ab58-59a765811789-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23cf591a-eb78-4fb2-ab58-59a765811789\") " pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:49.794808 master-0 kubenswrapper[29612]: I0319 12:30:49.794671 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4cf9\" (UniqueName: \"kubernetes.io/projected/23cf591a-eb78-4fb2-ab58-59a765811789-kube-api-access-g4cf9\") pod \"nova-cell1-conductor-0\" (UID: \"23cf591a-eb78-4fb2-ab58-59a765811789\") " pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:49.794808 master-0 kubenswrapper[29612]: I0319 12:30:49.794789 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23cf591a-eb78-4fb2-ab58-59a765811789-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23cf591a-eb78-4fb2-ab58-59a765811789\") " pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:49.896952 master-0 kubenswrapper[29612]: I0319 12:30:49.896882 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23cf591a-eb78-4fb2-ab58-59a765811789-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"23cf591a-eb78-4fb2-ab58-59a765811789\") " pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:49.897162 master-0 kubenswrapper[29612]: I0319 12:30:49.897027 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4cf9\" (UniqueName: \"kubernetes.io/projected/23cf591a-eb78-4fb2-ab58-59a765811789-kube-api-access-g4cf9\") pod \"nova-cell1-conductor-0\" (UID: \"23cf591a-eb78-4fb2-ab58-59a765811789\") " pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:49.897277 master-0 kubenswrapper[29612]: I0319 12:30:49.897245 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23cf591a-eb78-4fb2-ab58-59a765811789-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23cf591a-eb78-4fb2-ab58-59a765811789\") " pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:49.905669 master-0 kubenswrapper[29612]: I0319 12:30:49.905205 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23cf591a-eb78-4fb2-ab58-59a765811789-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23cf591a-eb78-4fb2-ab58-59a765811789\") " pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:49.910667 master-0 kubenswrapper[29612]: I0319 12:30:49.909226 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23cf591a-eb78-4fb2-ab58-59a765811789-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23cf591a-eb78-4fb2-ab58-59a765811789\") " pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:49.930659 master-0 kubenswrapper[29612]: I0319 12:30:49.930319 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4cf9\" (UniqueName: \"kubernetes.io/projected/23cf591a-eb78-4fb2-ab58-59a765811789-kube-api-access-g4cf9\") pod \"nova-cell1-conductor-0\" (UID: \"23cf591a-eb78-4fb2-ab58-59a765811789\") " 
pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:50.215016 master-0 kubenswrapper[29612]: I0319 12:30:50.214942 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:50.751371 master-0 kubenswrapper[29612]: I0319 12:30:50.751306 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 12:30:50.755062 master-0 kubenswrapper[29612]: W0319 12:30:50.754673 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23cf591a_eb78_4fb2_ab58_59a765811789.slice/crio-de2dcb99e6ce9e57354059d43f60728e929a819cc2703916c9d778440d89fc02 WatchSource:0}: Error finding container de2dcb99e6ce9e57354059d43f60728e929a819cc2703916c9d778440d89fc02: Status 404 returned error can't find the container with id de2dcb99e6ce9e57354059d43f60728e929a819cc2703916c9d778440d89fc02 Mar 19 12:30:50.940765 master-0 kubenswrapper[29612]: I0319 12:30:50.940576 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-conductor-0" Mar 19 12:30:51.048836 master-0 kubenswrapper[29612]: E0319 12:30:51.048655 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0b9a9ca762a0e7e62a37160c08655fe2847e191229eb3bd61743bbdf1c0caad0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 12:30:51.050362 master-0 kubenswrapper[29612]: E0319 12:30:51.050299 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0b9a9ca762a0e7e62a37160c08655fe2847e191229eb3bd61743bbdf1c0caad0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 12:30:51.051975 master-0 
kubenswrapper[29612]: E0319 12:30:51.051894 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0b9a9ca762a0e7e62a37160c08655fe2847e191229eb3bd61743bbdf1c0caad0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 12:30:51.052086 master-0 kubenswrapper[29612]: E0319 12:30:51.051994 29612 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f0982f57-e690-4df2-b3fe-23db650e0014" containerName="nova-scheduler-scheduler" Mar 19 12:30:51.503610 master-0 kubenswrapper[29612]: I0319 12:30:51.503531 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23cf591a-eb78-4fb2-ab58-59a765811789","Type":"ContainerStarted","Data":"5eedc9ad2632c2ad762db8c2120718e98d317dd766953020240c32591d90d463"} Mar 19 12:30:51.503835 master-0 kubenswrapper[29612]: I0319 12:30:51.503618 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23cf591a-eb78-4fb2-ab58-59a765811789","Type":"ContainerStarted","Data":"de2dcb99e6ce9e57354059d43f60728e929a819cc2703916c9d778440d89fc02"} Mar 19 12:30:51.504820 master-0 kubenswrapper[29612]: I0319 12:30:51.504774 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:52.055035 master-0 kubenswrapper[29612]: I0319 12:30:52.054958 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:30:52.086510 master-0 kubenswrapper[29612]: I0319 12:30:52.086425 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.08640814 podStartE2EDuration="3.08640814s" podCreationTimestamp="2026-03-19 12:30:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:30:51.531371248 +0000 UTC m=+1258.845540279" watchObservedRunningTime="2026-03-19 12:30:52.08640814 +0000 UTC m=+1259.400577151" Mar 19 12:30:52.158682 master-0 kubenswrapper[29612]: I0319 12:30:52.158583 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wwcp\" (UniqueName: \"kubernetes.io/projected/b03fbb8c-07ca-4577-b654-25a4d25fab85-kube-api-access-6wwcp\") pod \"b03fbb8c-07ca-4577-b654-25a4d25fab85\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " Mar 19 12:30:52.158922 master-0 kubenswrapper[29612]: I0319 12:30:52.158735 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b03fbb8c-07ca-4577-b654-25a4d25fab85-logs\") pod \"b03fbb8c-07ca-4577-b654-25a4d25fab85\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " Mar 19 12:30:52.159077 master-0 kubenswrapper[29612]: I0319 12:30:52.158923 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-config-data\") pod \"b03fbb8c-07ca-4577-b654-25a4d25fab85\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " Mar 19 12:30:52.159077 master-0 kubenswrapper[29612]: I0319 12:30:52.159016 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-combined-ca-bundle\") pod 
\"b03fbb8c-07ca-4577-b654-25a4d25fab85\" (UID: \"b03fbb8c-07ca-4577-b654-25a4d25fab85\") " Mar 19 12:30:52.159376 master-0 kubenswrapper[29612]: I0319 12:30:52.159321 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03fbb8c-07ca-4577-b654-25a4d25fab85-logs" (OuterVolumeSpecName: "logs") pod "b03fbb8c-07ca-4577-b654-25a4d25fab85" (UID: "b03fbb8c-07ca-4577-b654-25a4d25fab85"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:30:52.160069 master-0 kubenswrapper[29612]: I0319 12:30:52.160026 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b03fbb8c-07ca-4577-b654-25a4d25fab85-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:52.170741 master-0 kubenswrapper[29612]: I0319 12:30:52.162315 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03fbb8c-07ca-4577-b654-25a4d25fab85-kube-api-access-6wwcp" (OuterVolumeSpecName: "kube-api-access-6wwcp") pod "b03fbb8c-07ca-4577-b654-25a4d25fab85" (UID: "b03fbb8c-07ca-4577-b654-25a4d25fab85"). InnerVolumeSpecName "kube-api-access-6wwcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:30:52.189721 master-0 kubenswrapper[29612]: I0319 12:30:52.189548 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-config-data" (OuterVolumeSpecName: "config-data") pod "b03fbb8c-07ca-4577-b654-25a4d25fab85" (UID: "b03fbb8c-07ca-4577-b654-25a4d25fab85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:30:52.208331 master-0 kubenswrapper[29612]: I0319 12:30:52.208226 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b03fbb8c-07ca-4577-b654-25a4d25fab85" (UID: "b03fbb8c-07ca-4577-b654-25a4d25fab85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:30:52.262351 master-0 kubenswrapper[29612]: I0319 12:30:52.262276 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wwcp\" (UniqueName: \"kubernetes.io/projected/b03fbb8c-07ca-4577-b654-25a4d25fab85-kube-api-access-6wwcp\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:52.262351 master-0 kubenswrapper[29612]: I0319 12:30:52.262321 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:52.262351 master-0 kubenswrapper[29612]: I0319 12:30:52.262330 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03fbb8c-07ca-4577-b654-25a4d25fab85-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:52.378588 master-0 kubenswrapper[29612]: I0319 12:30:52.378531 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-conductor-0" Mar 19 12:30:52.517406 master-0 kubenswrapper[29612]: I0319 12:30:52.517298 29612 generic.go:334] "Generic (PLEG): container finished" podID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerID="499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a" exitCode=0 Mar 19 12:30:52.517643 master-0 kubenswrapper[29612]: I0319 12:30:52.517368 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:30:52.517817 master-0 kubenswrapper[29612]: I0319 12:30:52.517402 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b03fbb8c-07ca-4577-b654-25a4d25fab85","Type":"ContainerDied","Data":"499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a"} Mar 19 12:30:52.517905 master-0 kubenswrapper[29612]: I0319 12:30:52.517866 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b03fbb8c-07ca-4577-b654-25a4d25fab85","Type":"ContainerDied","Data":"e74f59a077c406b8b44207f998c271129ec4790e51f1132fdf95c62e055f65b0"} Mar 19 12:30:52.517945 master-0 kubenswrapper[29612]: I0319 12:30:52.517915 29612 scope.go:117] "RemoveContainer" containerID="499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a" Mar 19 12:30:52.551065 master-0 kubenswrapper[29612]: I0319 12:30:52.550891 29612 scope.go:117] "RemoveContainer" containerID="c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe" Mar 19 12:30:52.553105 master-0 kubenswrapper[29612]: I0319 12:30:52.553068 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Mar 19 12:30:52.591905 master-0 kubenswrapper[29612]: I0319 12:30:52.589033 29612 scope.go:117] "RemoveContainer" containerID="499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a" Mar 19 12:30:52.591905 master-0 kubenswrapper[29612]: E0319 12:30:52.589461 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a\": container with ID starting with 499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a not found: ID does not exist" containerID="499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a" Mar 19 12:30:52.591905 master-0 kubenswrapper[29612]: I0319 12:30:52.589502 29612 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a"} err="failed to get container status \"499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a\": rpc error: code = NotFound desc = could not find container \"499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a\": container with ID starting with 499e1cd1190188cc51acb2d2ca4611408b7ca88bc7dfe22c0d723c6e7490df9a not found: ID does not exist" Mar 19 12:30:52.591905 master-0 kubenswrapper[29612]: I0319 12:30:52.589532 29612 scope.go:117] "RemoveContainer" containerID="c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe" Mar 19 12:30:52.591905 master-0 kubenswrapper[29612]: E0319 12:30:52.589970 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe\": container with ID starting with c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe not found: ID does not exist" containerID="c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe" Mar 19 12:30:52.591905 master-0 kubenswrapper[29612]: I0319 12:30:52.590003 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe"} err="failed to get container status \"c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe\": rpc error: code = NotFound desc = could not find container \"c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe\": container with ID starting with c0990f5d2be5015864a854887f868018711af87a497b0209695649aa4d0108fe not found: ID does not exist" Mar 19 12:30:52.614034 master-0 kubenswrapper[29612]: I0319 12:30:52.613956 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:30:52.630755 master-0 
kubenswrapper[29612]: I0319 12:30:52.630686 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:30:52.645331 master-0 kubenswrapper[29612]: I0319 12:30:52.645255 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 12:30:52.645971 master-0 kubenswrapper[29612]: E0319 12:30:52.645931 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerName="nova-api-log" Mar 19 12:30:52.645971 master-0 kubenswrapper[29612]: I0319 12:30:52.645960 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerName="nova-api-log" Mar 19 12:30:52.646072 master-0 kubenswrapper[29612]: E0319 12:30:52.645979 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerName="nova-api-api" Mar 19 12:30:52.646072 master-0 kubenswrapper[29612]: I0319 12:30:52.645988 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerName="nova-api-api" Mar 19 12:30:52.646355 master-0 kubenswrapper[29612]: I0319 12:30:52.646318 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerName="nova-api-log" Mar 19 12:30:52.646410 master-0 kubenswrapper[29612]: I0319 12:30:52.646386 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" containerName="nova-api-api" Mar 19 12:30:52.648143 master-0 kubenswrapper[29612]: I0319 12:30:52.648107 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:30:52.651494 master-0 kubenswrapper[29612]: I0319 12:30:52.651410 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 12:30:52.691513 master-0 kubenswrapper[29612]: I0319 12:30:52.691449 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:30:52.777422 master-0 kubenswrapper[29612]: I0319 12:30:52.777306 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-config-data\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.777422 master-0 kubenswrapper[29612]: I0319 12:30:52.777384 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.777713 master-0 kubenswrapper[29612]: I0319 12:30:52.777618 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a27205b-345b-4df7-9b34-16cab1ac0665-logs\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.778045 master-0 kubenswrapper[29612]: I0319 12:30:52.778004 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6lbf\" (UniqueName: \"kubernetes.io/projected/4a27205b-345b-4df7-9b34-16cab1ac0665-kube-api-access-j6lbf\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.880786 master-0 kubenswrapper[29612]: I0319 
12:30:52.880687 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6lbf\" (UniqueName: \"kubernetes.io/projected/4a27205b-345b-4df7-9b34-16cab1ac0665-kube-api-access-j6lbf\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.881283 master-0 kubenswrapper[29612]: I0319 12:30:52.880944 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-config-data\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.881283 master-0 kubenswrapper[29612]: I0319 12:30:52.881032 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.881283 master-0 kubenswrapper[29612]: I0319 12:30:52.881178 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a27205b-345b-4df7-9b34-16cab1ac0665-logs\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.882009 master-0 kubenswrapper[29612]: I0319 12:30:52.881957 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a27205b-345b-4df7-9b34-16cab1ac0665-logs\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.903118 master-0 kubenswrapper[29612]: I0319 12:30:52.902907 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 12:30:52.903118 master-0 kubenswrapper[29612]: I0319 12:30:52.903068 29612 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.917422 master-0 kubenswrapper[29612]: I0319 12:30:52.917358 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-config-data\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.924090 master-0 kubenswrapper[29612]: I0319 12:30:52.924023 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6lbf\" (UniqueName: \"kubernetes.io/projected/4a27205b-345b-4df7-9b34-16cab1ac0665-kube-api-access-j6lbf\") pod \"nova-api-0\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " pod="openstack/nova-api-0" Mar 19 12:30:52.934601 master-0 kubenswrapper[29612]: I0319 12:30:52.934517 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03fbb8c-07ca-4577-b654-25a4d25fab85" path="/var/lib/kubelet/pods/b03fbb8c-07ca-4577-b654-25a4d25fab85/volumes" Mar 19 12:30:52.974766 master-0 kubenswrapper[29612]: I0319 12:30:52.974627 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:30:53.272385 master-0 kubenswrapper[29612]: E0319 12:30:53.272310 29612 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0982f57_e690_4df2_b3fe_23db650e0014.slice/crio-0b9a9ca762a0e7e62a37160c08655fe2847e191229eb3bd61743bbdf1c0caad0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0982f57_e690_4df2_b3fe_23db650e0014.slice/crio-conmon-0b9a9ca762a0e7e62a37160c08655fe2847e191229eb3bd61743bbdf1c0caad0.scope\": RecentStats: unable to find data in memory cache]" Mar 19 12:30:53.561400 master-0 kubenswrapper[29612]: I0319 12:30:53.561344 29612 generic.go:334] "Generic (PLEG): container finished" podID="f0982f57-e690-4df2-b3fe-23db650e0014" containerID="0b9a9ca762a0e7e62a37160c08655fe2847e191229eb3bd61743bbdf1c0caad0" exitCode=0 Mar 19 12:30:53.561700 master-0 kubenswrapper[29612]: I0319 12:30:53.561653 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f0982f57-e690-4df2-b3fe-23db650e0014","Type":"ContainerDied","Data":"0b9a9ca762a0e7e62a37160c08655fe2847e191229eb3bd61743bbdf1c0caad0"} Mar 19 12:30:53.564369 master-0 kubenswrapper[29612]: I0319 12:30:53.564348 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Mar 19 12:30:53.649556 master-0 kubenswrapper[29612]: I0319 12:30:53.649497 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 12:30:53.697830 master-0 kubenswrapper[29612]: I0319 12:30:53.697768 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:30:53.698338 master-0 kubenswrapper[29612]: W0319 12:30:53.698242 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a27205b_345b_4df7_9b34_16cab1ac0665.slice/crio-1502246dff981f15ceba71f48dd51517bc7ce0e09f1300ea0928e83b8b67aa8b WatchSource:0}: Error finding container 1502246dff981f15ceba71f48dd51517bc7ce0e09f1300ea0928e83b8b67aa8b: Status 404 returned error can't find the container with id 1502246dff981f15ceba71f48dd51517bc7ce0e09f1300ea0928e83b8b67aa8b Mar 19 12:30:54.582807 master-0 kubenswrapper[29612]: I0319 12:30:54.582592 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a27205b-345b-4df7-9b34-16cab1ac0665","Type":"ContainerStarted","Data":"2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be"} Mar 19 12:30:54.582807 master-0 kubenswrapper[29612]: I0319 12:30:54.582675 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a27205b-345b-4df7-9b34-16cab1ac0665","Type":"ContainerStarted","Data":"1502246dff981f15ceba71f48dd51517bc7ce0e09f1300ea0928e83b8b67aa8b"} Mar 19 12:30:54.586423 master-0 kubenswrapper[29612]: I0319 12:30:54.586297 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 12:30:54.586720 master-0 kubenswrapper[29612]: I0319 12:30:54.586604 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f0982f57-e690-4df2-b3fe-23db650e0014","Type":"ContainerDied","Data":"e4a8491e13534c7abdf9f7342db8e5ca751b272572e7c0c25799966e7f4eb3ed"} Mar 19 12:30:54.586882 master-0 kubenswrapper[29612]: I0319 12:30:54.586774 29612 scope.go:117] "RemoveContainer" containerID="0b9a9ca762a0e7e62a37160c08655fe2847e191229eb3bd61743bbdf1c0caad0" Mar 19 12:30:55.048821 master-0 kubenswrapper[29612]: I0319 12:30:55.048746 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-combined-ca-bundle\") pod \"f0982f57-e690-4df2-b3fe-23db650e0014\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " Mar 19 12:30:55.049080 master-0 kubenswrapper[29612]: I0319 12:30:55.049045 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlwrk\" (UniqueName: \"kubernetes.io/projected/f0982f57-e690-4df2-b3fe-23db650e0014-kube-api-access-qlwrk\") pod \"f0982f57-e690-4df2-b3fe-23db650e0014\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " Mar 19 12:30:55.049210 master-0 kubenswrapper[29612]: I0319 12:30:55.049181 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-config-data\") pod \"f0982f57-e690-4df2-b3fe-23db650e0014\" (UID: \"f0982f57-e690-4df2-b3fe-23db650e0014\") " Mar 19 12:30:55.057684 master-0 kubenswrapper[29612]: I0319 12:30:55.056747 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0982f57-e690-4df2-b3fe-23db650e0014-kube-api-access-qlwrk" (OuterVolumeSpecName: "kube-api-access-qlwrk") pod 
"f0982f57-e690-4df2-b3fe-23db650e0014" (UID: "f0982f57-e690-4df2-b3fe-23db650e0014"). InnerVolumeSpecName "kube-api-access-qlwrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:30:55.085837 master-0 kubenswrapper[29612]: I0319 12:30:55.082765 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0982f57-e690-4df2-b3fe-23db650e0014" (UID: "f0982f57-e690-4df2-b3fe-23db650e0014"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:30:55.090762 master-0 kubenswrapper[29612]: I0319 12:30:55.090382 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-config-data" (OuterVolumeSpecName: "config-data") pod "f0982f57-e690-4df2-b3fe-23db650e0014" (UID: "f0982f57-e690-4df2-b3fe-23db650e0014"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:30:55.154315 master-0 kubenswrapper[29612]: I0319 12:30:55.154214 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlwrk\" (UniqueName: \"kubernetes.io/projected/f0982f57-e690-4df2-b3fe-23db650e0014-kube-api-access-qlwrk\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:55.154533 master-0 kubenswrapper[29612]: I0319 12:30:55.154422 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:55.154533 master-0 kubenswrapper[29612]: I0319 12:30:55.154499 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0982f57-e690-4df2-b3fe-23db650e0014-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:30:55.261739 master-0 kubenswrapper[29612]: I0319 12:30:55.261589 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 19 12:30:55.531988 master-0 kubenswrapper[29612]: I0319 12:30:55.531819 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:30:55.562525 master-0 kubenswrapper[29612]: I0319 12:30:55.562450 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:30:55.601134 master-0 kubenswrapper[29612]: I0319 12:30:55.601063 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a27205b-345b-4df7-9b34-16cab1ac0665","Type":"ContainerStarted","Data":"c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66"} Mar 19 12:30:55.694804 master-0 kubenswrapper[29612]: I0319 12:30:55.694714 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:30:55.695745 master-0 kubenswrapper[29612]: E0319 12:30:55.695701 
29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0982f57-e690-4df2-b3fe-23db650e0014" containerName="nova-scheduler-scheduler" Mar 19 12:30:55.695745 master-0 kubenswrapper[29612]: I0319 12:30:55.695741 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0982f57-e690-4df2-b3fe-23db650e0014" containerName="nova-scheduler-scheduler" Mar 19 12:30:55.696325 master-0 kubenswrapper[29612]: I0319 12:30:55.696269 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0982f57-e690-4df2-b3fe-23db650e0014" containerName="nova-scheduler-scheduler" Mar 19 12:30:55.697481 master-0 kubenswrapper[29612]: I0319 12:30:55.697438 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 12:30:55.699796 master-0 kubenswrapper[29612]: I0319 12:30:55.699748 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 12:30:55.889921 master-0 kubenswrapper[29612]: I0319 12:30:55.877550 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:55.889921 master-0 kubenswrapper[29612]: I0319 12:30:55.877739 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbtw4\" (UniqueName: \"kubernetes.io/projected/24df43e9-37cb-4632-9a19-5b602c2742f9-kube-api-access-qbtw4\") pod \"nova-scheduler-0\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:55.889921 master-0 kubenswrapper[29612]: I0319 12:30:55.878207 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-config-data\") pod \"nova-scheduler-0\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:55.910428 master-0 kubenswrapper[29612]: I0319 12:30:55.910324 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:30:55.918765 master-0 kubenswrapper[29612]: I0319 12:30:55.918615 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.9185862179999997 podStartE2EDuration="3.918586218s" podCreationTimestamp="2026-03-19 12:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:30:55.861255829 +0000 UTC m=+1263.175424860" watchObservedRunningTime="2026-03-19 12:30:55.918586218 +0000 UTC m=+1263.232755249" Mar 19 12:30:55.981491 master-0 kubenswrapper[29612]: I0319 12:30:55.981385 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:55.981491 master-0 kubenswrapper[29612]: I0319 12:30:55.981492 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbtw4\" (UniqueName: \"kubernetes.io/projected/24df43e9-37cb-4632-9a19-5b602c2742f9-kube-api-access-qbtw4\") pod \"nova-scheduler-0\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:55.981936 master-0 kubenswrapper[29612]: I0319 12:30:55.981571 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-config-data\") pod \"nova-scheduler-0\" (UID: 
\"24df43e9-37cb-4632-9a19-5b602c2742f9\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:55.986801 master-0 kubenswrapper[29612]: I0319 12:30:55.986149 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:55.988773 master-0 kubenswrapper[29612]: I0319 12:30:55.988722 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-config-data\") pod \"nova-scheduler-0\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:56.002212 master-0 kubenswrapper[29612]: I0319 12:30:56.002176 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbtw4\" (UniqueName: \"kubernetes.io/projected/24df43e9-37cb-4632-9a19-5b602c2742f9-kube-api-access-qbtw4\") pod \"nova-scheduler-0\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " pod="openstack/nova-scheduler-0" Mar 19 12:30:56.014995 master-0 kubenswrapper[29612]: I0319 12:30:56.014932 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 12:30:56.564209 master-0 kubenswrapper[29612]: I0319 12:30:56.564133 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:30:56.622936 master-0 kubenswrapper[29612]: I0319 12:30:56.622878 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"24df43e9-37cb-4632-9a19-5b602c2742f9","Type":"ContainerStarted","Data":"d62422ca53036bdda1ec3d5434189c53bd65ef1e6a088cb87aa77c24ed320ec8"} Mar 19 12:30:56.927263 master-0 kubenswrapper[29612]: I0319 12:30:56.927191 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0982f57-e690-4df2-b3fe-23db650e0014" path="/var/lib/kubelet/pods/f0982f57-e690-4df2-b3fe-23db650e0014/volumes" Mar 19 12:30:57.642613 master-0 kubenswrapper[29612]: I0319 12:30:57.642529 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"24df43e9-37cb-4632-9a19-5b602c2742f9","Type":"ContainerStarted","Data":"3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057"} Mar 19 12:30:57.670123 master-0 kubenswrapper[29612]: I0319 12:30:57.670019 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.669998137 podStartE2EDuration="2.669998137s" podCreationTimestamp="2026-03-19 12:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:30:57.664540312 +0000 UTC m=+1264.978709333" watchObservedRunningTime="2026-03-19 12:30:57.669998137 +0000 UTC m=+1264.984167168" Mar 19 12:31:01.016890 master-0 kubenswrapper[29612]: I0319 12:31:01.016843 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 12:31:02.976453 master-0 kubenswrapper[29612]: I0319 12:31:02.976386 29612 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 12:31:02.976453 master-0 kubenswrapper[29612]: I0319 12:31:02.976459 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 12:31:04.059557 master-0 kubenswrapper[29612]: I0319 12:31:04.058984 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.10:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 12:31:04.059557 master-0 kubenswrapper[29612]: I0319 12:31:04.059437 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.10:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 12:31:06.017058 master-0 kubenswrapper[29612]: I0319 12:31:06.016981 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 12:31:06.055140 master-0 kubenswrapper[29612]: I0319 12:31:06.055088 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 12:31:06.788501 master-0 kubenswrapper[29612]: I0319 12:31:06.788451 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 12:31:07.792022 master-0 kubenswrapper[29612]: I0319 12:31:07.791870 29612 generic.go:334] "Generic (PLEG): container finished" podID="df4b1340-dfba-4400-8dbe-452dc3de07c9" containerID="2f7337e851369b1c5f42d5d6eb94e324968de5af0f17226f45b89236d33a1bf0" exitCode=137 Mar 19 12:31:07.792022 master-0 kubenswrapper[29612]: I0319 12:31:07.791976 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"df4b1340-dfba-4400-8dbe-452dc3de07c9","Type":"ContainerDied","Data":"2f7337e851369b1c5f42d5d6eb94e324968de5af0f17226f45b89236d33a1bf0"} Mar 19 12:31:07.794154 master-0 kubenswrapper[29612]: I0319 12:31:07.794102 29612 generic.go:334] "Generic (PLEG): container finished" podID="8b676207-2926-4cb6-8f24-14859eee1d2f" containerID="72547e021b6a18944aa444e1fc1d70a6067ef6defd511fcb32e780050b71ce9a" exitCode=137 Mar 19 12:31:07.794291 master-0 kubenswrapper[29612]: I0319 12:31:07.794169 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b676207-2926-4cb6-8f24-14859eee1d2f","Type":"ContainerDied","Data":"72547e021b6a18944aa444e1fc1d70a6067ef6defd511fcb32e780050b71ce9a"} Mar 19 12:31:07.794291 master-0 kubenswrapper[29612]: I0319 12:31:07.794242 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b676207-2926-4cb6-8f24-14859eee1d2f","Type":"ContainerDied","Data":"ceecd53084343b4b82dc1f26d6c592bbdc27b30b0ffa93d66d252f9a0c3d7fe8"} Mar 19 12:31:07.794291 master-0 kubenswrapper[29612]: I0319 12:31:07.794261 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceecd53084343b4b82dc1f26d6c592bbdc27b30b0ffa93d66d252f9a0c3d7fe8" Mar 19 12:31:07.831977 master-0 kubenswrapper[29612]: I0319 12:31:07.831921 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:07.843512 master-0 kubenswrapper[29612]: I0319 12:31:07.843467 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 12:31:07.891480 master-0 kubenswrapper[29612]: I0319 12:31:07.891007 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-combined-ca-bundle\") pod \"8b676207-2926-4cb6-8f24-14859eee1d2f\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " Mar 19 12:31:07.891480 master-0 kubenswrapper[29612]: I0319 12:31:07.891077 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt2cw\" (UniqueName: \"kubernetes.io/projected/8b676207-2926-4cb6-8f24-14859eee1d2f-kube-api-access-jt2cw\") pod \"8b676207-2926-4cb6-8f24-14859eee1d2f\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " Mar 19 12:31:07.891480 master-0 kubenswrapper[29612]: I0319 12:31:07.891247 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-config-data\") pod \"8b676207-2926-4cb6-8f24-14859eee1d2f\" (UID: \"8b676207-2926-4cb6-8f24-14859eee1d2f\") " Mar 19 12:31:07.897277 master-0 kubenswrapper[29612]: I0319 12:31:07.897189 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b676207-2926-4cb6-8f24-14859eee1d2f-kube-api-access-jt2cw" (OuterVolumeSpecName: "kube-api-access-jt2cw") pod "8b676207-2926-4cb6-8f24-14859eee1d2f" (UID: "8b676207-2926-4cb6-8f24-14859eee1d2f"). InnerVolumeSpecName "kube-api-access-jt2cw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:31:07.926499 master-0 kubenswrapper[29612]: I0319 12:31:07.926292 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b676207-2926-4cb6-8f24-14859eee1d2f" (UID: "8b676207-2926-4cb6-8f24-14859eee1d2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:07.938051 master-0 kubenswrapper[29612]: I0319 12:31:07.937994 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-config-data" (OuterVolumeSpecName: "config-data") pod "8b676207-2926-4cb6-8f24-14859eee1d2f" (UID: "8b676207-2926-4cb6-8f24-14859eee1d2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:07.993709 master-0 kubenswrapper[29612]: I0319 12:31:07.993621 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh2gj\" (UniqueName: \"kubernetes.io/projected/df4b1340-dfba-4400-8dbe-452dc3de07c9-kube-api-access-nh2gj\") pod \"df4b1340-dfba-4400-8dbe-452dc3de07c9\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " Mar 19 12:31:07.993709 master-0 kubenswrapper[29612]: I0319 12:31:07.993713 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-combined-ca-bundle\") pod \"df4b1340-dfba-4400-8dbe-452dc3de07c9\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " Mar 19 12:31:07.994006 master-0 kubenswrapper[29612]: I0319 12:31:07.993767 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4b1340-dfba-4400-8dbe-452dc3de07c9-logs\") pod 
\"df4b1340-dfba-4400-8dbe-452dc3de07c9\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " Mar 19 12:31:07.994006 master-0 kubenswrapper[29612]: I0319 12:31:07.993856 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-config-data\") pod \"df4b1340-dfba-4400-8dbe-452dc3de07c9\" (UID: \"df4b1340-dfba-4400-8dbe-452dc3de07c9\") " Mar 19 12:31:07.994282 master-0 kubenswrapper[29612]: I0319 12:31:07.994227 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df4b1340-dfba-4400-8dbe-452dc3de07c9-logs" (OuterVolumeSpecName: "logs") pod "df4b1340-dfba-4400-8dbe-452dc3de07c9" (UID: "df4b1340-dfba-4400-8dbe-452dc3de07c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:31:07.995598 master-0 kubenswrapper[29612]: I0319 12:31:07.995392 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:07.995598 master-0 kubenswrapper[29612]: I0319 12:31:07.995421 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b676207-2926-4cb6-8f24-14859eee1d2f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:07.995598 master-0 kubenswrapper[29612]: I0319 12:31:07.995437 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt2cw\" (UniqueName: \"kubernetes.io/projected/8b676207-2926-4cb6-8f24-14859eee1d2f-kube-api-access-jt2cw\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:07.995598 master-0 kubenswrapper[29612]: I0319 12:31:07.995451 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4b1340-dfba-4400-8dbe-452dc3de07c9-logs\") on node \"master-0\" 
DevicePath \"\"" Mar 19 12:31:07.998310 master-0 kubenswrapper[29612]: I0319 12:31:07.998245 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4b1340-dfba-4400-8dbe-452dc3de07c9-kube-api-access-nh2gj" (OuterVolumeSpecName: "kube-api-access-nh2gj") pod "df4b1340-dfba-4400-8dbe-452dc3de07c9" (UID: "df4b1340-dfba-4400-8dbe-452dc3de07c9"). InnerVolumeSpecName "kube-api-access-nh2gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:31:08.019186 master-0 kubenswrapper[29612]: I0319 12:31:08.019114 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df4b1340-dfba-4400-8dbe-452dc3de07c9" (UID: "df4b1340-dfba-4400-8dbe-452dc3de07c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:08.023932 master-0 kubenswrapper[29612]: I0319 12:31:08.023692 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-config-data" (OuterVolumeSpecName: "config-data") pod "df4b1340-dfba-4400-8dbe-452dc3de07c9" (UID: "df4b1340-dfba-4400-8dbe-452dc3de07c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:08.098183 master-0 kubenswrapper[29612]: I0319 12:31:08.098035 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:08.098183 master-0 kubenswrapper[29612]: I0319 12:31:08.098127 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh2gj\" (UniqueName: \"kubernetes.io/projected/df4b1340-dfba-4400-8dbe-452dc3de07c9-kube-api-access-nh2gj\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:08.098183 master-0 kubenswrapper[29612]: I0319 12:31:08.098140 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4b1340-dfba-4400-8dbe-452dc3de07c9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:08.817549 master-0 kubenswrapper[29612]: I0319 12:31:08.817420 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:08.817549 master-0 kubenswrapper[29612]: I0319 12:31:08.817454 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 12:31:08.817549 master-0 kubenswrapper[29612]: I0319 12:31:08.817429 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df4b1340-dfba-4400-8dbe-452dc3de07c9","Type":"ContainerDied","Data":"344496cc0876bee32e78de20cebcf99195e9d582588bf32a32218fa93483e024"} Mar 19 12:31:08.818709 master-0 kubenswrapper[29612]: I0319 12:31:08.817608 29612 scope.go:117] "RemoveContainer" containerID="2f7337e851369b1c5f42d5d6eb94e324968de5af0f17226f45b89236d33a1bf0" Mar 19 12:31:08.859958 master-0 kubenswrapper[29612]: I0319 12:31:08.859864 29612 scope.go:117] "RemoveContainer" containerID="05a46a57e057a1a613f08be378ad64d6ca4bc9a2a6c8882b1cd77d4c3a46651d" Mar 19 12:31:08.953683 master-0 kubenswrapper[29612]: I0319 12:31:08.952716 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 12:31:08.962679 master-0 kubenswrapper[29612]: I0319 12:31:08.958713 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 12:31:09.000659 master-0 kubenswrapper[29612]: I0319 12:31:08.996585 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:31:09.011582 master-0 kubenswrapper[29612]: I0319 12:31:09.011498 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:31:09.028612 master-0 kubenswrapper[29612]: I0319 12:31:09.028559 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 12:31:09.029242 master-0 kubenswrapper[29612]: E0319 12:31:09.029217 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b676207-2926-4cb6-8f24-14859eee1d2f" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 12:31:09.029242 master-0 kubenswrapper[29612]: I0319 12:31:09.029236 29612 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8b676207-2926-4cb6-8f24-14859eee1d2f" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 12:31:09.029332 master-0 kubenswrapper[29612]: E0319 12:31:09.029275 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4b1340-dfba-4400-8dbe-452dc3de07c9" containerName="nova-metadata-log" Mar 19 12:31:09.029332 master-0 kubenswrapper[29612]: I0319 12:31:09.029284 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4b1340-dfba-4400-8dbe-452dc3de07c9" containerName="nova-metadata-log" Mar 19 12:31:09.029332 master-0 kubenswrapper[29612]: E0319 12:31:09.029316 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4b1340-dfba-4400-8dbe-452dc3de07c9" containerName="nova-metadata-metadata" Mar 19 12:31:09.029332 master-0 kubenswrapper[29612]: I0319 12:31:09.029322 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4b1340-dfba-4400-8dbe-452dc3de07c9" containerName="nova-metadata-metadata" Mar 19 12:31:09.029713 master-0 kubenswrapper[29612]: I0319 12:31:09.029690 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4b1340-dfba-4400-8dbe-452dc3de07c9" containerName="nova-metadata-metadata" Mar 19 12:31:09.029782 master-0 kubenswrapper[29612]: I0319 12:31:09.029748 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b676207-2926-4cb6-8f24-14859eee1d2f" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 12:31:09.029782 master-0 kubenswrapper[29612]: I0319 12:31:09.029764 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4b1340-dfba-4400-8dbe-452dc3de07c9" containerName="nova-metadata-log" Mar 19 12:31:09.030873 master-0 kubenswrapper[29612]: I0319 12:31:09.030848 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.034195 master-0 kubenswrapper[29612]: I0319 12:31:09.032695 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 19 12:31:09.034195 master-0 kubenswrapper[29612]: I0319 12:31:09.032702 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 19 12:31:09.034195 master-0 kubenswrapper[29612]: I0319 12:31:09.033059 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 12:31:09.046285 master-0 kubenswrapper[29612]: I0319 12:31:09.046238 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 12:31:09.065524 master-0 kubenswrapper[29612]: I0319 12:31:09.065434 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:31:09.071236 master-0 kubenswrapper[29612]: I0319 12:31:09.071108 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 12:31:09.074099 master-0 kubenswrapper[29612]: I0319 12:31:09.074049 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 12:31:09.075438 master-0 kubenswrapper[29612]: I0319 12:31:09.075417 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 12:31:09.090048 master-0 kubenswrapper[29612]: I0319 12:31:09.089983 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:31:09.139433 master-0 kubenswrapper[29612]: I0319 12:31:09.139342 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.139433 master-0 kubenswrapper[29612]: I0319 12:31:09.139431 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p759\" (UniqueName: \"kubernetes.io/projected/b79854e6-8240-4e80-83d9-67178212fbd5-kube-api-access-9p759\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.139743 master-0 kubenswrapper[29612]: I0319 12:31:09.139540 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.139743 master-0 kubenswrapper[29612]: I0319 12:31:09.139577 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.139743 master-0 kubenswrapper[29612]: I0319 12:31:09.139610 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.139961 master-0 kubenswrapper[29612]: I0319 12:31:09.139911 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bgq2\" (UniqueName: \"kubernetes.io/projected/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-kube-api-access-7bgq2\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.140163 master-0 kubenswrapper[29612]: I0319 12:31:09.140128 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-logs\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.140222 master-0 kubenswrapper[29612]: I0319 12:31:09.140204 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-config-data\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.140365 master-0 kubenswrapper[29612]: I0319 12:31:09.140340 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.140435 master-0 kubenswrapper[29612]: I0319 12:31:09.140418 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.242162 master-0 kubenswrapper[29612]: I0319 12:31:09.242098 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-logs\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.242162 master-0 kubenswrapper[29612]: I0319 12:31:09.242165 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-config-data\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.242389 master-0 kubenswrapper[29612]: I0319 12:31:09.242204 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.242389 master-0 kubenswrapper[29612]: I0319 12:31:09.242233 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.242389 master-0 kubenswrapper[29612]: I0319 12:31:09.242275 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.242389 master-0 kubenswrapper[29612]: I0319 12:31:09.242307 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p759\" (UniqueName: \"kubernetes.io/projected/b79854e6-8240-4e80-83d9-67178212fbd5-kube-api-access-9p759\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.242389 master-0 kubenswrapper[29612]: I0319 12:31:09.242376 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.242544 master-0 kubenswrapper[29612]: I0319 12:31:09.242396 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.242544 master-0 kubenswrapper[29612]: I0319 12:31:09.242419 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.242544 master-0 kubenswrapper[29612]: I0319 12:31:09.242464 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bgq2\" (UniqueName: \"kubernetes.io/projected/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-kube-api-access-7bgq2\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.242544 master-0 kubenswrapper[29612]: I0319 12:31:09.242524 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-logs\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.245950 master-0 kubenswrapper[29612]: I0319 12:31:09.245908 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.246772 master-0 kubenswrapper[29612]: I0319 12:31:09.246742 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.248182 master-0 kubenswrapper[29612]: I0319 12:31:09.248119 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.248510 master-0 kubenswrapper[29612]: I0319 12:31:09.248440 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-config-data\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.248547 master-0 kubenswrapper[29612]: I0319 12:31:09.248500 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.252148 master-0 kubenswrapper[29612]: I0319 12:31:09.252130 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b79854e6-8240-4e80-83d9-67178212fbd5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.257728 master-0 kubenswrapper[29612]: I0319 12:31:09.257666 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.261354 master-0 kubenswrapper[29612]: I0319 12:31:09.261322 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bgq2\" (UniqueName: \"kubernetes.io/projected/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-kube-api-access-7bgq2\") 
pod \"nova-metadata-0\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " pod="openstack/nova-metadata-0" Mar 19 12:31:09.262970 master-0 kubenswrapper[29612]: I0319 12:31:09.262924 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p759\" (UniqueName: \"kubernetes.io/projected/b79854e6-8240-4e80-83d9-67178212fbd5-kube-api-access-9p759\") pod \"nova-cell1-novncproxy-0\" (UID: \"b79854e6-8240-4e80-83d9-67178212fbd5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.353905 master-0 kubenswrapper[29612]: I0319 12:31:09.353707 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:09.392838 master-0 kubenswrapper[29612]: I0319 12:31:09.392776 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 12:31:10.035057 master-0 kubenswrapper[29612]: I0319 12:31:10.034936 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 12:31:10.035911 master-0 kubenswrapper[29612]: W0319 12:31:10.035845 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb79854e6_8240_4e80_83d9_67178212fbd5.slice/crio-6a0a585ff34a9f8a8de6efa6cc8095965a862abd5d61f3a9a1b643cd57bc330f WatchSource:0}: Error finding container 6a0a585ff34a9f8a8de6efa6cc8095965a862abd5d61f3a9a1b643cd57bc330f: Status 404 returned error can't find the container with id 6a0a585ff34a9f8a8de6efa6cc8095965a862abd5d61f3a9a1b643cd57bc330f Mar 19 12:31:10.162712 master-0 kubenswrapper[29612]: W0319 12:31:10.162659 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef802a6a_c865_4c30_935a_d8c7f9c85e2d.slice/crio-e010e3c1aca27ca954d8a0264f18882c4ff23af6297bfcd73228d02041ce39d6 WatchSource:0}: Error finding container 
e010e3c1aca27ca954d8a0264f18882c4ff23af6297bfcd73228d02041ce39d6: Status 404 returned error can't find the container with id e010e3c1aca27ca954d8a0264f18882c4ff23af6297bfcd73228d02041ce39d6 Mar 19 12:31:10.171232 master-0 kubenswrapper[29612]: I0319 12:31:10.171159 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:31:10.858663 master-0 kubenswrapper[29612]: I0319 12:31:10.857316 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b79854e6-8240-4e80-83d9-67178212fbd5","Type":"ContainerStarted","Data":"59dc70fb3da1bb7b64532239f0c545007b2066c6b50ca9373c0c0a1706a457f7"} Mar 19 12:31:10.858663 master-0 kubenswrapper[29612]: I0319 12:31:10.857387 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b79854e6-8240-4e80-83d9-67178212fbd5","Type":"ContainerStarted","Data":"6a0a585ff34a9f8a8de6efa6cc8095965a862abd5d61f3a9a1b643cd57bc330f"} Mar 19 12:31:10.870374 master-0 kubenswrapper[29612]: I0319 12:31:10.870140 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef802a6a-c865-4c30-935a-d8c7f9c85e2d","Type":"ContainerStarted","Data":"adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d"} Mar 19 12:31:10.870374 master-0 kubenswrapper[29612]: I0319 12:31:10.870311 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef802a6a-c865-4c30-935a-d8c7f9c85e2d","Type":"ContainerStarted","Data":"6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703"} Mar 19 12:31:10.870374 master-0 kubenswrapper[29612]: I0319 12:31:10.870328 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef802a6a-c865-4c30-935a-d8c7f9c85e2d","Type":"ContainerStarted","Data":"e010e3c1aca27ca954d8a0264f18882c4ff23af6297bfcd73228d02041ce39d6"} Mar 19 12:31:10.890490 master-0 
kubenswrapper[29612]: I0319 12:31:10.887166 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.887141323 podStartE2EDuration="2.887141323s" podCreationTimestamp="2026-03-19 12:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:31:10.881458772 +0000 UTC m=+1278.195627793" watchObservedRunningTime="2026-03-19 12:31:10.887141323 +0000 UTC m=+1278.201310354" Mar 19 12:31:10.918654 master-0 kubenswrapper[29612]: I0319 12:31:10.916460 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.916437575 podStartE2EDuration="2.916437575s" podCreationTimestamp="2026-03-19 12:31:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:31:10.900335498 +0000 UTC m=+1278.214504519" watchObservedRunningTime="2026-03-19 12:31:10.916437575 +0000 UTC m=+1278.230606596" Mar 19 12:31:10.930546 master-0 kubenswrapper[29612]: I0319 12:31:10.930481 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b676207-2926-4cb6-8f24-14859eee1d2f" path="/var/lib/kubelet/pods/8b676207-2926-4cb6-8f24-14859eee1d2f/volumes" Mar 19 12:31:10.931151 master-0 kubenswrapper[29612]: I0319 12:31:10.931114 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4b1340-dfba-4400-8dbe-452dc3de07c9" path="/var/lib/kubelet/pods/df4b1340-dfba-4400-8dbe-452dc3de07c9/volumes" Mar 19 12:31:10.975814 master-0 kubenswrapper[29612]: I0319 12:31:10.975749 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 12:31:10.976055 master-0 kubenswrapper[29612]: I0319 12:31:10.975854 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 
12:31:12.979763 master-0 kubenswrapper[29612]: I0319 12:31:12.979712 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 12:31:12.983832 master-0 kubenswrapper[29612]: I0319 12:31:12.983766 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 12:31:13.001808 master-0 kubenswrapper[29612]: I0319 12:31:13.001752 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 12:31:13.914973 master-0 kubenswrapper[29612]: I0319 12:31:13.914894 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 12:31:14.280179 master-0 kubenswrapper[29612]: I0319 12:31:14.279835 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79c4b5fc8f-psc2c"] Mar 19 12:31:14.284143 master-0 kubenswrapper[29612]: I0319 12:31:14.284085 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.300699 master-0 kubenswrapper[29612]: I0319 12:31:14.299628 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79c4b5fc8f-psc2c"] Mar 19 12:31:14.358237 master-0 kubenswrapper[29612]: I0319 12:31:14.358174 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:14.428569 master-0 kubenswrapper[29612]: I0319 12:31:14.428512 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-ovsdbserver-nb\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.428831 master-0 kubenswrapper[29612]: I0319 12:31:14.428664 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-ovsdbserver-sb\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.428831 master-0 kubenswrapper[29612]: I0319 12:31:14.428707 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-dns-swift-storage-0\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.428831 master-0 kubenswrapper[29612]: I0319 12:31:14.428744 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-config\") pod 
\"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.428831 master-0 kubenswrapper[29612]: I0319 12:31:14.428775 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-dns-svc\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.428831 master-0 kubenswrapper[29612]: I0319 12:31:14.428812 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5pzs\" (UniqueName: \"kubernetes.io/projected/22b592d7-0904-4cae-933c-421265ad057e-kube-api-access-j5pzs\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.530780 master-0 kubenswrapper[29612]: I0319 12:31:14.530610 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-config\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.530780 master-0 kubenswrapper[29612]: I0319 12:31:14.530745 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-dns-svc\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.531085 master-0 kubenswrapper[29612]: I0319 12:31:14.530805 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5pzs\" (UniqueName: 
\"kubernetes.io/projected/22b592d7-0904-4cae-933c-421265ad057e-kube-api-access-j5pzs\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.531085 master-0 kubenswrapper[29612]: I0319 12:31:14.530916 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-ovsdbserver-nb\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.531085 master-0 kubenswrapper[29612]: I0319 12:31:14.531060 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-ovsdbserver-sb\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.531299 master-0 kubenswrapper[29612]: I0319 12:31:14.531252 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-dns-swift-storage-0\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.532361 master-0 kubenswrapper[29612]: I0319 12:31:14.532315 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-dns-swift-storage-0\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.532459 master-0 kubenswrapper[29612]: I0319 12:31:14.532417 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-config\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.533236 master-0 kubenswrapper[29612]: I0319 12:31:14.533184 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-dns-svc\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.534247 master-0 kubenswrapper[29612]: I0319 12:31:14.533752 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-ovsdbserver-sb\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.534247 master-0 kubenswrapper[29612]: I0319 12:31:14.533911 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22b592d7-0904-4cae-933c-421265ad057e-ovsdbserver-nb\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.553167 master-0 kubenswrapper[29612]: I0319 12:31:14.553100 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5pzs\" (UniqueName: \"kubernetes.io/projected/22b592d7-0904-4cae-933c-421265ad057e-kube-api-access-j5pzs\") pod \"dnsmasq-dns-79c4b5fc8f-psc2c\" (UID: \"22b592d7-0904-4cae-933c-421265ad057e\") " pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:14.620516 master-0 kubenswrapper[29612]: I0319 12:31:14.620451 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:15.220294 master-0 kubenswrapper[29612]: W0319 12:31:15.220229 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22b592d7_0904_4cae_933c_421265ad057e.slice/crio-2d6f15599c6f5cc7ab985d3f724f486a383bd68c0c4f8c6a9ee8bb02a483928b WatchSource:0}: Error finding container 2d6f15599c6f5cc7ab985d3f724f486a383bd68c0c4f8c6a9ee8bb02a483928b: Status 404 returned error can't find the container with id 2d6f15599c6f5cc7ab985d3f724f486a383bd68c0c4f8c6a9ee8bb02a483928b Mar 19 12:31:15.233110 master-0 kubenswrapper[29612]: I0319 12:31:15.233038 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79c4b5fc8f-psc2c"] Mar 19 12:31:15.940234 master-0 kubenswrapper[29612]: I0319 12:31:15.939277 29612 generic.go:334] "Generic (PLEG): container finished" podID="22b592d7-0904-4cae-933c-421265ad057e" containerID="74445e13e37f359a9928787040b81e3dfc6d9b7eae386e77f058e57ecece8e3c" exitCode=0 Mar 19 12:31:15.940234 master-0 kubenswrapper[29612]: I0319 12:31:15.939410 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" event={"ID":"22b592d7-0904-4cae-933c-421265ad057e","Type":"ContainerDied","Data":"74445e13e37f359a9928787040b81e3dfc6d9b7eae386e77f058e57ecece8e3c"} Mar 19 12:31:15.940234 master-0 kubenswrapper[29612]: I0319 12:31:15.939442 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" event={"ID":"22b592d7-0904-4cae-933c-421265ad057e","Type":"ContainerStarted","Data":"2d6f15599c6f5cc7ab985d3f724f486a383bd68c0c4f8c6a9ee8bb02a483928b"} Mar 19 12:31:16.950523 master-0 kubenswrapper[29612]: I0319 12:31:16.950463 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" 
event={"ID":"22b592d7-0904-4cae-933c-421265ad057e","Type":"ContainerStarted","Data":"8c4258bbedcaaf0b62168d050843c87e2d95bdad29d81645635524498f83c53f"} Mar 19 12:31:16.951091 master-0 kubenswrapper[29612]: I0319 12:31:16.950619 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:16.978664 master-0 kubenswrapper[29612]: I0319 12:31:16.978566 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" podStartSLOduration=2.97854575 podStartE2EDuration="2.97854575s" podCreationTimestamp="2026-03-19 12:31:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:31:16.967554637 +0000 UTC m=+1284.281723658" watchObservedRunningTime="2026-03-19 12:31:16.97854575 +0000 UTC m=+1284.292714771" Mar 19 12:31:17.173202 master-0 kubenswrapper[29612]: I0319 12:31:17.173129 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:17.173455 master-0 kubenswrapper[29612]: I0319 12:31:17.173417 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerName="nova-api-log" containerID="cri-o://2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be" gracePeriod=30 Mar 19 12:31:17.173573 master-0 kubenswrapper[29612]: I0319 12:31:17.173467 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerName="nova-api-api" containerID="cri-o://c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66" gracePeriod=30 Mar 19 12:31:17.965024 master-0 kubenswrapper[29612]: I0319 12:31:17.964956 29612 generic.go:334] "Generic (PLEG): container finished" podID="4a27205b-345b-4df7-9b34-16cab1ac0665" 
containerID="2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be" exitCode=143 Mar 19 12:31:17.965565 master-0 kubenswrapper[29612]: I0319 12:31:17.965032 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a27205b-345b-4df7-9b34-16cab1ac0665","Type":"ContainerDied","Data":"2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be"} Mar 19 12:31:19.355837 master-0 kubenswrapper[29612]: I0319 12:31:19.355303 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:19.393181 master-0 kubenswrapper[29612]: I0319 12:31:19.393080 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 12:31:19.394601 master-0 kubenswrapper[29612]: I0319 12:31:19.393196 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 12:31:19.398993 master-0 kubenswrapper[29612]: I0319 12:31:19.398922 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:20.014871 master-0 kubenswrapper[29612]: I0319 12:31:20.014829 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 19 12:31:20.233725 master-0 kubenswrapper[29612]: I0319 12:31:20.233512 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xd7zk"] Mar 19 12:31:20.235985 master-0 kubenswrapper[29612]: I0319 12:31:20.235280 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.238595 master-0 kubenswrapper[29612]: I0319 12:31:20.238430 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 19 12:31:20.238772 master-0 kubenswrapper[29612]: I0319 12:31:20.238703 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 19 12:31:20.247051 master-0 kubenswrapper[29612]: I0319 12:31:20.246936 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-w9c2p"] Mar 19 12:31:20.262058 master-0 kubenswrapper[29612]: I0319 12:31:20.250216 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.278855 master-0 kubenswrapper[29612]: I0319 12:31:20.278793 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xd7zk"] Mar 19 12:31:20.317469 master-0 kubenswrapper[29612]: I0319 12:31:20.312203 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.317469 master-0 kubenswrapper[29612]: I0319 12:31:20.312349 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-config-data\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.317469 master-0 kubenswrapper[29612]: I0319 12:31:20.312510 29612 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-config-data\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.317469 master-0 kubenswrapper[29612]: I0319 12:31:20.312583 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-scripts\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.317469 master-0 kubenswrapper[29612]: I0319 12:31:20.312725 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-scripts\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.317469 master-0 kubenswrapper[29612]: I0319 12:31:20.312837 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-combined-ca-bundle\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.317469 master-0 kubenswrapper[29612]: I0319 12:31:20.312938 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pzw2\" (UniqueName: \"kubernetes.io/projected/2a53c820-34b9-4c07-a724-063e01459e1f-kube-api-access-6pzw2\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 
19 12:31:20.317469 master-0 kubenswrapper[29612]: I0319 12:31:20.313073 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7tmz\" (UniqueName: \"kubernetes.io/projected/261954af-1d89-4d6d-a269-7407af2c533c-kube-api-access-s7tmz\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.348399 master-0 kubenswrapper[29612]: I0319 12:31:20.348338 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-w9c2p"] Mar 19 12:31:20.413932 master-0 kubenswrapper[29612]: I0319 12:31:20.413865 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.13:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:31:20.416083 master-0 kubenswrapper[29612]: I0319 12:31:20.415822 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.13:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:31:20.416518 master-0 kubenswrapper[29612]: I0319 12:31:20.416465 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7tmz\" (UniqueName: \"kubernetes.io/projected/261954af-1d89-4d6d-a269-7407af2c533c-kube-api-access-s7tmz\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.416732 master-0 kubenswrapper[29612]: I0319 12:31:20.416695 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.416892 master-0 kubenswrapper[29612]: I0319 12:31:20.416785 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-config-data\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.416972 master-0 kubenswrapper[29612]: I0319 12:31:20.416950 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-config-data\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.417024 master-0 kubenswrapper[29612]: I0319 12:31:20.417007 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-scripts\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.417075 master-0 kubenswrapper[29612]: I0319 12:31:20.417042 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-scripts\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.419405 master-0 kubenswrapper[29612]: I0319 12:31:20.419366 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-combined-ca-bundle\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.419572 master-0 kubenswrapper[29612]: I0319 12:31:20.419543 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pzw2\" (UniqueName: \"kubernetes.io/projected/2a53c820-34b9-4c07-a724-063e01459e1f-kube-api-access-6pzw2\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.430950 master-0 kubenswrapper[29612]: I0319 12:31:20.430896 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-scripts\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.431283 master-0 kubenswrapper[29612]: I0319 12:31:20.431231 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-combined-ca-bundle\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.431336 master-0 kubenswrapper[29612]: I0319 12:31:20.431277 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-config-data\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.431336 master-0 kubenswrapper[29612]: I0319 12:31:20.431297 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-scripts\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.431626 master-0 kubenswrapper[29612]: I0319 12:31:20.431569 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-config-data\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.431808 master-0 kubenswrapper[29612]: I0319 12:31:20.431776 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.436326 master-0 kubenswrapper[29612]: I0319 12:31:20.436271 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7tmz\" (UniqueName: \"kubernetes.io/projected/261954af-1d89-4d6d-a269-7407af2c533c-kube-api-access-s7tmz\") pod \"nova-cell1-host-discover-w9c2p\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.442496 master-0 kubenswrapper[29612]: I0319 12:31:20.442312 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pzw2\" (UniqueName: \"kubernetes.io/projected/2a53c820-34b9-4c07-a724-063e01459e1f-kube-api-access-6pzw2\") pod \"nova-cell1-cell-mapping-xd7zk\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.578928 master-0 kubenswrapper[29612]: I0319 12:31:20.578710 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:20.610074 master-0 kubenswrapper[29612]: I0319 12:31:20.606979 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:20.878359 master-0 kubenswrapper[29612]: I0319 12:31:20.878307 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:31:20.948625 master-0 kubenswrapper[29612]: I0319 12:31:20.946994 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-config-data\") pod \"4a27205b-345b-4df7-9b34-16cab1ac0665\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " Mar 19 12:31:20.948625 master-0 kubenswrapper[29612]: I0319 12:31:20.947257 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a27205b-345b-4df7-9b34-16cab1ac0665-logs\") pod \"4a27205b-345b-4df7-9b34-16cab1ac0665\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " Mar 19 12:31:20.948625 master-0 kubenswrapper[29612]: I0319 12:31:20.947336 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-combined-ca-bundle\") pod \"4a27205b-345b-4df7-9b34-16cab1ac0665\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " Mar 19 12:31:20.948625 master-0 kubenswrapper[29612]: I0319 12:31:20.947431 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6lbf\" (UniqueName: \"kubernetes.io/projected/4a27205b-345b-4df7-9b34-16cab1ac0665-kube-api-access-j6lbf\") pod \"4a27205b-345b-4df7-9b34-16cab1ac0665\" (UID: \"4a27205b-345b-4df7-9b34-16cab1ac0665\") " Mar 19 12:31:20.952252 master-0 kubenswrapper[29612]: I0319 12:31:20.952193 
29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a27205b-345b-4df7-9b34-16cab1ac0665-logs" (OuterVolumeSpecName: "logs") pod "4a27205b-345b-4df7-9b34-16cab1ac0665" (UID: "4a27205b-345b-4df7-9b34-16cab1ac0665"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:31:20.953393 master-0 kubenswrapper[29612]: I0319 12:31:20.953352 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a27205b-345b-4df7-9b34-16cab1ac0665-kube-api-access-j6lbf" (OuterVolumeSpecName: "kube-api-access-j6lbf") pod "4a27205b-345b-4df7-9b34-16cab1ac0665" (UID: "4a27205b-345b-4df7-9b34-16cab1ac0665"). InnerVolumeSpecName "kube-api-access-j6lbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:31:20.997468 master-0 kubenswrapper[29612]: I0319 12:31:20.997396 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a27205b-345b-4df7-9b34-16cab1ac0665" (UID: "4a27205b-345b-4df7-9b34-16cab1ac0665"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:21.002371 master-0 kubenswrapper[29612]: I0319 12:31:21.002322 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-config-data" (OuterVolumeSpecName: "config-data") pod "4a27205b-345b-4df7-9b34-16cab1ac0665" (UID: "4a27205b-345b-4df7-9b34-16cab1ac0665"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:21.037711 master-0 kubenswrapper[29612]: I0319 12:31:21.026806 29612 generic.go:334] "Generic (PLEG): container finished" podID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerID="c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66" exitCode=0 Mar 19 12:31:21.037711 master-0 kubenswrapper[29612]: I0319 12:31:21.027791 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:31:21.037711 master-0 kubenswrapper[29612]: I0319 12:31:21.028120 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a27205b-345b-4df7-9b34-16cab1ac0665","Type":"ContainerDied","Data":"c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66"} Mar 19 12:31:21.037711 master-0 kubenswrapper[29612]: I0319 12:31:21.028148 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a27205b-345b-4df7-9b34-16cab1ac0665","Type":"ContainerDied","Data":"1502246dff981f15ceba71f48dd51517bc7ce0e09f1300ea0928e83b8b67aa8b"} Mar 19 12:31:21.037711 master-0 kubenswrapper[29612]: I0319 12:31:21.028165 29612 scope.go:117] "RemoveContainer" containerID="c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66" Mar 19 12:31:21.050750 master-0 kubenswrapper[29612]: I0319 12:31:21.050361 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a27205b-345b-4df7-9b34-16cab1ac0665-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:21.050750 master-0 kubenswrapper[29612]: I0319 12:31:21.050410 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:21.050750 master-0 kubenswrapper[29612]: I0319 12:31:21.050421 29612 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-j6lbf\" (UniqueName: \"kubernetes.io/projected/4a27205b-345b-4df7-9b34-16cab1ac0665-kube-api-access-j6lbf\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:21.050750 master-0 kubenswrapper[29612]: I0319 12:31:21.050432 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a27205b-345b-4df7-9b34-16cab1ac0665-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:21.082218 master-0 kubenswrapper[29612]: I0319 12:31:21.080700 29612 scope.go:117] "RemoveContainer" containerID="2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be" Mar 19 12:31:21.090938 master-0 kubenswrapper[29612]: I0319 12:31:21.090898 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:21.102808 master-0 kubenswrapper[29612]: I0319 12:31:21.102613 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:21.152430 master-0 kubenswrapper[29612]: I0319 12:31:21.130941 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:21.152430 master-0 kubenswrapper[29612]: E0319 12:31:21.131434 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerName="nova-api-log" Mar 19 12:31:21.152430 master-0 kubenswrapper[29612]: I0319 12:31:21.131449 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerName="nova-api-log" Mar 19 12:31:21.152430 master-0 kubenswrapper[29612]: E0319 12:31:21.131719 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerName="nova-api-api" Mar 19 12:31:21.152430 master-0 kubenswrapper[29612]: I0319 12:31:21.131726 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerName="nova-api-api" Mar 19 12:31:21.152430 
master-0 kubenswrapper[29612]: I0319 12:31:21.131951 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerName="nova-api-log" Mar 19 12:31:21.152430 master-0 kubenswrapper[29612]: I0319 12:31:21.131972 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" containerName="nova-api-api" Mar 19 12:31:21.152430 master-0 kubenswrapper[29612]: I0319 12:31:21.133621 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:31:21.165258 master-0 kubenswrapper[29612]: I0319 12:31:21.163955 29612 scope.go:117] "RemoveContainer" containerID="c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66" Mar 19 12:31:21.165258 master-0 kubenswrapper[29612]: I0319 12:31:21.164603 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 12:31:21.165258 master-0 kubenswrapper[29612]: I0319 12:31:21.164823 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 12:31:21.165258 master-0 kubenswrapper[29612]: I0319 12:31:21.164923 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 12:31:21.165258 master-0 kubenswrapper[29612]: E0319 12:31:21.165126 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66\": container with ID starting with c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66 not found: ID does not exist" containerID="c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66" Mar 19 12:31:21.165258 master-0 kubenswrapper[29612]: I0319 12:31:21.165171 29612 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66"} err="failed to get container status \"c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66\": rpc error: code = NotFound desc = could not find container \"c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66\": container with ID starting with c0d3ad83a53fb0eb7e3c6d71dde22b9fdf24717061b68d4508b5deeaf69b2b66 not found: ID does not exist" Mar 19 12:31:21.165258 master-0 kubenswrapper[29612]: I0319 12:31:21.165200 29612 scope.go:117] "RemoveContainer" containerID="2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be" Mar 19 12:31:21.170221 master-0 kubenswrapper[29612]: E0319 12:31:21.170073 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be\": container with ID starting with 2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be not found: ID does not exist" containerID="2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be" Mar 19 12:31:21.170221 master-0 kubenswrapper[29612]: I0319 12:31:21.170138 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be"} err="failed to get container status \"2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be\": rpc error: code = NotFound desc = could not find container \"2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be\": container with ID starting with 2f0e5b1990202bb9e87bf6dbf70b2c0e286f4b043381fbe0c273e02d55f479be not found: ID does not exist" Mar 19 12:31:21.199384 master-0 kubenswrapper[29612]: I0319 12:31:21.196985 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:21.250104 master-0 kubenswrapper[29612]: I0319 12:31:21.250046 29612 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xd7zk"] Mar 19 12:31:21.255789 master-0 kubenswrapper[29612]: I0319 12:31:21.255740 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.255892 master-0 kubenswrapper[29612]: I0319 12:31:21.255876 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq6rs\" (UniqueName: \"kubernetes.io/projected/f34f8258-4d21-4168-8cd4-93b13b88f069-kube-api-access-fq6rs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.255948 master-0 kubenswrapper[29612]: I0319 12:31:21.255927 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-config-data\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.255981 master-0 kubenswrapper[29612]: I0319 12:31:21.255953 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34f8258-4d21-4168-8cd4-93b13b88f069-logs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.255981 master-0 kubenswrapper[29612]: I0319 12:31:21.255974 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-public-tls-certs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " 
pod="openstack/nova-api-0" Mar 19 12:31:21.256050 master-0 kubenswrapper[29612]: I0319 12:31:21.256019 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.318673 master-0 kubenswrapper[29612]: I0319 12:31:21.315141 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-w9c2p"] Mar 19 12:31:21.358581 master-0 kubenswrapper[29612]: I0319 12:31:21.358459 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.358775 master-0 kubenswrapper[29612]: I0319 12:31:21.358591 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq6rs\" (UniqueName: \"kubernetes.io/projected/f34f8258-4d21-4168-8cd4-93b13b88f069-kube-api-access-fq6rs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.358775 master-0 kubenswrapper[29612]: I0319 12:31:21.358722 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-config-data\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.358992 master-0 kubenswrapper[29612]: I0319 12:31:21.358952 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34f8258-4d21-4168-8cd4-93b13b88f069-logs\") pod \"nova-api-0\" (UID: 
\"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.359100 master-0 kubenswrapper[29612]: I0319 12:31:21.359086 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-public-tls-certs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.359426 master-0 kubenswrapper[29612]: I0319 12:31:21.359385 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.359722 master-0 kubenswrapper[29612]: I0319 12:31:21.359649 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34f8258-4d21-4168-8cd4-93b13b88f069-logs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.362874 master-0 kubenswrapper[29612]: I0319 12:31:21.362845 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-public-tls-certs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.365494 master-0 kubenswrapper[29612]: I0319 12:31:21.365450 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-config-data\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.367107 master-0 kubenswrapper[29612]: I0319 12:31:21.367065 29612 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.373212 master-0 kubenswrapper[29612]: I0319 12:31:21.373181 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.377817 master-0 kubenswrapper[29612]: I0319 12:31:21.377674 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq6rs\" (UniqueName: \"kubernetes.io/projected/f34f8258-4d21-4168-8cd4-93b13b88f069-kube-api-access-fq6rs\") pod \"nova-api-0\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " pod="openstack/nova-api-0" Mar 19 12:31:21.522484 master-0 kubenswrapper[29612]: I0319 12:31:21.522422 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:31:22.043269 master-0 kubenswrapper[29612]: I0319 12:31:22.043003 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xd7zk" event={"ID":"2a53c820-34b9-4c07-a724-063e01459e1f","Type":"ContainerStarted","Data":"e305787b78d3397b1225d76b87cdd972c30bd8e70f9f2cd96db58d319c106124"} Mar 19 12:31:22.043532 master-0 kubenswrapper[29612]: I0319 12:31:22.043280 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xd7zk" event={"ID":"2a53c820-34b9-4c07-a724-063e01459e1f","Type":"ContainerStarted","Data":"c5948989aba0c971a3072ccac3a0a05c4eeebac5da0f18f460f4c3fb9307d766"} Mar 19 12:31:22.045848 master-0 kubenswrapper[29612]: I0319 12:31:22.045783 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-w9c2p" event={"ID":"261954af-1d89-4d6d-a269-7407af2c533c","Type":"ContainerStarted","Data":"370ab7fbe3f8f512c9f251de300927d9e115ff24902f3e159de30997fdc00c91"} Mar 19 12:31:22.045848 master-0 kubenswrapper[29612]: I0319 12:31:22.045849 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-w9c2p" event={"ID":"261954af-1d89-4d6d-a269-7407af2c533c","Type":"ContainerStarted","Data":"a52ae9db310d23930f4ba5d4ba036031570aa356f69256735cc5222d37350d11"} Mar 19 12:31:22.182618 master-0 kubenswrapper[29612]: W0319 12:31:22.180779 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf34f8258_4d21_4168_8cd4_93b13b88f069.slice/crio-88339b842913c4350fcccda024558a4a6f43b3987a6d8fd285bfb9b1b241a05e WatchSource:0}: Error finding container 88339b842913c4350fcccda024558a4a6f43b3987a6d8fd285bfb9b1b241a05e: Status 404 returned error can't find the container with id 88339b842913c4350fcccda024558a4a6f43b3987a6d8fd285bfb9b1b241a05e Mar 19 12:31:22.196000 master-0 kubenswrapper[29612]: I0319 
12:31:22.195947 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:22.214798 master-0 kubenswrapper[29612]: I0319 12:31:22.214301 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xd7zk" podStartSLOduration=2.214276942 podStartE2EDuration="2.214276942s" podCreationTimestamp="2026-03-19 12:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:31:22.206906932 +0000 UTC m=+1289.521075963" watchObservedRunningTime="2026-03-19 12:31:22.214276942 +0000 UTC m=+1289.528445963" Mar 19 12:31:22.248530 master-0 kubenswrapper[29612]: I0319 12:31:22.248417 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-host-discover-w9c2p" podStartSLOduration=2.248397651 podStartE2EDuration="2.248397651s" podCreationTimestamp="2026-03-19 12:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:31:22.243329367 +0000 UTC m=+1289.557498428" watchObservedRunningTime="2026-03-19 12:31:22.248397651 +0000 UTC m=+1289.562566672" Mar 19 12:31:22.920597 master-0 kubenswrapper[29612]: I0319 12:31:22.920511 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a27205b-345b-4df7-9b34-16cab1ac0665" path="/var/lib/kubelet/pods/4a27205b-345b-4df7-9b34-16cab1ac0665/volumes" Mar 19 12:31:23.061404 master-0 kubenswrapper[29612]: I0319 12:31:23.061354 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34f8258-4d21-4168-8cd4-93b13b88f069","Type":"ContainerStarted","Data":"3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2"} Mar 19 12:31:23.061404 master-0 kubenswrapper[29612]: I0319 12:31:23.061404 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"f34f8258-4d21-4168-8cd4-93b13b88f069","Type":"ContainerStarted","Data":"5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9"} Mar 19 12:31:23.061404 master-0 kubenswrapper[29612]: I0319 12:31:23.061416 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34f8258-4d21-4168-8cd4-93b13b88f069","Type":"ContainerStarted","Data":"88339b842913c4350fcccda024558a4a6f43b3987a6d8fd285bfb9b1b241a05e"} Mar 19 12:31:23.092819 master-0 kubenswrapper[29612]: I0319 12:31:23.092727 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.092709654 podStartE2EDuration="2.092709654s" podCreationTimestamp="2026-03-19 12:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:31:23.082910086 +0000 UTC m=+1290.397079117" watchObservedRunningTime="2026-03-19 12:31:23.092709654 +0000 UTC m=+1290.406878685" Mar 19 12:31:24.623996 master-0 kubenswrapper[29612]: I0319 12:31:24.623923 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79c4b5fc8f-psc2c" Mar 19 12:31:24.774213 master-0 kubenswrapper[29612]: I0319 12:31:24.773712 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58cf5bfdcc-55c55"] Mar 19 12:31:24.774213 master-0 kubenswrapper[29612]: I0319 12:31:24.774100 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" podUID="a986f3f0-42d2-4d2b-a898-03c5b9f30c92" containerName="dnsmasq-dns" containerID="cri-o://530b35a7818a248894bc17f8526f98e304a28482d0d84c9c4db2a2634e3cfc6b" gracePeriod=10 Mar 19 12:31:25.102803 master-0 kubenswrapper[29612]: I0319 12:31:25.102258 29612 generic.go:334] "Generic (PLEG): container finished" podID="261954af-1d89-4d6d-a269-7407af2c533c" 
containerID="370ab7fbe3f8f512c9f251de300927d9e115ff24902f3e159de30997fdc00c91" exitCode=0 Mar 19 12:31:25.102990 master-0 kubenswrapper[29612]: I0319 12:31:25.102807 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-w9c2p" event={"ID":"261954af-1d89-4d6d-a269-7407af2c533c","Type":"ContainerDied","Data":"370ab7fbe3f8f512c9f251de300927d9e115ff24902f3e159de30997fdc00c91"} Mar 19 12:31:25.115839 master-0 kubenswrapper[29612]: I0319 12:31:25.113133 29612 generic.go:334] "Generic (PLEG): container finished" podID="a986f3f0-42d2-4d2b-a898-03c5b9f30c92" containerID="530b35a7818a248894bc17f8526f98e304a28482d0d84c9c4db2a2634e3cfc6b" exitCode=0 Mar 19 12:31:25.115839 master-0 kubenswrapper[29612]: I0319 12:31:25.113187 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" event={"ID":"a986f3f0-42d2-4d2b-a898-03c5b9f30c92","Type":"ContainerDied","Data":"530b35a7818a248894bc17f8526f98e304a28482d0d84c9c4db2a2634e3cfc6b"} Mar 19 12:31:25.452874 master-0 kubenswrapper[29612]: I0319 12:31:25.452813 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:31:25.612325 master-0 kubenswrapper[29612]: I0319 12:31:25.612131 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-config\") pod \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " Mar 19 12:31:25.614343 master-0 kubenswrapper[29612]: I0319 12:31:25.612534 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlw27\" (UniqueName: \"kubernetes.io/projected/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-kube-api-access-mlw27\") pod \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " Mar 19 12:31:25.614343 master-0 kubenswrapper[29612]: I0319 12:31:25.612881 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-nb\") pod \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " Mar 19 12:31:25.614343 master-0 kubenswrapper[29612]: I0319 12:31:25.612945 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-sb\") pod \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " Mar 19 12:31:25.614343 master-0 kubenswrapper[29612]: I0319 12:31:25.612965 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-svc\") pod \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " Mar 19 12:31:25.614343 master-0 kubenswrapper[29612]: I0319 12:31:25.613013 29612 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-swift-storage-0\") pod \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\" (UID: \"a986f3f0-42d2-4d2b-a898-03c5b9f30c92\") " Mar 19 12:31:25.622904 master-0 kubenswrapper[29612]: I0319 12:31:25.619901 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-kube-api-access-mlw27" (OuterVolumeSpecName: "kube-api-access-mlw27") pod "a986f3f0-42d2-4d2b-a898-03c5b9f30c92" (UID: "a986f3f0-42d2-4d2b-a898-03c5b9f30c92"). InnerVolumeSpecName "kube-api-access-mlw27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:31:25.729716 master-0 kubenswrapper[29612]: I0319 12:31:25.725847 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlw27\" (UniqueName: \"kubernetes.io/projected/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-kube-api-access-mlw27\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:26.033668 master-0 kubenswrapper[29612]: I0319 12:31:26.033601 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-config" (OuterVolumeSpecName: "config") pod "a986f3f0-42d2-4d2b-a898-03c5b9f30c92" (UID: "a986f3f0-42d2-4d2b-a898-03c5b9f30c92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:31:26.040627 master-0 kubenswrapper[29612]: I0319 12:31:26.040305 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a986f3f0-42d2-4d2b-a898-03c5b9f30c92" (UID: "a986f3f0-42d2-4d2b-a898-03c5b9f30c92"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:31:26.040862 master-0 kubenswrapper[29612]: I0319 12:31:26.040705 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a986f3f0-42d2-4d2b-a898-03c5b9f30c92" (UID: "a986f3f0-42d2-4d2b-a898-03c5b9f30c92"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:31:26.041249 master-0 kubenswrapper[29612]: I0319 12:31:26.041155 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a986f3f0-42d2-4d2b-a898-03c5b9f30c92" (UID: "a986f3f0-42d2-4d2b-a898-03c5b9f30c92"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:31:26.044513 master-0 kubenswrapper[29612]: I0319 12:31:26.044479 29612 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:26.044690 master-0 kubenswrapper[29612]: I0319 12:31:26.044523 29612 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:26.044690 master-0 kubenswrapper[29612]: I0319 12:31:26.044542 29612 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-config\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:26.044690 master-0 kubenswrapper[29612]: I0319 12:31:26.044561 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:26.101620 master-0 kubenswrapper[29612]: I0319 12:31:26.101540 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a986f3f0-42d2-4d2b-a898-03c5b9f30c92" (UID: "a986f3f0-42d2-4d2b-a898-03c5b9f30c92"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 12:31:26.132761 master-0 kubenswrapper[29612]: I0319 12:31:26.130988 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" event={"ID":"a986f3f0-42d2-4d2b-a898-03c5b9f30c92","Type":"ContainerDied","Data":"149d48310e87f471e2ddc91a31d86f96be0d49019ef581b091f349091dc5ab41"} Mar 19 12:31:26.132761 master-0 kubenswrapper[29612]: I0319 12:31:26.131064 29612 scope.go:117] "RemoveContainer" containerID="530b35a7818a248894bc17f8526f98e304a28482d0d84c9c4db2a2634e3cfc6b" Mar 19 12:31:26.132761 master-0 kubenswrapper[29612]: I0319 12:31:26.131233 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58cf5bfdcc-55c55" Mar 19 12:31:26.153397 master-0 kubenswrapper[29612]: I0319 12:31:26.153116 29612 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a986f3f0-42d2-4d2b-a898-03c5b9f30c92-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:26.186429 master-0 kubenswrapper[29612]: I0319 12:31:26.185076 29612 scope.go:117] "RemoveContainer" containerID="9fc1ef0a5cb2601acf3fb337bbaa5ec7875121dd63ec86a1dc5678e0992fa751" Mar 19 12:31:26.195572 master-0 kubenswrapper[29612]: I0319 12:31:26.195140 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58cf5bfdcc-55c55"] Mar 19 12:31:26.210659 master-0 kubenswrapper[29612]: I0319 12:31:26.208835 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58cf5bfdcc-55c55"] Mar 19 12:31:26.588830 master-0 kubenswrapper[29612]: I0319 12:31:26.588792 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:26.767958 master-0 kubenswrapper[29612]: I0319 12:31:26.767696 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7tmz\" (UniqueName: \"kubernetes.io/projected/261954af-1d89-4d6d-a269-7407af2c533c-kube-api-access-s7tmz\") pod \"261954af-1d89-4d6d-a269-7407af2c533c\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " Mar 19 12:31:26.767958 master-0 kubenswrapper[29612]: I0319 12:31:26.767813 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-combined-ca-bundle\") pod \"261954af-1d89-4d6d-a269-7407af2c533c\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " Mar 19 12:31:26.767958 master-0 kubenswrapper[29612]: I0319 12:31:26.767878 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-config-data\") pod \"261954af-1d89-4d6d-a269-7407af2c533c\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " Mar 19 12:31:26.767958 master-0 kubenswrapper[29612]: I0319 12:31:26.767901 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-scripts\") pod \"261954af-1d89-4d6d-a269-7407af2c533c\" (UID: \"261954af-1d89-4d6d-a269-7407af2c533c\") " Mar 19 12:31:26.787395 master-0 kubenswrapper[29612]: I0319 12:31:26.787318 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/261954af-1d89-4d6d-a269-7407af2c533c-kube-api-access-s7tmz" (OuterVolumeSpecName: "kube-api-access-s7tmz") pod "261954af-1d89-4d6d-a269-7407af2c533c" (UID: "261954af-1d89-4d6d-a269-7407af2c533c"). InnerVolumeSpecName "kube-api-access-s7tmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:31:26.791388 master-0 kubenswrapper[29612]: I0319 12:31:26.789119 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-scripts" (OuterVolumeSpecName: "scripts") pod "261954af-1d89-4d6d-a269-7407af2c533c" (UID: "261954af-1d89-4d6d-a269-7407af2c533c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:26.805264 master-0 kubenswrapper[29612]: I0319 12:31:26.805197 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "261954af-1d89-4d6d-a269-7407af2c533c" (UID: "261954af-1d89-4d6d-a269-7407af2c533c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:26.809012 master-0 kubenswrapper[29612]: I0319 12:31:26.808958 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-config-data" (OuterVolumeSpecName: "config-data") pod "261954af-1d89-4d6d-a269-7407af2c533c" (UID: "261954af-1d89-4d6d-a269-7407af2c533c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:26.871183 master-0 kubenswrapper[29612]: I0319 12:31:26.871080 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7tmz\" (UniqueName: \"kubernetes.io/projected/261954af-1d89-4d6d-a269-7407af2c533c-kube-api-access-s7tmz\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:26.871183 master-0 kubenswrapper[29612]: I0319 12:31:26.871131 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:26.871183 master-0 kubenswrapper[29612]: I0319 12:31:26.871141 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:26.871183 master-0 kubenswrapper[29612]: I0319 12:31:26.871150 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/261954af-1d89-4d6d-a269-7407af2c533c-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:26.918702 master-0 kubenswrapper[29612]: I0319 12:31:26.918653 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a986f3f0-42d2-4d2b-a898-03c5b9f30c92" path="/var/lib/kubelet/pods/a986f3f0-42d2-4d2b-a898-03c5b9f30c92/volumes" Mar 19 12:31:27.145796 master-0 kubenswrapper[29612]: I0319 12:31:27.145738 29612 generic.go:334] "Generic (PLEG): container finished" podID="2a53c820-34b9-4c07-a724-063e01459e1f" containerID="e305787b78d3397b1225d76b87cdd972c30bd8e70f9f2cd96db58d319c106124" exitCode=0 Mar 19 12:31:27.146110 master-0 kubenswrapper[29612]: I0319 12:31:27.145881 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xd7zk" 
event={"ID":"2a53c820-34b9-4c07-a724-063e01459e1f","Type":"ContainerDied","Data":"e305787b78d3397b1225d76b87cdd972c30bd8e70f9f2cd96db58d319c106124"} Mar 19 12:31:27.148625 master-0 kubenswrapper[29612]: I0319 12:31:27.148258 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-w9c2p" event={"ID":"261954af-1d89-4d6d-a269-7407af2c533c","Type":"ContainerDied","Data":"a52ae9db310d23930f4ba5d4ba036031570aa356f69256735cc5222d37350d11"} Mar 19 12:31:27.148625 master-0 kubenswrapper[29612]: I0319 12:31:27.148297 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a52ae9db310d23930f4ba5d4ba036031570aa356f69256735cc5222d37350d11" Mar 19 12:31:27.148625 master-0 kubenswrapper[29612]: I0319 12:31:27.148270 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-w9c2p" Mar 19 12:31:27.394046 master-0 kubenswrapper[29612]: I0319 12:31:27.393985 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 12:31:27.395656 master-0 kubenswrapper[29612]: I0319 12:31:27.395592 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 12:31:28.754506 master-0 kubenswrapper[29612]: I0319 12:31:28.754445 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:28.858074 master-0 kubenswrapper[29612]: I0319 12:31:28.858041 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-config-data\") pod \"2a53c820-34b9-4c07-a724-063e01459e1f\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " Mar 19 12:31:28.858200 master-0 kubenswrapper[29612]: I0319 12:31:28.858185 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-scripts\") pod \"2a53c820-34b9-4c07-a724-063e01459e1f\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " Mar 19 12:31:28.858256 master-0 kubenswrapper[29612]: I0319 12:31:28.858247 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pzw2\" (UniqueName: \"kubernetes.io/projected/2a53c820-34b9-4c07-a724-063e01459e1f-kube-api-access-6pzw2\") pod \"2a53c820-34b9-4c07-a724-063e01459e1f\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " Mar 19 12:31:28.858441 master-0 kubenswrapper[29612]: I0319 12:31:28.858427 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-combined-ca-bundle\") pod \"2a53c820-34b9-4c07-a724-063e01459e1f\" (UID: \"2a53c820-34b9-4c07-a724-063e01459e1f\") " Mar 19 12:31:28.862026 master-0 kubenswrapper[29612]: I0319 12:31:28.861972 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-scripts" (OuterVolumeSpecName: "scripts") pod "2a53c820-34b9-4c07-a724-063e01459e1f" (UID: "2a53c820-34b9-4c07-a724-063e01459e1f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:28.863016 master-0 kubenswrapper[29612]: I0319 12:31:28.862952 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a53c820-34b9-4c07-a724-063e01459e1f-kube-api-access-6pzw2" (OuterVolumeSpecName: "kube-api-access-6pzw2") pod "2a53c820-34b9-4c07-a724-063e01459e1f" (UID: "2a53c820-34b9-4c07-a724-063e01459e1f"). InnerVolumeSpecName "kube-api-access-6pzw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:31:28.888181 master-0 kubenswrapper[29612]: I0319 12:31:28.888127 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-config-data" (OuterVolumeSpecName: "config-data") pod "2a53c820-34b9-4c07-a724-063e01459e1f" (UID: "2a53c820-34b9-4c07-a724-063e01459e1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:28.892597 master-0 kubenswrapper[29612]: I0319 12:31:28.892511 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a53c820-34b9-4c07-a724-063e01459e1f" (UID: "2a53c820-34b9-4c07-a724-063e01459e1f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:28.962085 master-0 kubenswrapper[29612]: I0319 12:31:28.962015 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:28.962085 master-0 kubenswrapper[29612]: I0319 12:31:28.962067 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:28.962085 master-0 kubenswrapper[29612]: I0319 12:31:28.962082 29612 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a53c820-34b9-4c07-a724-063e01459e1f-scripts\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:28.962085 master-0 kubenswrapper[29612]: I0319 12:31:28.962096 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pzw2\" (UniqueName: \"kubernetes.io/projected/2a53c820-34b9-4c07-a724-063e01459e1f-kube-api-access-6pzw2\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:29.174096 master-0 kubenswrapper[29612]: I0319 12:31:29.174026 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xd7zk" event={"ID":"2a53c820-34b9-4c07-a724-063e01459e1f","Type":"ContainerDied","Data":"c5948989aba0c971a3072ccac3a0a05c4eeebac5da0f18f460f4c3fb9307d766"} Mar 19 12:31:29.174330 master-0 kubenswrapper[29612]: I0319 12:31:29.174309 29612 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5948989aba0c971a3072ccac3a0a05c4eeebac5da0f18f460f4c3fb9307d766" Mar 19 12:31:29.174447 master-0 kubenswrapper[29612]: I0319 12:31:29.174104 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xd7zk" Mar 19 12:31:29.391614 master-0 kubenswrapper[29612]: I0319 12:31:29.391566 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:29.392169 master-0 kubenswrapper[29612]: I0319 12:31:29.392135 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f34f8258-4d21-4168-8cd4-93b13b88f069" containerName="nova-api-log" containerID="cri-o://5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9" gracePeriod=30 Mar 19 12:31:29.392411 master-0 kubenswrapper[29612]: I0319 12:31:29.392155 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f34f8258-4d21-4168-8cd4-93b13b88f069" containerName="nova-api-api" containerID="cri-o://3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2" gracePeriod=30 Mar 19 12:31:29.412110 master-0 kubenswrapper[29612]: I0319 12:31:29.412056 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 12:31:29.427771 master-0 kubenswrapper[29612]: I0319 12:31:29.427324 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:31:29.427771 master-0 kubenswrapper[29612]: I0319 12:31:29.427590 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="24df43e9-37cb-4632-9a19-5b602c2742f9" containerName="nova-scheduler-scheduler" containerID="cri-o://3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057" gracePeriod=30 Mar 19 12:31:29.434321 master-0 kubenswrapper[29612]: I0319 12:31:29.434266 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 12:31:29.438734 master-0 kubenswrapper[29612]: I0319 12:31:29.438623 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Mar 19 12:31:29.460758 master-0 kubenswrapper[29612]: I0319 12:31:29.460116 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:31:29.983688 master-0 kubenswrapper[29612]: I0319 12:31:29.983060 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:31:30.105202 master-0 kubenswrapper[29612]: I0319 12:31:30.105137 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-public-tls-certs\") pod \"f34f8258-4d21-4168-8cd4-93b13b88f069\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " Mar 19 12:31:30.105202 master-0 kubenswrapper[29612]: I0319 12:31:30.105211 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-combined-ca-bundle\") pod \"f34f8258-4d21-4168-8cd4-93b13b88f069\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " Mar 19 12:31:30.105541 master-0 kubenswrapper[29612]: I0319 12:31:30.105307 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-config-data\") pod \"f34f8258-4d21-4168-8cd4-93b13b88f069\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " Mar 19 12:31:30.105541 master-0 kubenswrapper[29612]: I0319 12:31:30.105426 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-internal-tls-certs\") pod \"f34f8258-4d21-4168-8cd4-93b13b88f069\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " Mar 19 12:31:30.105541 master-0 kubenswrapper[29612]: I0319 12:31:30.105470 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34f8258-4d21-4168-8cd4-93b13b88f069-logs\") pod \"f34f8258-4d21-4168-8cd4-93b13b88f069\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " Mar 19 12:31:30.105541 master-0 kubenswrapper[29612]: I0319 12:31:30.105501 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq6rs\" (UniqueName: \"kubernetes.io/projected/f34f8258-4d21-4168-8cd4-93b13b88f069-kube-api-access-fq6rs\") pod \"f34f8258-4d21-4168-8cd4-93b13b88f069\" (UID: \"f34f8258-4d21-4168-8cd4-93b13b88f069\") " Mar 19 12:31:30.105940 master-0 kubenswrapper[29612]: I0319 12:31:30.105884 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f34f8258-4d21-4168-8cd4-93b13b88f069-logs" (OuterVolumeSpecName: "logs") pod "f34f8258-4d21-4168-8cd4-93b13b88f069" (UID: "f34f8258-4d21-4168-8cd4-93b13b88f069"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:31:30.106609 master-0 kubenswrapper[29612]: I0319 12:31:30.106577 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34f8258-4d21-4168-8cd4-93b13b88f069-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:30.109950 master-0 kubenswrapper[29612]: I0319 12:31:30.109868 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34f8258-4d21-4168-8cd4-93b13b88f069-kube-api-access-fq6rs" (OuterVolumeSpecName: "kube-api-access-fq6rs") pod "f34f8258-4d21-4168-8cd4-93b13b88f069" (UID: "f34f8258-4d21-4168-8cd4-93b13b88f069"). InnerVolumeSpecName "kube-api-access-fq6rs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:31:30.134299 master-0 kubenswrapper[29612]: I0319 12:31:30.134239 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f34f8258-4d21-4168-8cd4-93b13b88f069" (UID: "f34f8258-4d21-4168-8cd4-93b13b88f069"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:30.154898 master-0 kubenswrapper[29612]: I0319 12:31:30.154841 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-config-data" (OuterVolumeSpecName: "config-data") pod "f34f8258-4d21-4168-8cd4-93b13b88f069" (UID: "f34f8258-4d21-4168-8cd4-93b13b88f069"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:30.171391 master-0 kubenswrapper[29612]: I0319 12:31:30.171338 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f34f8258-4d21-4168-8cd4-93b13b88f069" (UID: "f34f8258-4d21-4168-8cd4-93b13b88f069"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:30.174511 master-0 kubenswrapper[29612]: I0319 12:31:30.174443 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f34f8258-4d21-4168-8cd4-93b13b88f069" (UID: "f34f8258-4d21-4168-8cd4-93b13b88f069"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:30.198743 master-0 kubenswrapper[29612]: I0319 12:31:30.198696 29612 generic.go:334] "Generic (PLEG): container finished" podID="f34f8258-4d21-4168-8cd4-93b13b88f069" containerID="3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2" exitCode=0 Mar 19 12:31:30.200031 master-0 kubenswrapper[29612]: I0319 12:31:30.198975 29612 generic.go:334] "Generic (PLEG): container finished" podID="f34f8258-4d21-4168-8cd4-93b13b88f069" containerID="5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9" exitCode=143 Mar 19 12:31:30.200031 master-0 kubenswrapper[29612]: I0319 12:31:30.199396 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34f8258-4d21-4168-8cd4-93b13b88f069","Type":"ContainerDied","Data":"3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2"} Mar 19 12:31:30.200031 master-0 kubenswrapper[29612]: I0319 12:31:30.199595 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34f8258-4d21-4168-8cd4-93b13b88f069","Type":"ContainerDied","Data":"5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9"} Mar 19 12:31:30.200031 master-0 kubenswrapper[29612]: I0319 12:31:30.199614 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f34f8258-4d21-4168-8cd4-93b13b88f069","Type":"ContainerDied","Data":"88339b842913c4350fcccda024558a4a6f43b3987a6d8fd285bfb9b1b241a05e"} Mar 19 12:31:30.200031 master-0 kubenswrapper[29612]: I0319 12:31:30.199652 29612 scope.go:117] "RemoveContainer" containerID="3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2" Mar 19 12:31:30.201276 master-0 kubenswrapper[29612]: I0319 12:31:30.201241 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:31:30.207673 master-0 kubenswrapper[29612]: I0319 12:31:30.207624 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 12:31:30.208400 master-0 kubenswrapper[29612]: I0319 12:31:30.208346 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:30.208400 master-0 kubenswrapper[29612]: I0319 12:31:30.208385 29612 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:30.208400 master-0 kubenswrapper[29612]: I0319 12:31:30.208396 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq6rs\" (UniqueName: \"kubernetes.io/projected/f34f8258-4d21-4168-8cd4-93b13b88f069-kube-api-access-fq6rs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:30.208400 master-0 kubenswrapper[29612]: I0319 12:31:30.208406 29612 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:30.208818 master-0 kubenswrapper[29612]: I0319 12:31:30.208415 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34f8258-4d21-4168-8cd4-93b13b88f069-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:30.312459 master-0 kubenswrapper[29612]: I0319 12:31:30.299996 29612 scope.go:117] "RemoveContainer" containerID="5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9" Mar 19 12:31:30.359900 master-0 kubenswrapper[29612]: I0319 12:31:30.359439 29612 scope.go:117] "RemoveContainer" 
containerID="3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2" Mar 19 12:31:30.360350 master-0 kubenswrapper[29612]: E0319 12:31:30.360097 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2\": container with ID starting with 3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2 not found: ID does not exist" containerID="3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2" Mar 19 12:31:30.360350 master-0 kubenswrapper[29612]: I0319 12:31:30.360151 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2"} err="failed to get container status \"3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2\": rpc error: code = NotFound desc = could not find container \"3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2\": container with ID starting with 3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2 not found: ID does not exist" Mar 19 12:31:30.360350 master-0 kubenswrapper[29612]: I0319 12:31:30.360180 29612 scope.go:117] "RemoveContainer" containerID="5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9" Mar 19 12:31:30.360530 master-0 kubenswrapper[29612]: E0319 12:31:30.360476 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9\": container with ID starting with 5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9 not found: ID does not exist" containerID="5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9" Mar 19 12:31:30.360530 master-0 kubenswrapper[29612]: I0319 12:31:30.360500 29612 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9"} err="failed to get container status \"5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9\": rpc error: code = NotFound desc = could not find container \"5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9\": container with ID starting with 5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9 not found: ID does not exist" Mar 19 12:31:30.360530 master-0 kubenswrapper[29612]: I0319 12:31:30.360515 29612 scope.go:117] "RemoveContainer" containerID="3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2" Mar 19 12:31:30.361041 master-0 kubenswrapper[29612]: I0319 12:31:30.360898 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2"} err="failed to get container status \"3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2\": rpc error: code = NotFound desc = could not find container \"3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2\": container with ID starting with 3499dd106ea049bb9f5552d37dd711a88f41a23590419a77b97057f49d32c5f2 not found: ID does not exist" Mar 19 12:31:30.361041 master-0 kubenswrapper[29612]: I0319 12:31:30.360942 29612 scope.go:117] "RemoveContainer" containerID="5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9" Mar 19 12:31:30.361418 master-0 kubenswrapper[29612]: I0319 12:31:30.361386 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9"} err="failed to get container status \"5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9\": rpc error: code = NotFound desc = could not find container \"5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9\": container with ID starting with 
5cf4439df096dd93647fdce0755ffb83e68eb0dcdecf04e2b8d551c5b138caa9 not found: ID does not exist" Mar 19 12:31:30.364712 master-0 kubenswrapper[29612]: I0319 12:31:30.363674 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:30.389205 master-0 kubenswrapper[29612]: I0319 12:31:30.389105 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:30.398802 master-0 kubenswrapper[29612]: I0319 12:31:30.398749 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:30.399516 master-0 kubenswrapper[29612]: E0319 12:31:30.399484 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="261954af-1d89-4d6d-a269-7407af2c533c" containerName="nova-manage" Mar 19 12:31:30.399566 master-0 kubenswrapper[29612]: I0319 12:31:30.399527 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="261954af-1d89-4d6d-a269-7407af2c533c" containerName="nova-manage" Mar 19 12:31:30.399566 master-0 kubenswrapper[29612]: E0319 12:31:30.399553 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34f8258-4d21-4168-8cd4-93b13b88f069" containerName="nova-api-log" Mar 19 12:31:30.399566 master-0 kubenswrapper[29612]: I0319 12:31:30.399560 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34f8258-4d21-4168-8cd4-93b13b88f069" containerName="nova-api-log" Mar 19 12:31:30.399674 master-0 kubenswrapper[29612]: E0319 12:31:30.399581 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a986f3f0-42d2-4d2b-a898-03c5b9f30c92" containerName="dnsmasq-dns" Mar 19 12:31:30.399674 master-0 kubenswrapper[29612]: I0319 12:31:30.399604 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="a986f3f0-42d2-4d2b-a898-03c5b9f30c92" containerName="dnsmasq-dns" Mar 19 12:31:30.399674 master-0 kubenswrapper[29612]: E0319 12:31:30.399622 29612 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a986f3f0-42d2-4d2b-a898-03c5b9f30c92" containerName="init" Mar 19 12:31:30.399674 master-0 kubenswrapper[29612]: I0319 12:31:30.399628 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="a986f3f0-42d2-4d2b-a898-03c5b9f30c92" containerName="init" Mar 19 12:31:30.399839 master-0 kubenswrapper[29612]: E0319 12:31:30.399680 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34f8258-4d21-4168-8cd4-93b13b88f069" containerName="nova-api-api" Mar 19 12:31:30.399839 master-0 kubenswrapper[29612]: I0319 12:31:30.399689 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34f8258-4d21-4168-8cd4-93b13b88f069" containerName="nova-api-api" Mar 19 12:31:30.399839 master-0 kubenswrapper[29612]: E0319 12:31:30.399704 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a53c820-34b9-4c07-a724-063e01459e1f" containerName="nova-manage" Mar 19 12:31:30.399839 master-0 kubenswrapper[29612]: I0319 12:31:30.399710 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a53c820-34b9-4c07-a724-063e01459e1f" containerName="nova-manage" Mar 19 12:31:30.400195 master-0 kubenswrapper[29612]: I0319 12:31:30.400161 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="a986f3f0-42d2-4d2b-a898-03c5b9f30c92" containerName="dnsmasq-dns" Mar 19 12:31:30.400234 master-0 kubenswrapper[29612]: I0319 12:31:30.400210 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34f8258-4d21-4168-8cd4-93b13b88f069" containerName="nova-api-log" Mar 19 12:31:30.400268 master-0 kubenswrapper[29612]: I0319 12:31:30.400246 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34f8258-4d21-4168-8cd4-93b13b88f069" containerName="nova-api-api" Mar 19 12:31:30.400268 master-0 kubenswrapper[29612]: I0319 12:31:30.400262 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="261954af-1d89-4d6d-a269-7407af2c533c" containerName="nova-manage" Mar 19 12:31:30.400331 master-0 
kubenswrapper[29612]: I0319 12:31:30.400288 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a53c820-34b9-4c07-a724-063e01459e1f" containerName="nova-manage" Mar 19 12:31:30.402122 master-0 kubenswrapper[29612]: I0319 12:31:30.402098 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:31:30.404916 master-0 kubenswrapper[29612]: I0319 12:31:30.404376 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 12:31:30.404916 master-0 kubenswrapper[29612]: I0319 12:31:30.404410 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 12:31:30.404916 master-0 kubenswrapper[29612]: I0319 12:31:30.404451 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 12:31:30.413521 master-0 kubenswrapper[29612]: I0319 12:31:30.413469 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:30.519053 master-0 kubenswrapper[29612]: I0319 12:31:30.518996 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.519281 master-0 kubenswrapper[29612]: I0319 12:31:30.519133 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-public-tls-certs\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.519281 master-0 kubenswrapper[29612]: I0319 12:31:30.519174 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-config-data\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.519281 master-0 kubenswrapper[29612]: I0319 12:31:30.519202 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.519545 master-0 kubenswrapper[29612]: I0319 12:31:30.519304 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnlfp\" (UniqueName: \"kubernetes.io/projected/5596ed6f-50cb-4225-84d3-cde5b2a59f43-kube-api-access-wnlfp\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.519545 master-0 kubenswrapper[29612]: I0319 12:31:30.519339 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5596ed6f-50cb-4225-84d3-cde5b2a59f43-logs\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.621342 master-0 kubenswrapper[29612]: I0319 12:31:30.621226 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-config-data\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.621342 master-0 kubenswrapper[29612]: I0319 12:31:30.621296 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.621555 master-0 kubenswrapper[29612]: I0319 12:31:30.621437 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnlfp\" (UniqueName: \"kubernetes.io/projected/5596ed6f-50cb-4225-84d3-cde5b2a59f43-kube-api-access-wnlfp\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.621555 master-0 kubenswrapper[29612]: I0319 12:31:30.621481 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5596ed6f-50cb-4225-84d3-cde5b2a59f43-logs\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.621685 master-0 kubenswrapper[29612]: I0319 12:31:30.621659 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.621807 master-0 kubenswrapper[29612]: I0319 12:31:30.621781 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-public-tls-certs\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.622788 master-0 kubenswrapper[29612]: I0319 12:31:30.622732 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5596ed6f-50cb-4225-84d3-cde5b2a59f43-logs\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 
12:31:30.629862 master-0 kubenswrapper[29612]: I0319 12:31:30.628449 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.629862 master-0 kubenswrapper[29612]: I0319 12:31:30.629462 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-public-tls-certs\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.630291 master-0 kubenswrapper[29612]: I0319 12:31:30.629876 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.636319 master-0 kubenswrapper[29612]: I0319 12:31:30.636280 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5596ed6f-50cb-4225-84d3-cde5b2a59f43-config-data\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.642760 master-0 kubenswrapper[29612]: I0319 12:31:30.642712 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnlfp\" (UniqueName: \"kubernetes.io/projected/5596ed6f-50cb-4225-84d3-cde5b2a59f43-kube-api-access-wnlfp\") pod \"nova-api-0\" (UID: \"5596ed6f-50cb-4225-84d3-cde5b2a59f43\") " pod="openstack/nova-api-0" Mar 19 12:31:30.730816 master-0 kubenswrapper[29612]: I0319 12:31:30.730764 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 12:31:30.949694 master-0 kubenswrapper[29612]: I0319 12:31:30.948958 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34f8258-4d21-4168-8cd4-93b13b88f069" path="/var/lib/kubelet/pods/f34f8258-4d21-4168-8cd4-93b13b88f069/volumes" Mar 19 12:31:31.018820 master-0 kubenswrapper[29612]: E0319 12:31:31.018752 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 12:31:31.021367 master-0 kubenswrapper[29612]: E0319 12:31:31.021290 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 12:31:31.023738 master-0 kubenswrapper[29612]: E0319 12:31:31.023654 29612 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 12:31:31.023738 master-0 kubenswrapper[29612]: E0319 12:31:31.023716 29612 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="24df43e9-37cb-4632-9a19-5b602c2742f9" containerName="nova-scheduler-scheduler" Mar 19 12:31:31.204293 master-0 kubenswrapper[29612]: I0319 
12:31:31.204164 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 12:31:31.220620 master-0 kubenswrapper[29612]: I0319 12:31:31.220557 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5596ed6f-50cb-4225-84d3-cde5b2a59f43","Type":"ContainerStarted","Data":"9e56ff254266f5d508ab627041f9004be92da7eb6e74c97536fbd430e6e3045e"} Mar 19 12:31:31.222788 master-0 kubenswrapper[29612]: I0319 12:31:31.222741 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerName="nova-metadata-log" containerID="cri-o://6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703" gracePeriod=30 Mar 19 12:31:31.222947 master-0 kubenswrapper[29612]: I0319 12:31:31.222861 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerName="nova-metadata-metadata" containerID="cri-o://adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d" gracePeriod=30 Mar 19 12:31:32.244272 master-0 kubenswrapper[29612]: I0319 12:31:32.244193 29612 generic.go:334] "Generic (PLEG): container finished" podID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerID="6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703" exitCode=143 Mar 19 12:31:32.245251 master-0 kubenswrapper[29612]: I0319 12:31:32.244312 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef802a6a-c865-4c30-935a-d8c7f9c85e2d","Type":"ContainerDied","Data":"6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703"} Mar 19 12:31:32.247306 master-0 kubenswrapper[29612]: I0319 12:31:32.247254 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5596ed6f-50cb-4225-84d3-cde5b2a59f43","Type":"ContainerStarted","Data":"3542c07149ca8b4b527bdada6e01553efc0d44fc80fd7fc8f86513c4ff791b90"} Mar 19 12:31:32.247306 master-0 kubenswrapper[29612]: I0319 12:31:32.247297 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5596ed6f-50cb-4225-84d3-cde5b2a59f43","Type":"ContainerStarted","Data":"6deeac62e8e06f658301212838d8703af635a32508609097ec296a30d88476b3"} Mar 19 12:31:32.296250 master-0 kubenswrapper[29612]: I0319 12:31:32.296147 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.296127334 podStartE2EDuration="2.296127334s" podCreationTimestamp="2026-03-19 12:31:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:31:32.274858 +0000 UTC m=+1299.589027031" watchObservedRunningTime="2026-03-19 12:31:32.296127334 +0000 UTC m=+1299.610296365" Mar 19 12:31:34.909810 master-0 kubenswrapper[29612]: I0319 12:31:34.905570 29612 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 12:31:35.082458 master-0 kubenswrapper[29612]: I0319 12:31:35.082394 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bgq2\" (UniqueName: \"kubernetes.io/projected/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-kube-api-access-7bgq2\") pod \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " Mar 19 12:31:35.082677 master-0 kubenswrapper[29612]: I0319 12:31:35.082565 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-combined-ca-bundle\") pod \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " Mar 19 12:31:35.083244 master-0 kubenswrapper[29612]: I0319 12:31:35.082708 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-logs\") pod \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " Mar 19 12:31:35.083359 master-0 kubenswrapper[29612]: I0319 12:31:35.083202 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-logs" (OuterVolumeSpecName: "logs") pod "ef802a6a-c865-4c30-935a-d8c7f9c85e2d" (UID: "ef802a6a-c865-4c30-935a-d8c7f9c85e2d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 12:31:35.083432 master-0 kubenswrapper[29612]: I0319 12:31:35.083411 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-nova-metadata-tls-certs\") pod \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " Mar 19 12:31:35.083526 master-0 kubenswrapper[29612]: I0319 12:31:35.083484 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-config-data\") pod \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\" (UID: \"ef802a6a-c865-4c30-935a-d8c7f9c85e2d\") " Mar 19 12:31:35.085122 master-0 kubenswrapper[29612]: I0319 12:31:35.085072 29612 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-logs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:35.087090 master-0 kubenswrapper[29612]: I0319 12:31:35.086995 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-kube-api-access-7bgq2" (OuterVolumeSpecName: "kube-api-access-7bgq2") pod "ef802a6a-c865-4c30-935a-d8c7f9c85e2d" (UID: "ef802a6a-c865-4c30-935a-d8c7f9c85e2d"). InnerVolumeSpecName "kube-api-access-7bgq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:31:35.120430 master-0 kubenswrapper[29612]: I0319 12:31:35.120365 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef802a6a-c865-4c30-935a-d8c7f9c85e2d" (UID: "ef802a6a-c865-4c30-935a-d8c7f9c85e2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:35.127125 master-0 kubenswrapper[29612]: I0319 12:31:35.127061 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-config-data" (OuterVolumeSpecName: "config-data") pod "ef802a6a-c865-4c30-935a-d8c7f9c85e2d" (UID: "ef802a6a-c865-4c30-935a-d8c7f9c85e2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:35.148088 master-0 kubenswrapper[29612]: I0319 12:31:35.148036 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ef802a6a-c865-4c30-935a-d8c7f9c85e2d" (UID: "ef802a6a-c865-4c30-935a-d8c7f9c85e2d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:35.186796 master-0 kubenswrapper[29612]: I0319 12:31:35.186727 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bgq2\" (UniqueName: \"kubernetes.io/projected/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-kube-api-access-7bgq2\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:35.186796 master-0 kubenswrapper[29612]: I0319 12:31:35.186784 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:35.186796 master-0 kubenswrapper[29612]: I0319 12:31:35.186796 29612 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:35.187087 master-0 kubenswrapper[29612]: I0319 12:31:35.186808 29612 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef802a6a-c865-4c30-935a-d8c7f9c85e2d-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:35.216086 master-0 kubenswrapper[29612]: I0319 12:31:35.216011 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 12:31:35.304938 master-0 kubenswrapper[29612]: I0319 12:31:35.304788 29612 generic.go:334] "Generic (PLEG): container finished" podID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerID="adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d" exitCode=0 Mar 19 12:31:35.304938 master-0 kubenswrapper[29612]: I0319 12:31:35.304859 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 12:31:35.304938 master-0 kubenswrapper[29612]: I0319 12:31:35.304906 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef802a6a-c865-4c30-935a-d8c7f9c85e2d","Type":"ContainerDied","Data":"adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d"} Mar 19 12:31:35.305191 master-0 kubenswrapper[29612]: I0319 12:31:35.304977 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ef802a6a-c865-4c30-935a-d8c7f9c85e2d","Type":"ContainerDied","Data":"e010e3c1aca27ca954d8a0264f18882c4ff23af6297bfcd73228d02041ce39d6"} Mar 19 12:31:35.305191 master-0 kubenswrapper[29612]: I0319 12:31:35.305024 29612 scope.go:117] "RemoveContainer" containerID="adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d" Mar 19 12:31:35.307264 master-0 kubenswrapper[29612]: I0319 12:31:35.307097 29612 generic.go:334] "Generic (PLEG): container finished" podID="24df43e9-37cb-4632-9a19-5b602c2742f9" containerID="3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057" exitCode=0 Mar 19 12:31:35.307264 master-0 kubenswrapper[29612]: I0319 12:31:35.307146 29612 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"24df43e9-37cb-4632-9a19-5b602c2742f9","Type":"ContainerDied","Data":"3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057"} Mar 19 12:31:35.307264 master-0 kubenswrapper[29612]: I0319 12:31:35.307175 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"24df43e9-37cb-4632-9a19-5b602c2742f9","Type":"ContainerDied","Data":"d62422ca53036bdda1ec3d5434189c53bd65ef1e6a088cb87aa77c24ed320ec8"} Mar 19 12:31:35.307264 master-0 kubenswrapper[29612]: I0319 12:31:35.307211 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 12:31:35.331145 master-0 kubenswrapper[29612]: I0319 12:31:35.331108 29612 scope.go:117] "RemoveContainer" containerID="6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703" Mar 19 12:31:35.358978 master-0 kubenswrapper[29612]: I0319 12:31:35.358918 29612 scope.go:117] "RemoveContainer" containerID="adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d" Mar 19 12:31:35.359709 master-0 kubenswrapper[29612]: I0319 12:31:35.359582 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:31:35.360908 master-0 kubenswrapper[29612]: E0319 12:31:35.360861 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d\": container with ID starting with adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d not found: ID does not exist" containerID="adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d" Mar 19 12:31:35.361007 master-0 kubenswrapper[29612]: I0319 12:31:35.360904 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d"} err="failed 
to get container status \"adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d\": rpc error: code = NotFound desc = could not find container \"adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d\": container with ID starting with adacd5f288354564d21e6904ed3ab1c37ec74b0e2c14bfd339e4b381ca00a39d not found: ID does not exist" Mar 19 12:31:35.361007 master-0 kubenswrapper[29612]: I0319 12:31:35.360929 29612 scope.go:117] "RemoveContainer" containerID="6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703" Mar 19 12:31:35.361321 master-0 kubenswrapper[29612]: E0319 12:31:35.361267 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703\": container with ID starting with 6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703 not found: ID does not exist" containerID="6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703" Mar 19 12:31:35.361415 master-0 kubenswrapper[29612]: I0319 12:31:35.361339 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703"} err="failed to get container status \"6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703\": rpc error: code = NotFound desc = could not find container \"6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703\": container with ID starting with 6ea6dee94d2ff821bb609c665324913812c57cb34f98eb64bf0fb51ca50a4703 not found: ID does not exist" Mar 19 12:31:35.361415 master-0 kubenswrapper[29612]: I0319 12:31:35.361370 29612 scope.go:117] "RemoveContainer" containerID="3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057" Mar 19 12:31:35.377707 master-0 kubenswrapper[29612]: I0319 12:31:35.374507 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 
12:31:35.396594 master-0 kubenswrapper[29612]: I0319 12:31:35.396527 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbtw4\" (UniqueName: \"kubernetes.io/projected/24df43e9-37cb-4632-9a19-5b602c2742f9-kube-api-access-qbtw4\") pod \"24df43e9-37cb-4632-9a19-5b602c2742f9\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " Mar 19 12:31:35.396846 master-0 kubenswrapper[29612]: I0319 12:31:35.396615 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-combined-ca-bundle\") pod \"24df43e9-37cb-4632-9a19-5b602c2742f9\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " Mar 19 12:31:35.396963 master-0 kubenswrapper[29612]: I0319 12:31:35.396942 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-config-data\") pod \"24df43e9-37cb-4632-9a19-5b602c2742f9\" (UID: \"24df43e9-37cb-4632-9a19-5b602c2742f9\") " Mar 19 12:31:35.400202 master-0 kubenswrapper[29612]: I0319 12:31:35.400155 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24df43e9-37cb-4632-9a19-5b602c2742f9-kube-api-access-qbtw4" (OuterVolumeSpecName: "kube-api-access-qbtw4") pod "24df43e9-37cb-4632-9a19-5b602c2742f9" (UID: "24df43e9-37cb-4632-9a19-5b602c2742f9"). InnerVolumeSpecName "kube-api-access-qbtw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 12:31:35.407665 master-0 kubenswrapper[29612]: I0319 12:31:35.405886 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:31:35.407665 master-0 kubenswrapper[29612]: E0319 12:31:35.406432 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerName="nova-metadata-log" Mar 19 12:31:35.407665 master-0 kubenswrapper[29612]: I0319 12:31:35.406448 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerName="nova-metadata-log" Mar 19 12:31:35.407665 master-0 kubenswrapper[29612]: E0319 12:31:35.406491 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24df43e9-37cb-4632-9a19-5b602c2742f9" containerName="nova-scheduler-scheduler" Mar 19 12:31:35.407665 master-0 kubenswrapper[29612]: I0319 12:31:35.406498 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="24df43e9-37cb-4632-9a19-5b602c2742f9" containerName="nova-scheduler-scheduler" Mar 19 12:31:35.407665 master-0 kubenswrapper[29612]: E0319 12:31:35.406511 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerName="nova-metadata-metadata" Mar 19 12:31:35.407665 master-0 kubenswrapper[29612]: I0319 12:31:35.406518 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerName="nova-metadata-metadata" Mar 19 12:31:35.407665 master-0 kubenswrapper[29612]: I0319 12:31:35.406772 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerName="nova-metadata-log" Mar 19 12:31:35.407665 master-0 kubenswrapper[29612]: I0319 12:31:35.406805 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" containerName="nova-metadata-metadata" Mar 19 12:31:35.407665 
master-0 kubenswrapper[29612]: I0319 12:31:35.406830 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="24df43e9-37cb-4632-9a19-5b602c2742f9" containerName="nova-scheduler-scheduler" Mar 19 12:31:35.411694 master-0 kubenswrapper[29612]: I0319 12:31:35.408368 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 12:31:35.427950 master-0 kubenswrapper[29612]: I0319 12:31:35.427718 29612 scope.go:117] "RemoveContainer" containerID="3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057" Mar 19 12:31:35.427950 master-0 kubenswrapper[29612]: I0319 12:31:35.427928 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 12:31:35.428277 master-0 kubenswrapper[29612]: I0319 12:31:35.427966 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 12:31:35.432658 master-0 kubenswrapper[29612]: E0319 12:31:35.429164 29612 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057\": container with ID starting with 3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057 not found: ID does not exist" containerID="3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057" Mar 19 12:31:35.432658 master-0 kubenswrapper[29612]: I0319 12:31:35.429234 29612 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057"} err="failed to get container status \"3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057\": rpc error: code = NotFound desc = could not find container \"3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057\": container with ID starting with 
3afdd3b3a5f02f7cd61895d60b971aabd4d37e358f15897ed632174b99aa1057 not found: ID does not exist" Mar 19 12:31:35.433195 master-0 kubenswrapper[29612]: I0319 12:31:35.432671 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:31:35.450851 master-0 kubenswrapper[29612]: I0319 12:31:35.449009 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24df43e9-37cb-4632-9a19-5b602c2742f9" (UID: "24df43e9-37cb-4632-9a19-5b602c2742f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:35.470387 master-0 kubenswrapper[29612]: I0319 12:31:35.470316 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-config-data" (OuterVolumeSpecName: "config-data") pod "24df43e9-37cb-4632-9a19-5b602c2742f9" (UID: "24df43e9-37cb-4632-9a19-5b602c2742f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 12:31:35.499359 master-0 kubenswrapper[29612]: I0319 12:31:35.499295 29612 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-config-data\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:35.499359 master-0 kubenswrapper[29612]: I0319 12:31:35.499337 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbtw4\" (UniqueName: \"kubernetes.io/projected/24df43e9-37cb-4632-9a19-5b602c2742f9-kube-api-access-qbtw4\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:35.499359 master-0 kubenswrapper[29612]: I0319 12:31:35.499347 29612 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24df43e9-37cb-4632-9a19-5b602c2742f9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 19 12:31:35.602195 master-0 kubenswrapper[29612]: I0319 12:31:35.602068 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-logs\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.602578 master-0 kubenswrapper[29612]: I0319 12:31:35.602548 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-config-data\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.602802 master-0 kubenswrapper[29612]: I0319 12:31:35.602775 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6879t\" (UniqueName: \"kubernetes.io/projected/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-kube-api-access-6879t\") pod 
\"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.603052 master-0 kubenswrapper[29612]: I0319 12:31:35.603023 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.603232 master-0 kubenswrapper[29612]: I0319 12:31:35.603202 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.665463 master-0 kubenswrapper[29612]: I0319 12:31:35.663878 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:31:35.712732 master-0 kubenswrapper[29612]: I0319 12:31:35.697901 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:31:35.712732 master-0 kubenswrapper[29612]: I0319 12:31:35.706325 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-logs\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.712732 master-0 kubenswrapper[29612]: I0319 12:31:35.706383 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-config-data\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.712732 
master-0 kubenswrapper[29612]: I0319 12:31:35.706426 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6879t\" (UniqueName: \"kubernetes.io/projected/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-kube-api-access-6879t\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.712732 master-0 kubenswrapper[29612]: I0319 12:31:35.706475 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.712732 master-0 kubenswrapper[29612]: I0319 12:31:35.706496 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.712732 master-0 kubenswrapper[29612]: I0319 12:31:35.710935 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.712732 master-0 kubenswrapper[29612]: I0319 12:31:35.711313 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-logs\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.718882 master-0 kubenswrapper[29612]: I0319 12:31:35.716245 29612 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 19 12:31:35.718882 master-0 kubenswrapper[29612]: I0319 12:31:35.717331 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-config-data\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.718882 master-0 kubenswrapper[29612]: I0319 12:31:35.718286 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 12:31:35.727775 master-0 kubenswrapper[29612]: I0319 12:31:35.720474 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.727775 master-0 kubenswrapper[29612]: I0319 12:31:35.720907 29612 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 12:31:35.731335 master-0 kubenswrapper[29612]: I0319 12:31:35.731288 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:31:35.733450 master-0 kubenswrapper[29612]: I0319 12:31:35.733376 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6879t\" (UniqueName: \"kubernetes.io/projected/7dd5a4c2-135c-4e27-b673-235f7b78a9e9-kube-api-access-6879t\") pod \"nova-metadata-0\" (UID: \"7dd5a4c2-135c-4e27-b673-235f7b78a9e9\") " pod="openstack/nova-metadata-0" Mar 19 12:31:35.808905 master-0 kubenswrapper[29612]: I0319 12:31:35.808845 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0294b896-dabe-40fc-bc2c-77093b959789-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"0294b896-dabe-40fc-bc2c-77093b959789\") " pod="openstack/nova-scheduler-0" Mar 19 12:31:35.809145 master-0 kubenswrapper[29612]: I0319 12:31:35.808943 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p5k2\" (UniqueName: \"kubernetes.io/projected/0294b896-dabe-40fc-bc2c-77093b959789-kube-api-access-9p5k2\") pod \"nova-scheduler-0\" (UID: \"0294b896-dabe-40fc-bc2c-77093b959789\") " pod="openstack/nova-scheduler-0" Mar 19 12:31:35.809225 master-0 kubenswrapper[29612]: I0319 12:31:35.809158 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0294b896-dabe-40fc-bc2c-77093b959789-config-data\") pod \"nova-scheduler-0\" (UID: \"0294b896-dabe-40fc-bc2c-77093b959789\") " pod="openstack/nova-scheduler-0" Mar 19 12:31:35.828739 master-0 kubenswrapper[29612]: I0319 12:31:35.828664 29612 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 12:31:35.911756 master-0 kubenswrapper[29612]: I0319 12:31:35.911686 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0294b896-dabe-40fc-bc2c-77093b959789-config-data\") pod \"nova-scheduler-0\" (UID: \"0294b896-dabe-40fc-bc2c-77093b959789\") " pod="openstack/nova-scheduler-0" Mar 19 12:31:35.912376 master-0 kubenswrapper[29612]: I0319 12:31:35.911820 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0294b896-dabe-40fc-bc2c-77093b959789-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0294b896-dabe-40fc-bc2c-77093b959789\") " pod="openstack/nova-scheduler-0" Mar 19 12:31:35.912376 master-0 kubenswrapper[29612]: I0319 12:31:35.911866 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p5k2\" (UniqueName: \"kubernetes.io/projected/0294b896-dabe-40fc-bc2c-77093b959789-kube-api-access-9p5k2\") pod \"nova-scheduler-0\" (UID: \"0294b896-dabe-40fc-bc2c-77093b959789\") " pod="openstack/nova-scheduler-0" Mar 19 12:31:35.915647 master-0 kubenswrapper[29612]: I0319 12:31:35.915603 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0294b896-dabe-40fc-bc2c-77093b959789-config-data\") pod \"nova-scheduler-0\" (UID: \"0294b896-dabe-40fc-bc2c-77093b959789\") " pod="openstack/nova-scheduler-0" Mar 19 12:31:35.921619 master-0 kubenswrapper[29612]: I0319 12:31:35.921571 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0294b896-dabe-40fc-bc2c-77093b959789-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0294b896-dabe-40fc-bc2c-77093b959789\") " pod="openstack/nova-scheduler-0" Mar 19 12:31:35.929529 master-0 kubenswrapper[29612]: 
I0319 12:31:35.929440 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p5k2\" (UniqueName: \"kubernetes.io/projected/0294b896-dabe-40fc-bc2c-77093b959789-kube-api-access-9p5k2\") pod \"nova-scheduler-0\" (UID: \"0294b896-dabe-40fc-bc2c-77093b959789\") " pod="openstack/nova-scheduler-0" Mar 19 12:31:36.114451 master-0 kubenswrapper[29612]: I0319 12:31:36.114377 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 12:31:36.399742 master-0 kubenswrapper[29612]: I0319 12:31:36.399611 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 12:31:36.416515 master-0 kubenswrapper[29612]: W0319 12:31:36.416467 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd5a4c2_135c_4e27_b673_235f7b78a9e9.slice/crio-71d65f83f42752b0ef0a97c7115dfa4fc195e7f31c97647894ca2c65f7db3c6a WatchSource:0}: Error finding container 71d65f83f42752b0ef0a97c7115dfa4fc195e7f31c97647894ca2c65f7db3c6a: Status 404 returned error can't find the container with id 71d65f83f42752b0ef0a97c7115dfa4fc195e7f31c97647894ca2c65f7db3c6a Mar 19 12:31:36.590469 master-0 kubenswrapper[29612]: I0319 12:31:36.590429 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 12:31:36.591155 master-0 kubenswrapper[29612]: W0319 12:31:36.591112 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0294b896_dabe_40fc_bc2c_77093b959789.slice/crio-ea7883600a36f3420ca4cd92e572a371345da87448630dc229f40bae34ceb67d WatchSource:0}: Error finding container ea7883600a36f3420ca4cd92e572a371345da87448630dc229f40bae34ceb67d: Status 404 returned error can't find the container with id ea7883600a36f3420ca4cd92e572a371345da87448630dc229f40bae34ceb67d Mar 19 12:31:36.936487 master-0 
kubenswrapper[29612]: I0319 12:31:36.936363 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24df43e9-37cb-4632-9a19-5b602c2742f9" path="/var/lib/kubelet/pods/24df43e9-37cb-4632-9a19-5b602c2742f9/volumes" Mar 19 12:31:36.943659 master-0 kubenswrapper[29612]: I0319 12:31:36.939148 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef802a6a-c865-4c30-935a-d8c7f9c85e2d" path="/var/lib/kubelet/pods/ef802a6a-c865-4c30-935a-d8c7f9c85e2d/volumes" Mar 19 12:31:37.369151 master-0 kubenswrapper[29612]: I0319 12:31:37.369003 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0294b896-dabe-40fc-bc2c-77093b959789","Type":"ContainerStarted","Data":"c2c69f85fa604e247393e5bd665aea2cc818bce3ff8ada65cbc9b89aa6cc57c9"} Mar 19 12:31:37.369151 master-0 kubenswrapper[29612]: I0319 12:31:37.369056 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0294b896-dabe-40fc-bc2c-77093b959789","Type":"ContainerStarted","Data":"ea7883600a36f3420ca4cd92e572a371345da87448630dc229f40bae34ceb67d"} Mar 19 12:31:37.371762 master-0 kubenswrapper[29612]: I0319 12:31:37.371685 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7dd5a4c2-135c-4e27-b673-235f7b78a9e9","Type":"ContainerStarted","Data":"97ce0e79befee8cd97568e7fd3467de60f946e3140f47f2a1b6f736acd949de7"} Mar 19 12:31:37.371762 master-0 kubenswrapper[29612]: I0319 12:31:37.371742 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7dd5a4c2-135c-4e27-b673-235f7b78a9e9","Type":"ContainerStarted","Data":"583e5100e2aa2d358f486c864aaadb4ad95cddf1e9b46c55fe93f09beeb84497"} Mar 19 12:31:37.371762 master-0 kubenswrapper[29612]: I0319 12:31:37.371756 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"7dd5a4c2-135c-4e27-b673-235f7b78a9e9","Type":"ContainerStarted","Data":"71d65f83f42752b0ef0a97c7115dfa4fc195e7f31c97647894ca2c65f7db3c6a"} Mar 19 12:31:37.427248 master-0 kubenswrapper[29612]: I0319 12:31:37.426700 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.426681358 podStartE2EDuration="2.426681358s" podCreationTimestamp="2026-03-19 12:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:31:37.400057851 +0000 UTC m=+1304.714226882" watchObservedRunningTime="2026-03-19 12:31:37.426681358 +0000 UTC m=+1304.740850379" Mar 19 12:31:37.438591 master-0 kubenswrapper[29612]: I0319 12:31:37.438321 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.438292558 podStartE2EDuration="2.438292558s" podCreationTimestamp="2026-03-19 12:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:31:37.427786699 +0000 UTC m=+1304.741955720" watchObservedRunningTime="2026-03-19 12:31:37.438292558 +0000 UTC m=+1304.752461599" Mar 19 12:31:40.732618 master-0 kubenswrapper[29612]: I0319 12:31:40.732545 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 12:31:40.732618 master-0 kubenswrapper[29612]: I0319 12:31:40.732612 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 12:31:41.115879 master-0 kubenswrapper[29612]: I0319 12:31:41.115733 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 12:31:41.739944 master-0 kubenswrapper[29612]: I0319 12:31:41.739863 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="5596ed6f-50cb-4225-84d3-cde5b2a59f43" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.18:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:31:41.740599 master-0 kubenswrapper[29612]: I0319 12:31:41.739901 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5596ed6f-50cb-4225-84d3-cde5b2a59f43" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.18:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 12:31:45.829861 master-0 kubenswrapper[29612]: I0319 12:31:45.829688 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 12:31:45.829861 master-0 kubenswrapper[29612]: I0319 12:31:45.829764 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 12:31:46.115198 master-0 kubenswrapper[29612]: I0319 12:31:46.115060 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 12:31:46.150894 master-0 kubenswrapper[29612]: I0319 12:31:46.150807 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 12:31:46.531454 master-0 kubenswrapper[29612]: I0319 12:31:46.531390 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 12:31:46.846136 master-0 kubenswrapper[29612]: I0319 12:31:46.845974 29612 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7dd5a4c2-135c-4e27-b673-235f7b78a9e9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.19:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:31:46.846649 master-0 kubenswrapper[29612]: I0319 12:31:46.846418 29612 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7dd5a4c2-135c-4e27-b673-235f7b78a9e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.19:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 12:31:48.731734 master-0 kubenswrapper[29612]: I0319 12:31:48.731667 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 12:31:48.732263 master-0 kubenswrapper[29612]: I0319 12:31:48.731857 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 12:31:50.739399 master-0 kubenswrapper[29612]: I0319 12:31:50.739343 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 12:31:50.739932 master-0 kubenswrapper[29612]: I0319 12:31:50.739445 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 12:31:50.746114 master-0 kubenswrapper[29612]: I0319 12:31:50.746049 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 12:31:50.748550 master-0 kubenswrapper[29612]: I0319 12:31:50.748502 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 12:31:53.829112 master-0 kubenswrapper[29612]: I0319 12:31:53.829007 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 12:31:53.829653 master-0 kubenswrapper[29612]: I0319 12:31:53.829177 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 12:31:55.835572 master-0 kubenswrapper[29612]: I0319 12:31:55.835422 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 12:31:55.836423 master-0 kubenswrapper[29612]: I0319 12:31:55.835705 29612 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 19 12:31:55.842409 master-0 kubenswrapper[29612]: I0319 12:31:55.842340 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 19 12:31:56.640449 master-0 kubenswrapper[29612]: I0319 12:31:56.640393 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 19 12:32:23.437021 master-0 kubenswrapper[29612]: I0319 12:32:23.436945 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-zm4xg"]
Mar 19 12:32:23.437962 master-0 kubenswrapper[29612]: I0319 12:32:23.437219 29612 kuberuntime_container.go:808] "Killing container with a grace period" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" podUID="c3a5a568-eb2b-4a52-a621-c73f895a1e54" containerName="sushy-emulator" containerID="cri-o://680ec5360c00bf7ffb0bd302267c5a0bc607c5dd91ff24db48f5e132e7d2938d" gracePeriod=30
Mar 19 12:32:24.021784 master-0 kubenswrapper[29612]: I0319 12:32:24.021677 29612 generic.go:334] "Generic (PLEG): container finished" podID="c3a5a568-eb2b-4a52-a621-c73f895a1e54" containerID="680ec5360c00bf7ffb0bd302267c5a0bc607c5dd91ff24db48f5e132e7d2938d" exitCode=0
Mar 19 12:32:24.022049 master-0 kubenswrapper[29612]: I0319 12:32:24.022004 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" event={"ID":"c3a5a568-eb2b-4a52-a621-c73f895a1e54","Type":"ContainerDied","Data":"680ec5360c00bf7ffb0bd302267c5a0bc607c5dd91ff24db48f5e132e7d2938d"}
Mar 19 12:32:24.276743 master-0 kubenswrapper[29612]: I0319 12:32:24.275495 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg"
Mar 19 12:32:24.388405 master-0 kubenswrapper[29612]: I0319 12:32:24.382695 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/c3a5a568-eb2b-4a52-a621-c73f895a1e54-os-client-config\") pod \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") "
Mar 19 12:32:24.388405 master-0 kubenswrapper[29612]: I0319 12:32:24.383000 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27wnq\" (UniqueName: \"kubernetes.io/projected/c3a5a568-eb2b-4a52-a621-c73f895a1e54-kube-api-access-27wnq\") pod \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") "
Mar 19 12:32:24.388405 master-0 kubenswrapper[29612]: I0319 12:32:24.383155 29612 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/c3a5a568-eb2b-4a52-a621-c73f895a1e54-sushy-emulator-config\") pod \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\" (UID: \"c3a5a568-eb2b-4a52-a621-c73f895a1e54\") "
Mar 19 12:32:24.388405 master-0 kubenswrapper[29612]: I0319 12:32:24.384553 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3a5a568-eb2b-4a52-a621-c73f895a1e54-sushy-emulator-config" (OuterVolumeSpecName: "sushy-emulator-config") pod "c3a5a568-eb2b-4a52-a621-c73f895a1e54" (UID: "c3a5a568-eb2b-4a52-a621-c73f895a1e54"). InnerVolumeSpecName "sushy-emulator-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 12:32:24.388405 master-0 kubenswrapper[29612]: I0319 12:32:24.388140 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a5a568-eb2b-4a52-a621-c73f895a1e54-kube-api-access-27wnq" (OuterVolumeSpecName: "kube-api-access-27wnq") pod "c3a5a568-eb2b-4a52-a621-c73f895a1e54" (UID: "c3a5a568-eb2b-4a52-a621-c73f895a1e54"). InnerVolumeSpecName "kube-api-access-27wnq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 12:32:24.393836 master-0 kubenswrapper[29612]: I0319 12:32:24.393783 29612 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3a5a568-eb2b-4a52-a621-c73f895a1e54-os-client-config" (OuterVolumeSpecName: "os-client-config") pod "c3a5a568-eb2b-4a52-a621-c73f895a1e54" (UID: "c3a5a568-eb2b-4a52-a621-c73f895a1e54"). InnerVolumeSpecName "os-client-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 12:32:24.474662 master-0 kubenswrapper[29612]: I0319 12:32:24.474588 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"]
Mar 19 12:32:24.475519 master-0 kubenswrapper[29612]: E0319 12:32:24.475483 29612 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a5a568-eb2b-4a52-a621-c73f895a1e54" containerName="sushy-emulator"
Mar 19 12:32:24.475577 master-0 kubenswrapper[29612]: I0319 12:32:24.475520 29612 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a5a568-eb2b-4a52-a621-c73f895a1e54" containerName="sushy-emulator"
Mar 19 12:32:24.475914 master-0 kubenswrapper[29612]: I0319 12:32:24.475891 29612 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3a5a568-eb2b-4a52-a621-c73f895a1e54" containerName="sushy-emulator"
Mar 19 12:32:24.477576 master-0 kubenswrapper[29612]: I0319 12:32:24.477145 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:24.486283 master-0 kubenswrapper[29612]: I0319 12:32:24.486237 29612 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27wnq\" (UniqueName: \"kubernetes.io/projected/c3a5a568-eb2b-4a52-a621-c73f895a1e54-kube-api-access-27wnq\") on node \"master-0\" DevicePath \"\""
Mar 19 12:32:24.486283 master-0 kubenswrapper[29612]: I0319 12:32:24.486276 29612 reconciler_common.go:293] "Volume detached for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/c3a5a568-eb2b-4a52-a621-c73f895a1e54-sushy-emulator-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:32:24.486283 master-0 kubenswrapper[29612]: I0319 12:32:24.486287 29612 reconciler_common.go:293] "Volume detached for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/c3a5a568-eb2b-4a52-a621-c73f895a1e54-os-client-config\") on node \"master-0\" DevicePath \"\""
Mar 19 12:32:24.488295 master-0 kubenswrapper[29612]: I0319 12:32:24.487945 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"]
Mar 19 12:32:24.594409 master-0 kubenswrapper[29612]: I0319 12:32:24.594242 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82d99\" (UniqueName: \"kubernetes.io/projected/35239e8e-9569-4761-b7a7-f53a90105186-kube-api-access-82d99\") pod \"sushy-emulator-54b65fbdd6-jssrh\" (UID: \"35239e8e-9569-4761-b7a7-f53a90105186\") " pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:24.595170 master-0 kubenswrapper[29612]: I0319 12:32:24.595105 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/35239e8e-9569-4761-b7a7-f53a90105186-os-client-config\") pod \"sushy-emulator-54b65fbdd6-jssrh\" (UID: \"35239e8e-9569-4761-b7a7-f53a90105186\") " pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:24.595720 master-0 kubenswrapper[29612]: I0319 12:32:24.595606 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/35239e8e-9569-4761-b7a7-f53a90105186-sushy-emulator-config\") pod \"sushy-emulator-54b65fbdd6-jssrh\" (UID: \"35239e8e-9569-4761-b7a7-f53a90105186\") " pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:24.698019 master-0 kubenswrapper[29612]: I0319 12:32:24.697952 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/35239e8e-9569-4761-b7a7-f53a90105186-os-client-config\") pod \"sushy-emulator-54b65fbdd6-jssrh\" (UID: \"35239e8e-9569-4761-b7a7-f53a90105186\") " pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:24.698238 master-0 kubenswrapper[29612]: I0319 12:32:24.698146 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/35239e8e-9569-4761-b7a7-f53a90105186-sushy-emulator-config\") pod \"sushy-emulator-54b65fbdd6-jssrh\" (UID: \"35239e8e-9569-4761-b7a7-f53a90105186\") " pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:24.698238 master-0 kubenswrapper[29612]: I0319 12:32:24.698230 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82d99\" (UniqueName: \"kubernetes.io/projected/35239e8e-9569-4761-b7a7-f53a90105186-kube-api-access-82d99\") pod \"sushy-emulator-54b65fbdd6-jssrh\" (UID: \"35239e8e-9569-4761-b7a7-f53a90105186\") " pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:24.699498 master-0 kubenswrapper[29612]: I0319 12:32:24.699454 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/35239e8e-9569-4761-b7a7-f53a90105186-sushy-emulator-config\") pod \"sushy-emulator-54b65fbdd6-jssrh\" (UID: \"35239e8e-9569-4761-b7a7-f53a90105186\") " pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:24.700965 master-0 kubenswrapper[29612]: I0319 12:32:24.700934 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/35239e8e-9569-4761-b7a7-f53a90105186-os-client-config\") pod \"sushy-emulator-54b65fbdd6-jssrh\" (UID: \"35239e8e-9569-4761-b7a7-f53a90105186\") " pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:24.718002 master-0 kubenswrapper[29612]: I0319 12:32:24.717916 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82d99\" (UniqueName: \"kubernetes.io/projected/35239e8e-9569-4761-b7a7-f53a90105186-kube-api-access-82d99\") pod \"sushy-emulator-54b65fbdd6-jssrh\" (UID: \"35239e8e-9569-4761-b7a7-f53a90105186\") " pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:24.803920 master-0 kubenswrapper[29612]: I0319 12:32:24.803847 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:25.043006 master-0 kubenswrapper[29612]: I0319 12:32:25.041806 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg" event={"ID":"c3a5a568-eb2b-4a52-a621-c73f895a1e54","Type":"ContainerDied","Data":"e638bc223b528f8645cd38de14595384736b37ea4db7dd4b2b1ff3174ce0bf53"}
Mar 19 12:32:25.043006 master-0 kubenswrapper[29612]: I0319 12:32:25.041884 29612 scope.go:117] "RemoveContainer" containerID="680ec5360c00bf7ffb0bd302267c5a0bc607c5dd91ff24db48f5e132e7d2938d"
Mar 19 12:32:25.043006 master-0 kubenswrapper[29612]: I0319 12:32:25.041901 29612 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-59477995f9-zm4xg"
Mar 19 12:32:25.095722 master-0 kubenswrapper[29612]: I0319 12:32:25.095651 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-zm4xg"]
Mar 19 12:32:25.114521 master-0 kubenswrapper[29612]: I0319 12:32:25.114453 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["sushy-emulator/sushy-emulator-59477995f9-zm4xg"]
Mar 19 12:32:25.475906 master-0 kubenswrapper[29612]: I0319 12:32:25.475802 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"]
Mar 19 12:32:25.487583 master-0 kubenswrapper[29612]: W0319 12:32:25.487076 29612 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35239e8e_9569_4761_b7a7_f53a90105186.slice/crio-df29dfd1880a38e4063a846fdf5165837abf7a62c7c21c1b1bd3b63ff6690336 WatchSource:0}: Error finding container df29dfd1880a38e4063a846fdf5165837abf7a62c7c21c1b1bd3b63ff6690336: Status 404 returned error can't find the container with id df29dfd1880a38e4063a846fdf5165837abf7a62c7c21c1b1bd3b63ff6690336
Mar 19 12:32:26.056589 master-0 kubenswrapper[29612]: I0319 12:32:26.056509 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh" event={"ID":"35239e8e-9569-4761-b7a7-f53a90105186","Type":"ContainerStarted","Data":"0a65cd1657e8a873f0a1017d882aaabf48f1f8b0dc663b9e2a9bb6a42fcf5005"}
Mar 19 12:32:26.056589 master-0 kubenswrapper[29612]: I0319 12:32:26.056585 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh" event={"ID":"35239e8e-9569-4761-b7a7-f53a90105186","Type":"ContainerStarted","Data":"df29dfd1880a38e4063a846fdf5165837abf7a62c7c21c1b1bd3b63ff6690336"}
Mar 19 12:32:26.088283 master-0 kubenswrapper[29612]: I0319 12:32:26.088199 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh" podStartSLOduration=2.088129671 podStartE2EDuration="2.088129671s" podCreationTimestamp="2026-03-19 12:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 12:32:26.084027305 +0000 UTC m=+1353.398196336" watchObservedRunningTime="2026-03-19 12:32:26.088129671 +0000 UTC m=+1353.402298692"
Mar 19 12:32:26.922038 master-0 kubenswrapper[29612]: I0319 12:32:26.921989 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3a5a568-eb2b-4a52-a621-c73f895a1e54" path="/var/lib/kubelet/pods/c3a5a568-eb2b-4a52-a621-c73f895a1e54/volumes"
Mar 19 12:32:34.805954 master-0 kubenswrapper[29612]: I0319 12:32:34.805868 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:34.806624 master-0 kubenswrapper[29612]: I0319 12:32:34.805974 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:34.816258 master-0 kubenswrapper[29612]: I0319 12:32:34.815601 29612 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:32:35.204833 master-0 kubenswrapper[29612]: I0319 12:32:35.204749 29612 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-54b65fbdd6-jssrh"
Mar 19 12:33:57.099876 master-0 kubenswrapper[29612]: I0319 12:33:57.099807 29612 scope.go:117] "RemoveContainer" containerID="621df10343b235aa08e0998d8695d227dc1134913b8bac93661c221c6841cd76"
Mar 19 12:33:57.129496 master-0 kubenswrapper[29612]: I0319 12:33:57.129427 29612 scope.go:117] "RemoveContainer" containerID="9a1fa36ffe5971d1c957e993c2930f289e8cd39157dc149a5b21a4f324d8c664"
Mar 19 12:33:57.152894 master-0 kubenswrapper[29612]: I0319 12:33:57.149504 29612 scope.go:117] "RemoveContainer" containerID="ac5ff4a93d1172f5ff001524375c5ce0d6da81faf18150685844f991f1d708dc"
Mar 19 12:34:57.265826 master-0 kubenswrapper[29612]: I0319 12:34:57.265613 29612 scope.go:117] "RemoveContainer" containerID="68365ce8740422547da5c4a14f661eac16d4123f9e69f7afc1e3391339e0449d"
Mar 19 12:34:57.303255 master-0 kubenswrapper[29612]: I0319 12:34:57.303199 29612 scope.go:117] "RemoveContainer" containerID="f2ba96dbc94db110e57d4391401bed6845ddb34564b25c486a6feb5916b44459"
Mar 19 12:34:57.335473 master-0 kubenswrapper[29612]: I0319 12:34:57.335409 29612 scope.go:117] "RemoveContainer" containerID="a6d967c396df37f31cfa704e7b953ff9e77b225cfdd5ab723e79e59e9135272b"
Mar 19 12:36:57.457662 master-0 kubenswrapper[29612]: I0319 12:36:57.457100 29612 scope.go:117] "RemoveContainer" containerID="72547e021b6a18944aa444e1fc1d70a6067ef6defd511fcb32e780050b71ce9a"
Mar 19 12:37:01.064447 master-0 kubenswrapper[29612]: I0319 12:37:01.064357 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-ca05-account-create-update-cz8p9"]
Mar 19 12:37:01.075455 master-0 kubenswrapper[29612]: I0319 12:37:01.075341 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-ca05-account-create-update-cz8p9"]
Mar 19 12:37:02.925609 master-0 kubenswrapper[29612]: I0319 12:37:02.925525 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39d30f3c-9ff6-486e-b733-db361f4896c1" path="/var/lib/kubelet/pods/39d30f3c-9ff6-486e-b733-db361f4896c1/volumes"
Mar 19 12:37:06.073079 master-0 kubenswrapper[29612]: I0319 12:37:06.072987 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-17c2-account-create-update-xg5kj"]
Mar 19 12:37:06.090142 master-0 kubenswrapper[29612]: I0319 12:37:06.089853 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kb7ng"]
Mar 19 12:37:06.119664 master-0 kubenswrapper[29612]: I0319 12:37:06.118160 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c0f4-account-create-update-vlptl"]
Mar 19 12:37:06.133192 master-0 kubenswrapper[29612]: I0319 12:37:06.133116 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-69mx9"]
Mar 19 12:37:06.145535 master-0 kubenswrapper[29612]: I0319 12:37:06.144795 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kb7ng"]
Mar 19 12:37:06.159649 master-0 kubenswrapper[29612]: I0319 12:37:06.157849 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c0f4-account-create-update-vlptl"]
Mar 19 12:37:06.172029 master-0 kubenswrapper[29612]: I0319 12:37:06.171915 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-17c2-account-create-update-xg5kj"]
Mar 19 12:37:06.185471 master-0 kubenswrapper[29612]: I0319 12:37:06.185380 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-69mx9"]
Mar 19 12:37:06.930202 master-0 kubenswrapper[29612]: I0319 12:37:06.930138 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12242f2c-55d1-422d-a775-5688b8dcb092" path="/var/lib/kubelet/pods/12242f2c-55d1-422d-a775-5688b8dcb092/volumes"
Mar 19 12:37:06.932371 master-0 kubenswrapper[29612]: I0319 12:37:06.932296 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40581f33-2061-4f2d-b9b1-13396333b6a4" path="/var/lib/kubelet/pods/40581f33-2061-4f2d-b9b1-13396333b6a4/volumes"
Mar 19 12:37:06.933714 master-0 kubenswrapper[29612]: I0319 12:37:06.933685 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d511e8-9536-451b-bf43-e5660fd5123a" path="/var/lib/kubelet/pods/68d511e8-9536-451b-bf43-e5660fd5123a/volumes"
Mar 19 12:37:06.934970 master-0 kubenswrapper[29612]: I0319 12:37:06.934945 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c1e578-52c8-42f8-a392-a1913a2ee93e" path="/var/lib/kubelet/pods/d7c1e578-52c8-42f8-a392-a1913a2ee93e/volumes"
Mar 19 12:37:09.502664 master-0 kubenswrapper[29612]: I0319 12:37:09.496299 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-q8b25"]
Mar 19 12:37:09.507759 master-0 kubenswrapper[29612]: I0319 12:37:09.507392 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-q8b25"]
Mar 19 12:37:10.919756 master-0 kubenswrapper[29612]: I0319 12:37:10.919499 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae27604-eeed-4c63-9655-46beecfe4168" path="/var/lib/kubelet/pods/dae27604-eeed-4c63-9655-46beecfe4168/volumes"
Mar 19 12:37:12.037835 master-0 kubenswrapper[29612]: I0319 12:37:12.037747 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-h45hg"]
Mar 19 12:37:12.050301 master-0 kubenswrapper[29612]: I0319 12:37:12.050214 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-h45hg"]
Mar 19 12:37:12.926127 master-0 kubenswrapper[29612]: I0319 12:37:12.926049 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0737ffd-e31d-44b8-a26f-6d8c0f222bd3" path="/var/lib/kubelet/pods/d0737ffd-e31d-44b8-a26f-6d8c0f222bd3/volumes"
Mar 19 12:37:38.054744 master-0 kubenswrapper[29612]: I0319 12:37:38.054646 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7vbjj"]
Mar 19 12:37:38.070952 master-0 kubenswrapper[29612]: I0319 12:37:38.070859 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7vbjj"]
Mar 19 12:37:38.925227 master-0 kubenswrapper[29612]: I0319 12:37:38.925138 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a901b3-60ab-4fe0-ab8b-49d5b7c30b54" path="/var/lib/kubelet/pods/16a901b3-60ab-4fe0-ab8b-49d5b7c30b54/volumes"
Mar 19 12:37:40.063473 master-0 kubenswrapper[29612]: I0319 12:37:40.063386 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a786-account-create-update-q6mp4"]
Mar 19 12:37:40.084093 master-0 kubenswrapper[29612]: I0319 12:37:40.084016 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-vjfrd"]
Mar 19 12:37:40.094816 master-0 kubenswrapper[29612]: I0319 12:37:40.094699 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a786-account-create-update-q6mp4"]
Mar 19 12:37:40.107757 master-0 kubenswrapper[29612]: I0319 12:37:40.105856 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-7sxqt"]
Mar 19 12:37:40.117472 master-0 kubenswrapper[29612]: I0319 12:37:40.117405 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-vjfrd"]
Mar 19 12:37:40.128130 master-0 kubenswrapper[29612]: I0319 12:37:40.128055 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9127-account-create-update-b86lg"]
Mar 19 12:37:40.139478 master-0 kubenswrapper[29612]: I0319 12:37:40.139220 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-7sxqt"]
Mar 19 12:37:40.150845 master-0 kubenswrapper[29612]: I0319 12:37:40.150704 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9127-account-create-update-b86lg"]
Mar 19 12:37:40.931664 master-0 kubenswrapper[29612]: I0319 12:37:40.931513 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08724d0d-69ad-4a95-98ae-0d0707bbaee8" path="/var/lib/kubelet/pods/08724d0d-69ad-4a95-98ae-0d0707bbaee8/volumes"
Mar 19 12:37:40.932706 master-0 kubenswrapper[29612]: I0319 12:37:40.932371 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ccd0fb5-3e98-4209-ac70-c238a7610668" path="/var/lib/kubelet/pods/0ccd0fb5-3e98-4209-ac70-c238a7610668/volumes"
Mar 19 12:37:40.933304 master-0 kubenswrapper[29612]: I0319 12:37:40.933128 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c524a81-fbea-432b-856d-1688012185e2" path="/var/lib/kubelet/pods/6c524a81-fbea-432b-856d-1688012185e2/volumes"
Mar 19 12:37:40.933808 master-0 kubenswrapper[29612]: I0319 12:37:40.933780 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe770bd-975c-429f-9998-a4af416c5c1d" path="/var/lib/kubelet/pods/ffe770bd-975c-429f-9998-a4af416c5c1d/volumes"
Mar 19 12:37:52.047353 master-0 kubenswrapper[29612]: I0319 12:37:52.047270 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-26vqt"]
Mar 19 12:37:52.067615 master-0 kubenswrapper[29612]: I0319 12:37:52.067537 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-26vqt"]
Mar 19 12:37:52.931995 master-0 kubenswrapper[29612]: I0319 12:37:52.930139 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef05b5e-8998-4c2c-a732-84cc8345df7c" path="/var/lib/kubelet/pods/2ef05b5e-8998-4c2c-a732-84cc8345df7c/volumes"
Mar 19 12:37:57.571405 master-0 kubenswrapper[29612]: I0319 12:37:57.571333 29612 scope.go:117] "RemoveContainer" containerID="69f7eacc49fe4d7ed953c9332c2371d4cde490d77b8ed5f5dbd695f3a028c469"
Mar 19 12:37:57.595593 master-0 kubenswrapper[29612]: I0319 12:37:57.595541 29612 scope.go:117] "RemoveContainer" containerID="dfec1fd2f12a1aa341116cba503de02b1017d97d721b665cfa4408fdab14caa6"
Mar 19 12:37:57.617837 master-0 kubenswrapper[29612]: I0319 12:37:57.617789 29612 scope.go:117] "RemoveContainer" containerID="d0e34573d1f49c245d0f7bd53aec0a9fc6f853283035dc3404d4679fa5713f0d"
Mar 19 12:37:57.651565 master-0 kubenswrapper[29612]: I0319 12:37:57.651508 29612 scope.go:117] "RemoveContainer" containerID="f6b4fba7b763c2ca76cb4bf0055f85e8b7b519e13383fec4c9c5fb142b31b6f2"
Mar 19 12:37:57.672450 master-0 kubenswrapper[29612]: I0319 12:37:57.672396 29612 scope.go:117] "RemoveContainer" containerID="3923e303eb586757e5d02d43f6a54cbb32cc01575e765ed37602847d810d383c"
Mar 19 12:37:57.697689 master-0 kubenswrapper[29612]: I0319 12:37:57.697663 29612 scope.go:117] "RemoveContainer" containerID="1a59905f71e35e381b7ee2ec16dc4ca9a2e2436f53d578e17009cd355daadedd"
Mar 19 12:37:57.717718 master-0 kubenswrapper[29612]: I0319 12:37:57.717674 29612 scope.go:117] "RemoveContainer" containerID="f644e8c13a8d8a0b70d73a571330ff156be106f1b600bcbe8b33cfa1ae5aba3b"
Mar 19 12:37:57.736951 master-0 kubenswrapper[29612]: I0319 12:37:57.736898 29612 scope.go:117] "RemoveContainer" containerID="1b5dad74840942e4694828977c1332ffaebc778ea206c3f987a9d1131f795c5f"
Mar 19 12:37:57.759972 master-0 kubenswrapper[29612]: I0319 12:37:57.759906 29612 scope.go:117] "RemoveContainer" containerID="a8c719d8f4019912aca82739afdea74a1cb610df66db9772d57d93b5ae44b8a3"
Mar 19 12:37:57.781503 master-0 kubenswrapper[29612]: I0319 12:37:57.781453 29612 scope.go:117] "RemoveContainer" containerID="1b56718cae6a03d93e8a0de7b6c68f76a0efae2d93240c2ce9671a2bc917170f"
Mar 19 12:37:57.805857 master-0 kubenswrapper[29612]: I0319 12:37:57.805782 29612 scope.go:117] "RemoveContainer" containerID="47076e3f74fbbb246d96f5a88e4c23d83b00c7bcf966d21c99a1a6b54763360f"
Mar 19 12:37:57.829287 master-0 kubenswrapper[29612]: I0319 12:37:57.829234 29612 scope.go:117] "RemoveContainer" containerID="68328a25efdc74fd0c62ffb7d6f01edc12680b7e6848ae041bd2b3c2c80fb763"
Mar 19 12:37:57.850363 master-0 kubenswrapper[29612]: I0319 12:37:57.850319 29612 scope.go:117] "RemoveContainer" containerID="10ced7228a6cd4213fbbe0dd9fb136f6adae3a9234edf193798b6f211b5373e7"
Mar 19 12:38:06.062833 master-0 kubenswrapper[29612]: I0319 12:38:06.062475 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-8ba2-account-create-update-24xjd"]
Mar 19 12:38:06.080349 master-0 kubenswrapper[29612]: I0319 12:38:06.080258 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-8ba2-account-create-update-24xjd"]
Mar 19 12:38:06.923855 master-0 kubenswrapper[29612]: I0319 12:38:06.923716 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27627ab5-718d-4b5c-ba01-9e7c7da19223" path="/var/lib/kubelet/pods/27627ab5-718d-4b5c-ba01-9e7c7da19223/volumes"
Mar 19 12:38:07.069099 master-0 kubenswrapper[29612]: I0319 12:38:07.069014 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-6mnv4"]
Mar 19 12:38:07.084897 master-0 kubenswrapper[29612]: I0319 12:38:07.084759 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-create-6mnv4"]
Mar 19 12:38:08.945387 master-0 kubenswrapper[29612]: I0319 12:38:08.945301 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e893922b-55bb-4c15-84ed-4a005cffe0d2" path="/var/lib/kubelet/pods/e893922b-55bb-4c15-84ed-4a005cffe0d2/volumes"
Mar 19 12:38:26.120558 master-0 kubenswrapper[29612]: I0319 12:38:26.120203 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-h9cqc"]
Mar 19 12:38:26.145529 master-0 kubenswrapper[29612]: I0319 12:38:26.144260 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-h9cqc"]
Mar 19 12:38:26.932171 master-0 kubenswrapper[29612]: I0319 12:38:26.932091 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a976515-4dc5-45d4-b047-92e92de29df6" path="/var/lib/kubelet/pods/3a976515-4dc5-45d4-b047-92e92de29df6/volumes"
Mar 19 12:38:33.078778 master-0 kubenswrapper[29612]: I0319 12:38:33.078715 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-mh455"]
Mar 19 12:38:33.096908 master-0 kubenswrapper[29612]: I0319 12:38:33.096849 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-mh455"]
Mar 19 12:38:33.110467 master-0 kubenswrapper[29612]: I0319 12:38:33.110395 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8lnqx"]
Mar 19 12:38:33.121495 master-0 kubenswrapper[29612]: I0319 12:38:33.121400 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8lnqx"]
Mar 19 12:38:34.076350 master-0 kubenswrapper[29612]: I0319 12:38:34.076266 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-562cb-db-sync-dx9n4"]
Mar 19 12:38:34.102755 master-0 kubenswrapper[29612]: I0319 12:38:34.101812 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-562cb-db-sync-dx9n4"]
Mar 19 12:38:34.924450 master-0 kubenswrapper[29612]: I0319 12:38:34.924336 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d744992-9162-4633-9614-ed0c4ac81988" path="/var/lib/kubelet/pods/1d744992-9162-4633-9614-ed0c4ac81988/volumes"
Mar 19 12:38:34.925795 master-0 kubenswrapper[29612]: I0319 12:38:34.925694 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f8cc63c-1080-41d8-b789-bce87f57e603" path="/var/lib/kubelet/pods/9f8cc63c-1080-41d8-b789-bce87f57e603/volumes"
Mar 19 12:38:34.926719 master-0 kubenswrapper[29612]: I0319 12:38:34.926675 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a533ad-81e0-4e73-b4a9-3dcda2fbad28" path="/var/lib/kubelet/pods/e5a533ad-81e0-4e73-b4a9-3dcda2fbad28/volumes"
Mar 19 12:38:57.343683 master-0 kubenswrapper[29612]: I0319 12:38:57.343357 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-sync-w5dht"]
Mar 19 12:38:57.418811 master-0 kubenswrapper[29612]: I0319 12:38:57.418708 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-sync-w5dht"]
Mar 19 12:38:58.161801 master-0 kubenswrapper[29612]: I0319 12:38:58.160963 29612 scope.go:117] "RemoveContainer" containerID="ae453dc2ffdbfbe42a2be37ef294607e2cf3448b1125e0bb91b31f7b7af37f65"
Mar 19 12:38:58.193247 master-0 kubenswrapper[29612]: I0319 12:38:58.193184 29612 scope.go:117] "RemoveContainer" containerID="bd6a2673bbdbfb4e635f19ec1be9ad0d0cf036b7f77b5a7cbf792de9b09caf54"
Mar 19 12:38:58.226308 master-0 kubenswrapper[29612]: I0319 12:38:58.226256 29612 scope.go:117] "RemoveContainer" containerID="ad85012193de3e4c5962806b908621481620c36f819c9ddba6221ca17ea13d7e"
Mar 19 12:38:58.255881 master-0 kubenswrapper[29612]: I0319 12:38:58.255827 29612 scope.go:117] "RemoveContainer" containerID="4131f6cb3c628d16ef964258b0626a8b9a7fb8240018c13b4575dc65b1912ca0"
Mar 19 12:38:58.318082 master-0 kubenswrapper[29612]: I0319 12:38:58.318023 29612 scope.go:117] "RemoveContainer" containerID="53a8fa5477f3887d7fac256a3a45376952c72677c063ab4bc49ccea022f5cb99"
Mar 19 12:38:58.386934 master-0 kubenswrapper[29612]: I0319 12:38:58.386845 29612 scope.go:117] "RemoveContainer" containerID="79cff36b6685fae615be3f2775f02a6c1d208b32ddea60b555a2d59e2b70797b"
Mar 19 12:38:58.924703 master-0 kubenswrapper[29612]: I0319 12:38:58.924603 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd57bb3-eafc-41af-a7a0-794cd736adaa" path="/var/lib/kubelet/pods/6dd57bb3-eafc-41af-a7a0-794cd736adaa/volumes"
Mar 19 12:39:04.049561 master-0 kubenswrapper[29612]: I0319 12:39:04.049473 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-create-td9zp"]
Mar 19 12:39:04.064085 master-0 kubenswrapper[29612]: I0319 12:39:04.062063 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-create-td9zp"]
Mar 19 12:39:04.921820 master-0 kubenswrapper[29612]: I0319 12:39:04.921747 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c1ba2c7-2ca4-48e2-a467-0f795b64be2b" path="/var/lib/kubelet/pods/4c1ba2c7-2ca4-48e2-a467-0f795b64be2b/volumes"
Mar 19 12:39:06.066988 master-0 kubenswrapper[29612]: I0319 12:39:06.066863 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-233a-account-create-update-8k2hj"]
Mar 19 12:39:06.088596 master-0 kubenswrapper[29612]: I0319 12:39:06.088496 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-233a-account-create-update-8k2hj"]
Mar 19 12:39:06.925097 master-0 kubenswrapper[29612]: I0319 12:39:06.924881 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3846413f-63e8-45f7-90a0-75230ff95190" path="/var/lib/kubelet/pods/3846413f-63e8-45f7-90a0-75230ff95190/volumes"
Mar 19 12:39:30.067271 master-0 kubenswrapper[29612]: I0319 12:39:30.067190 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mncvw"]
Mar 19 12:39:30.135472 master-0 kubenswrapper[29612]: I0319 12:39:30.135242 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mncvw"]
Mar 19 12:39:30.930573 master-0 kubenswrapper[29612]: I0319 12:39:30.930432 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b120a0-2c7e-43de-982d-0edc209ebafa" path="/var/lib/kubelet/pods/82b120a0-2c7e-43de-982d-0edc209ebafa/volumes"
Mar 19 12:39:40.571066 master-0 kubenswrapper[29612]: I0319 12:39:40.570979 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-sync-rnrf5"]
Mar 19 12:39:40.660217 master-0 kubenswrapper[29612]: I0319 12:39:40.660137 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-6ksst"]
Mar 19 12:39:40.676202 master-0 kubenswrapper[29612]: I0319 12:39:40.676132 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-sync-rnrf5"]
Mar 19 12:39:40.689491 master-0 kubenswrapper[29612]: I0319 12:39:40.689426 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-6ksst"]
Mar 19 12:39:40.921354 master-0 kubenswrapper[29612]: I0319 12:39:40.921292 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dff3f9d-f585-4561-a85e-51a6ba900d33" path="/var/lib/kubelet/pods/0dff3f9d-f585-4561-a85e-51a6ba900d33/volumes"
Mar 19 12:39:40.922152 master-0 kubenswrapper[29612]: I0319 12:39:40.922122 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a" path="/var/lib/kubelet/pods/9b87f906-cdf4-47b9-bcb0-cf3d3a21b40a/volumes"
Mar 19 12:39:41.279494 master-0 kubenswrapper[29612]: I0319 12:39:41.279345 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-c8pd5"]
Mar 19 12:39:41.290720 master-0 kubenswrapper[29612]: I0319 12:39:41.290619 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7e01-account-create-update-9bxmr"]
Mar 19 12:39:41.327038 master-0 kubenswrapper[29612]: I0319 12:39:41.326934 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-c8pd5"]
Mar 19 12:39:41.596828 master-0 kubenswrapper[29612]: I0319 12:39:41.596657 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7e01-account-create-update-9bxmr"]
Mar 19 12:39:42.374852 master-0 kubenswrapper[29612]: I0319 12:39:42.374770 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-facf-account-create-update-f5254"]
Mar 19 12:39:42.387518 master-0 kubenswrapper[29612]: I0319 12:39:42.387440 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-facf-account-create-update-f5254"]
Mar 19 12:39:42.458648 master-0 kubenswrapper[29612]: I0319 12:39:42.458566 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8d17-account-create-update-twzr4"]
Mar 19 12:39:42.494727 master-0 kubenswrapper[29612]: I0319 12:39:42.494556 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8d17-account-create-update-twzr4"]
Mar 19 12:39:42.931189 master-0 kubenswrapper[29612]: I0319 12:39:42.931133 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="275eb2ad-4f11-47c4-93dd-b07ba6ef648e" path="/var/lib/kubelet/pods/275eb2ad-4f11-47c4-93dd-b07ba6ef648e/volumes"
Mar 19 12:39:42.931804 master-0 kubenswrapper[29612]: I0319 12:39:42.931748 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dd055fb-058b-4a5b-ba29-b376e81cada7" path="/var/lib/kubelet/pods/2dd055fb-058b-4a5b-ba29-b376e81cada7/volumes"
Mar 19 12:39:42.932328 master-0 kubenswrapper[29612]: I0319 12:39:42.932302 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57ef2db4-4e57-40b2-9293-8691f71495bd" path="/var/lib/kubelet/pods/57ef2db4-4e57-40b2-9293-8691f71495bd/volumes"
Mar 19 12:39:42.933131 master-0 kubenswrapper[29612]: I0319 12:39:42.933105 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ca4272-7dc1-43fc-b90e-eddd21dadbd8" path="/var/lib/kubelet/pods/f1ca4272-7dc1-43fc-b90e-eddd21dadbd8/volumes"
Mar 19 12:39:58.595145 master-0 kubenswrapper[29612]: I0319 12:39:58.595077 29612 scope.go:117] "RemoveContainer" containerID="e310f07b97bdde59a52c5d747219a39bdd939f67d5d91a2a7f77e5026b8edfab"
Mar 19 12:39:58.622278 master-0 kubenswrapper[29612]: I0319 12:39:58.622215 29612 scope.go:117] "RemoveContainer" containerID="fb6f5124a98351dccb9c053494c0da9c03ad23fbb448d2c8170f4bd821777c57"
Mar 19 12:39:58.647219 master-0 kubenswrapper[29612]: I0319 12:39:58.647143 29612 scope.go:117] "RemoveContainer" containerID="8966eec733035c3e385a44f3161f24e0d3a36a3cbc079c5cfe5d131649e9dd21"
Mar 19 12:39:58.683533 master-0 kubenswrapper[29612]: I0319 12:39:58.683479 29612 scope.go:117] "RemoveContainer" containerID="baccff7a238191e9b496ab9f8ea06189590b4e026b36e8d87e59ff8ec07742c2"
Mar 19 12:39:58.705516 master-0 kubenswrapper[29612]: I0319 12:39:58.705455 29612 scope.go:117] "RemoveContainer" containerID="d2fa3995bcd0e93d8313db25ff149ead5cd7e66149e130835cc71f793264989b"
Mar 19 12:39:58.728058 master-0 kubenswrapper[29612]: I0319 12:39:58.727997 29612 scope.go:117] "RemoveContainer" containerID="d450d01262459a2f9f687bea1cbd3a45bacd3a7695b6dd2345aeeda304fbba22"
Mar 19 12:39:58.748646 master-0 kubenswrapper[29612]: I0319 12:39:58.748579 29612 
scope.go:117] "RemoveContainer" containerID="328bcc5d8256d53424c53e4145ff47b65a56a85e6b84d529374edb9b101388b5" Mar 19 12:39:58.783567 master-0 kubenswrapper[29612]: I0319 12:39:58.783459 29612 scope.go:117] "RemoveContainer" containerID="094ec1b0c8b53cedaa70bfce9b1acce163669311c988220e4e3b8c52cadd03a1" Mar 19 12:39:58.810755 master-0 kubenswrapper[29612]: I0319 12:39:58.810717 29612 scope.go:117] "RemoveContainer" containerID="6ec4a994a578bdb11b174bef7c0f6b9a964818981d668e7dd954b2e0e98fbbd0" Mar 19 12:39:58.834337 master-0 kubenswrapper[29612]: I0319 12:39:58.834295 29612 scope.go:117] "RemoveContainer" containerID="4ce9390bef8fe34b52179e6581d3377693475ba86b4c9202dfe8ff755dee5624" Mar 19 12:39:58.859032 master-0 kubenswrapper[29612]: I0319 12:39:58.858988 29612 scope.go:117] "RemoveContainer" containerID="e4b9a2b00f0d183bbaab95fb3a2754c533cc12558689257201027ab17a96e5a6" Mar 19 12:40:18.138790 master-0 kubenswrapper[29612]: I0319 12:40:18.138712 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2td5k"] Mar 19 12:40:18.155264 master-0 kubenswrapper[29612]: I0319 12:40:18.155188 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-2td5k"] Mar 19 12:40:18.923485 master-0 kubenswrapper[29612]: I0319 12:40:18.923406 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0" path="/var/lib/kubelet/pods/dc67e86b-b1d3-4ad1-984a-eb749ba2dcf0/volumes" Mar 19 12:40:48.279452 master-0 kubenswrapper[29612]: I0319 12:40:48.279369 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-82rhf"] Mar 19 12:40:48.307531 master-0 kubenswrapper[29612]: I0319 12:40:48.307462 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-82rhf"] Mar 19 12:40:48.925930 master-0 kubenswrapper[29612]: I0319 12:40:48.925859 29612 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="5cb6a533-34b0-46dc-bf9e-d612ac6bfdba" path="/var/lib/kubelet/pods/5cb6a533-34b0-46dc-bf9e-d612ac6bfdba/volumes" Mar 19 12:40:49.233944 master-0 kubenswrapper[29612]: I0319 12:40:49.233120 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bb8d6"] Mar 19 12:40:49.247182 master-0 kubenswrapper[29612]: I0319 12:40:49.246766 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bb8d6"] Mar 19 12:40:50.933535 master-0 kubenswrapper[29612]: I0319 12:40:50.924645 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d33dc3-9f8c-4950-98a3-532280705296" path="/var/lib/kubelet/pods/f3d33dc3-9f8c-4950-98a3-532280705296/volumes" Mar 19 12:40:59.085523 master-0 kubenswrapper[29612]: I0319 12:40:59.085361 29612 scope.go:117] "RemoveContainer" containerID="5fea02ff4fe0c8ee3f2cbe4c19c13b185f5919138864bb69844b4c446fac5556" Mar 19 12:40:59.131183 master-0 kubenswrapper[29612]: I0319 12:40:59.128182 29612 scope.go:117] "RemoveContainer" containerID="10e5d5ffd5c6df90a5bb673c28cc48402c45813674f49695945380ef5d9ac1cd" Mar 19 12:40:59.162540 master-0 kubenswrapper[29612]: I0319 12:40:59.162323 29612 scope.go:117] "RemoveContainer" containerID="8b6a726e51cb2d6fbcb7d9ab78407799f04167d090495c9fc9c5e36c32d0e732" Mar 19 12:41:27.068042 master-0 kubenswrapper[29612]: I0319 12:41:27.067964 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-host-discover-w9c2p"] Mar 19 12:41:27.081941 master-0 kubenswrapper[29612]: I0319 12:41:27.081861 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-host-discover-w9c2p"] Mar 19 12:41:28.924981 master-0 kubenswrapper[29612]: I0319 12:41:28.924840 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="261954af-1d89-4d6d-a269-7407af2c533c" path="/var/lib/kubelet/pods/261954af-1d89-4d6d-a269-7407af2c533c/volumes" Mar 19 12:41:29.036019 
master-0 kubenswrapper[29612]: I0319 12:41:29.035949 29612 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xd7zk"] Mar 19 12:41:29.056661 master-0 kubenswrapper[29612]: I0319 12:41:29.053742 29612 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xd7zk"] Mar 19 12:41:30.922879 master-0 kubenswrapper[29612]: I0319 12:41:30.922817 29612 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a53c820-34b9-4c07-a724-063e01459e1f" path="/var/lib/kubelet/pods/2a53c820-34b9-4c07-a724-063e01459e1f/volumes" Mar 19 12:41:59.300880 master-0 kubenswrapper[29612]: I0319 12:41:59.300575 29612 scope.go:117] "RemoveContainer" containerID="370ab7fbe3f8f512c9f251de300927d9e115ff24902f3e159de30997fdc00c91" Mar 19 12:41:59.323836 master-0 kubenswrapper[29612]: I0319 12:41:59.323796 29612 scope.go:117] "RemoveContainer" containerID="e305787b78d3397b1225d76b87cdd972c30bd8e70f9f2cd96db58d319c106124" Mar 19 12:44:26.497568 master-0 kubenswrapper[29612]: I0319 12:44:26.497414 29612 trace.go:236] Trace[1657809023]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (19-Mar-2026 12:44:23.984) (total time: 2513ms): Mar 19 12:44:26.497568 master-0 kubenswrapper[29612]: Trace[1657809023]: [2.513156118s] [2.513156118s] END Mar 19 12:45:07.663953 master-0 kubenswrapper[29612]: I0319 12:45:07.663834 29612 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-97d484c8-dtsbv" podUID="9836e2b7-486b-4c01-ad9a-f63c9a0f08e3" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 19 12:48:00.813913 master-0 kubenswrapper[29612]: E0319 12:48:00.813847 29612 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:36576->192.168.32.10:37747: write tcp 192.168.32.10:36576->192.168.32.10:37747: write: connection reset by peer Mar 19 12:55:19.926126 master-0 
kubenswrapper[29612]: I0319 12:55:19.926061 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4rgc7/must-gather-fbg4r"] Mar 19 12:55:20.036703 master-0 kubenswrapper[29612]: I0319 12:55:20.036143 29612 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4rgc7/must-gather-p74sk"] Mar 19 12:55:20.039677 master-0 kubenswrapper[29612]: I0319 12:55:20.037041 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4rgc7/must-gather-fbg4r" Mar 19 12:55:20.041105 master-0 kubenswrapper[29612]: I0319 12:55:20.038618 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4rgc7/must-gather-p74sk" Mar 19 12:55:20.047082 master-0 kubenswrapper[29612]: I0319 12:55:20.046906 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4rgc7"/"openshift-service-ca.crt" Mar 19 12:55:20.047404 master-0 kubenswrapper[29612]: I0319 12:55:20.047299 29612 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4rgc7"/"kube-root-ca.crt" Mar 19 12:55:20.085142 master-0 kubenswrapper[29612]: I0319 12:55:20.085066 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4rgc7/must-gather-fbg4r"] Mar 19 12:55:20.105973 master-0 kubenswrapper[29612]: I0319 12:55:20.105899 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4rgc7/must-gather-p74sk"] Mar 19 12:55:20.184989 master-0 kubenswrapper[29612]: I0319 12:55:20.184839 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2kvf\" (UniqueName: \"kubernetes.io/projected/130b0826-9ab8-4b21-9fa5-eb5b63de6a0f-kube-api-access-n2kvf\") pod \"must-gather-p74sk\" (UID: \"130b0826-9ab8-4b21-9fa5-eb5b63de6a0f\") " pod="openshift-must-gather-4rgc7/must-gather-p74sk" Mar 19 12:55:20.185322 master-0 kubenswrapper[29612]: 
I0319 12:55:20.185010 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c66094e-0fbc-40ca-8a78-d11ac527de08-must-gather-output\") pod \"must-gather-fbg4r\" (UID: \"3c66094e-0fbc-40ca-8a78-d11ac527de08\") " pod="openshift-must-gather-4rgc7/must-gather-fbg4r" Mar 19 12:55:20.185322 master-0 kubenswrapper[29612]: I0319 12:55:20.185126 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130b0826-9ab8-4b21-9fa5-eb5b63de6a0f-must-gather-output\") pod \"must-gather-p74sk\" (UID: \"130b0826-9ab8-4b21-9fa5-eb5b63de6a0f\") " pod="openshift-must-gather-4rgc7/must-gather-p74sk" Mar 19 12:55:20.186103 master-0 kubenswrapper[29612]: I0319 12:55:20.185444 29612 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdl8k\" (UniqueName: \"kubernetes.io/projected/3c66094e-0fbc-40ca-8a78-d11ac527de08-kube-api-access-wdl8k\") pod \"must-gather-fbg4r\" (UID: \"3c66094e-0fbc-40ca-8a78-d11ac527de08\") " pod="openshift-must-gather-4rgc7/must-gather-fbg4r" Mar 19 12:55:20.290947 master-0 kubenswrapper[29612]: I0319 12:55:20.290858 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdl8k\" (UniqueName: \"kubernetes.io/projected/3c66094e-0fbc-40ca-8a78-d11ac527de08-kube-api-access-wdl8k\") pod \"must-gather-fbg4r\" (UID: \"3c66094e-0fbc-40ca-8a78-d11ac527de08\") " pod="openshift-must-gather-4rgc7/must-gather-fbg4r" Mar 19 12:55:20.291286 master-0 kubenswrapper[29612]: I0319 12:55:20.291014 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2kvf\" (UniqueName: \"kubernetes.io/projected/130b0826-9ab8-4b21-9fa5-eb5b63de6a0f-kube-api-access-n2kvf\") pod \"must-gather-p74sk\" (UID: 
\"130b0826-9ab8-4b21-9fa5-eb5b63de6a0f\") " pod="openshift-must-gather-4rgc7/must-gather-p74sk" Mar 19 12:55:20.291286 master-0 kubenswrapper[29612]: I0319 12:55:20.291129 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c66094e-0fbc-40ca-8a78-d11ac527de08-must-gather-output\") pod \"must-gather-fbg4r\" (UID: \"3c66094e-0fbc-40ca-8a78-d11ac527de08\") " pod="openshift-must-gather-4rgc7/must-gather-fbg4r" Mar 19 12:55:20.291286 master-0 kubenswrapper[29612]: I0319 12:55:20.291221 29612 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130b0826-9ab8-4b21-9fa5-eb5b63de6a0f-must-gather-output\") pod \"must-gather-p74sk\" (UID: \"130b0826-9ab8-4b21-9fa5-eb5b63de6a0f\") " pod="openshift-must-gather-4rgc7/must-gather-p74sk" Mar 19 12:55:20.292949 master-0 kubenswrapper[29612]: I0319 12:55:20.291920 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/130b0826-9ab8-4b21-9fa5-eb5b63de6a0f-must-gather-output\") pod \"must-gather-p74sk\" (UID: \"130b0826-9ab8-4b21-9fa5-eb5b63de6a0f\") " pod="openshift-must-gather-4rgc7/must-gather-p74sk" Mar 19 12:55:20.293141 master-0 kubenswrapper[29612]: I0319 12:55:20.293103 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c66094e-0fbc-40ca-8a78-d11ac527de08-must-gather-output\") pod \"must-gather-fbg4r\" (UID: \"3c66094e-0fbc-40ca-8a78-d11ac527de08\") " pod="openshift-must-gather-4rgc7/must-gather-fbg4r" Mar 19 12:55:20.320748 master-0 kubenswrapper[29612]: I0319 12:55:20.311489 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2kvf\" (UniqueName: \"kubernetes.io/projected/130b0826-9ab8-4b21-9fa5-eb5b63de6a0f-kube-api-access-n2kvf\") pod 
\"must-gather-p74sk\" (UID: \"130b0826-9ab8-4b21-9fa5-eb5b63de6a0f\") " pod="openshift-must-gather-4rgc7/must-gather-p74sk" Mar 19 12:55:20.320748 master-0 kubenswrapper[29612]: I0319 12:55:20.311734 29612 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdl8k\" (UniqueName: \"kubernetes.io/projected/3c66094e-0fbc-40ca-8a78-d11ac527de08-kube-api-access-wdl8k\") pod \"must-gather-fbg4r\" (UID: \"3c66094e-0fbc-40ca-8a78-d11ac527de08\") " pod="openshift-must-gather-4rgc7/must-gather-fbg4r" Mar 19 12:55:20.383848 master-0 kubenswrapper[29612]: I0319 12:55:20.383766 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4rgc7/must-gather-fbg4r" Mar 19 12:55:20.398669 master-0 kubenswrapper[29612]: I0319 12:55:20.398602 29612 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4rgc7/must-gather-p74sk" Mar 19 12:55:21.067996 master-0 kubenswrapper[29612]: I0319 12:55:21.067825 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4rgc7/must-gather-fbg4r"] Mar 19 12:55:21.082442 master-0 kubenswrapper[29612]: I0319 12:55:21.082375 29612 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4rgc7/must-gather-p74sk"] Mar 19 12:55:21.085152 master-0 kubenswrapper[29612]: I0319 12:55:21.085054 29612 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 12:55:21.886596 master-0 kubenswrapper[29612]: I0319 12:55:21.886515 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rgc7/must-gather-p74sk" event={"ID":"130b0826-9ab8-4b21-9fa5-eb5b63de6a0f","Type":"ContainerStarted","Data":"db8dfc3ed0540da53ebfad0ba305aac113be8dbf643266baf0b3912e951bc567"} Mar 19 12:55:21.888169 master-0 kubenswrapper[29612]: I0319 12:55:21.888138 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-4rgc7/must-gather-fbg4r" event={"ID":"3c66094e-0fbc-40ca-8a78-d11ac527de08","Type":"ContainerStarted","Data":"2912a6708dac330aa5c4caf491062056a604b998377d3e6e7a2c4fc4efdaf0fe"} Mar 19 12:55:24.030675 master-0 kubenswrapper[29612]: I0319 12:55:24.029904 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rgc7/must-gather-fbg4r" event={"ID":"3c66094e-0fbc-40ca-8a78-d11ac527de08","Type":"ContainerStarted","Data":"d7f3f0fa709dd5ab5fb24883f197bf467a8e583ba50cf9c6bc9c5ebdcf31381f"} Mar 19 12:55:24.030675 master-0 kubenswrapper[29612]: I0319 12:55:24.029978 29612 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4rgc7/must-gather-fbg4r" event={"ID":"3c66094e-0fbc-40ca-8a78-d11ac527de08","Type":"ContainerStarted","Data":"e2b9951c7ce88ee29a74c54e1ba74801e461f7aac90b6fa5f7e3adfbf7639dfe"} Mar 19 12:55:24.119662 master-0 kubenswrapper[29612]: I0319 12:55:24.118207 29612 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4rgc7/must-gather-fbg4r" podStartSLOduration=3.710453453 podStartE2EDuration="5.118154479s" podCreationTimestamp="2026-03-19 12:55:19 +0000 UTC" firstStartedPulling="2026-03-19 12:55:21.084285516 +0000 UTC m=+2728.398454537" lastFinishedPulling="2026-03-19 12:55:22.491986542 +0000 UTC m=+2729.806155563" observedRunningTime="2026-03-19 12:55:24.094809936 +0000 UTC m=+2731.408978957" watchObservedRunningTime="2026-03-19 12:55:24.118154479 +0000 UTC m=+2731.432323500" Mar 19 12:55:27.494553 master-0 kubenswrapper[29612]: I0319 12:55:27.494496 29612 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-7d58488df-4fpvq_f65be54c-d851-4daa-93a0-8a3947da5e26/cluster-version-operator/0.log"